Hotswap Agent / WildFly / Weld: Exception when reloading: BeanManagerImpl.contexts not accessible

I'm using WildFly and, as far as I can see, have properly configured Hotswap Agent 1.0. It starts up nicely with the following messages:
HOTSWAP AGENT: 20:01:19.893 INFO (org.hotswap.agent.HotswapAgent) - Loading Hotswap agent {1.0} - unlimited runtime class redefinition.
HOTSWAP AGENT: 20:01:21.096 INFO (org.hotswap.agent.config.PluginRegistry) - Discovered plugins: [Hotswapper, WatchResources, AnonymousClassPatch, ClassInitPlugin, Hibernate, Hibernate3JPA, Hibernate3, Spring, Jersey1, Jersey2, Jetty, Tomcat, ZK, Logback, Log4j2, MyFaces, Mojarra, Seam, ELResolver, WildFlyELResolver, OsgiEquinox, Proxy, WebObjects, Weld, JBossModules, ResteasyRegistry, Deltaspike, JavaBeans]
20:01:22,363 INFO [org.jboss.modules] (main) JBoss Modules version 1.5.1.Final
However, every time I modify a class I get the following WARNING in the console log, and the modified classes are not properly reloaded.
19:56:07,946 INFO [stdout] (Thread-281) HOTSWAP AGENT: 19:56:07.946 WARNING (org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent) - BeanManagerImpl.contexts not accessible
19:56:07,947 INFO [stdout] (Thread-281) java.lang.NoSuchFieldException: contexts
19:56:07,948 INFO [stdout] (Thread-281) at java.lang.Class.getField(Class.java:1703)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.getContexts(BeanDeploymentArchiveAgent.java:269)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.reloadManagedBeanInContexts(BeanDeploymentArchiveAgent.java:318)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.reloadManagedBean(BeanDeploymentArchiveAgent.java:297)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.reloadBean(BeanDeploymentArchiveAgent.java:231)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.refreshBeanClass(BeanDeploymentArchiveAgent.java:192)
19:56:07,948 INFO [stdout] (Thread-281) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
19:56:07,949 INFO [stdout] (Thread-281) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
19:56:07,949 INFO [stdout] (Thread-281) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
19:56:07,949 INFO [stdout] (Thread-281) at java.lang.reflect.Method.invoke(Method.java:497)
19:56:07,949 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanClassRefreshCommand.executeCommand(BeanClassRefreshCommand.java:93)
19:56:07,949 INFO [stdout] (Thread-281) at org.hotswap.agent.command.impl.CommandExecutor.run(CommandExecutor.java:25)
19:56:07,950 INFO [stdout] (Thread-281)
19:56:08,030 INFO [stdout] (Thread-282) HOTSWAP AGENT: 19:56:08.030 INFO (org.hotswap.agent.plugin.wildfly.el.PurgeWildFlyBeanELResolverCacheCommand) - Cleaning BeanPropertiesCache lgc.mysynap.ui.components.SourcePopup ModuleClassLoader for Module "deployment.mysynap-0.1.war:main" from Service Module Loader.
Any idea what could be the cause?

I had the same problem. Disabling all breakpoints while the application is running makes the reload work; afterwards you can re-enable the breakpoints.
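For what it's worth, the NoSuchFieldException in the trace is thrown by Class.getField, which only returns public fields. So either the Weld build shipped with this WildFly release declares contexts as non-public where the agent expects a public field, or it no longer declares the field at all; in both cases the agent's reflective lookup fails and the bean contexts cannot be refreshed. A minimal Scala sketch of that distinction (the class name is taken from the stack trace; whether your Weld version still has the field is exactly the open question):

// Sketch only: needs Weld on the classpath to run.
val cls = Class.forName("org.jboss.weld.manager.BeanManagerImpl")
val contextsField =
  try cls.getField("contexts") // sees only public fields -- this is the call that fails in the trace
  catch {
    case _: NoSuchFieldException =>
      // getDeclaredField also sees non-public fields, but still throws
      // if the field was removed or renamed in this Weld version
      val f = cls.getDeclaredField("contexts")
      f.setAccessible(true)
      f
  }
println(s"resolved: $contextsField")

If getDeclaredField fails too, the agent and the server simply disagree about Weld's internals, and upgrading Hotswap Agent to a build that matches your WildFly's Weld version is the more likely fix.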

Related

ERROR SparkContext: Error initializing SparkContext locally

I'm new to everything related to Java/Scala/Maven/sbt/Spark, so please bear with me.
I managed to get everything up and running, or at least so it seems when running the spark-shell locally: the SparkContext gets initialized properly and I can create RDDs.
However, when I call spark-submit locally, I get a SparkContext error.
%SPARK_HOME%\bin\spark-submit --master local --class example.bd.MyApp target\scala-2.12\sparkapp_2.12-1.5.5.jar
This is my code.
package example.bd

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object MyApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("My first Spark application")
    val sc = new SparkContext(conf)
    println("hi everyone")
  }
}
These are my error logs:
21/10/24 18:38:45 WARN Shell: Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814)
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791)
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
... 21 more
21/10/24 18:38:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/10/24 18:38:45 INFO SparkContext: Running Spark version 3.1.2
21/10/24 18:38:45 INFO ResourceUtils: ==============================================================
21/10/24 18:38:45 INFO ResourceUtils: No custom resources configured for spark.driver.
21/10/24 18:38:45 INFO ResourceUtils: ==============================================================
21/10/24 18:38:45 INFO SparkContext: Submitted application: My first Spark application
21/10/24 18:38:45 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
21/10/24 18:38:45 INFO ResourceProfile: Limiting resource is cpu
21/10/24 18:38:45 INFO ResourceProfileManager: Added ResourceProfile id: 0
21/10/24 18:38:45 INFO SecurityManager: Changing view acls to: User
21/10/24 18:38:45 INFO SecurityManager: Changing modify acls to: User
21/10/24 18:38:45 INFO SecurityManager: Changing view acls groups to:
21/10/24 18:38:45 INFO SecurityManager: Changing modify acls groups to:
21/10/24 18:38:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(User); groups with view permissions: Set(); users with modify permissions: Set(User); groups with modify permissions: Set()
21/10/24 18:38:46 INFO Utils: Successfully started service 'sparkDriver' on port 56899.
21/10/24 18:38:46 INFO SparkEnv: Registering MapOutputTracker
21/10/24 18:38:46 INFO SparkEnv: Registering BlockManagerMaster
21/10/24 18:38:46 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/10/24 18:38:46 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/10/24 18:38:46 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/10/24 18:38:46 INFO DiskBlockManager: Created local directory at C:\Users\User\AppData\Local\Temp\blockmgr-8df572ec-4206-48ae-b517-bd56242fca4c
21/10/24 18:38:46 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
21/10/24 18:38:46 INFO SparkEnv: Registering OutputCommitCoordinator
21/10/24 18:38:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/10/24 18:38:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://LAPTOP-RKJJV5E4:4040
21/10/24 18:38:46 INFO SparkContext: Added JAR file:/C:/[path]/sparkapp/target/scala-2.12/sparkapp_2.12-1.5.5.jar at spark://LAPTOP-RKJJV5E4:56899/jars/sparkapp_2.12-1.5.5.jar with timestamp 1635093525663
21/10/24 18:38:46 INFO Executor: Starting executor ID driver on host LAPTOP-RKJJV5E4
21/10/24 18:38:46 INFO Executor: Fetching spark://LAPTOP-RKJJV5E4:56899/jars/sparkapp_2.12-1.5.5.jar with timestamp 1635093525663
21/10/24 18:38:46 INFO TransportClientFactory: Successfully created connection to LAPTOP-RKJJV5E4/192.168.0.175:56899 after 30 ms (0 ms spent in bootstraps)
21/10/24 18:38:46 INFO Utils: Fetching spark://LAPTOP-RKJJV5E4:56899/jars/sparkapp_2.12-1.5.5.jar to C:\Users\User\AppData\Local\Temp\spark-9c13f31f-92a7-4fc5-a87d-a0ae6410e6d2\userFiles-5ac95e31-3656-4a4d-a205-e0750c041bcb\fetchFileTemp7994667570718611461.tmp
21/10/24 18:38:46 ERROR SparkContext: Error initializing SparkContext.
java.lang.RuntimeException: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:736)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:271)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1120)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1106)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:563)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13(Executor.scala:953)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13$adapted(Executor.scala:945)
at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:877)
at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:876)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:945)
at org.apache.spark.executor.Executor.<init>(Executor.scala:247)
at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
at example.bd.MyApp$.main(MyApp.scala:10)
at example.bd.MyApp.main(MyApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814)
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791)
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
... 6 more
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
... 21 more
21/10/24 18:38:46 INFO SparkUI: Stopped Spark web UI at http://LAPTOP-RKJJV5E4:4040
21/10/24 18:38:46 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:173)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:144)
at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:881)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2370)
at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2069)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1419)
at org.apache.spark.SparkContext.stop(SparkContext.scala:2069)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:671)
at example.bd.MyApp$.main(MyApp.scala:10)
at example.bd.MyApp.main(MyApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
21/10/24 18:38:46 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/10/24 18:38:46 INFO MemoryStore: MemoryStore cleared
21/10/24 18:38:46 INFO BlockManager: BlockManager stopped
21/10/24 18:38:46 INFO BlockManagerMaster: BlockManagerMaster stopped
21/10/24 18:38:46 WARN MetricsSystem: Stopping a MetricsSystem that is not running
21/10/24 18:38:46 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/10/24 18:38:46 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.RuntimeException: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:736)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:271)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1120)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1106)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:563)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13(Executor.scala:953)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13$adapted(Executor.scala:945)
at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:877)
at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:876)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:945)
at org.apache.spark.executor.Executor.<init>(Executor.scala:247)
at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
at example.bd.MyApp$.main(MyApp.scala:10)
at example.bd.MyApp.main(MyApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814)
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791)
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
... 6 more
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
... 21 more
21/10/24 18:38:47 ERROR Utils: Uncaught exception in thread shutdown-hook-0
java.lang.NullPointerException
at org.apache.spark.executor.Executor.$anonfun$stop$3(Executor.scala:332)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:222)
at org.apache.spark.executor.Executor.stop(Executor.scala:332)
at org.apache.spark.executor.Executor.$anonfun$new$2(Executor.scala:76)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
21/10/24 18:38:47 INFO ShutdownHookManager: Shutdown hook called
21/10/24 18:38:47 INFO ShutdownHookManager: Deleting directory C:\Users\User\AppData\Local\Temp\spark-ac076276-e6ab-4cfb-a153-56ad35c46d56
21/10/24 18:38:47 INFO ShutdownHookManager: Deleting directory C:\Users\User\AppData\Local\Temp\spark-9c13f31f-92a7-4fc5-a87d-a0ae6410e6d2
As far as I can tell, the errors are directly related to Hadoop not being set up, but that shouldn't be an issue given that I am trying to run it locally, no?
Any help will be greatly appreciated.
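Unfortunately it is an issue even in local mode: Spark initializes Hadoop's Shell utilities during spark-submit, and on Windows those look for winutils.exe via HADOOP_HOME or the hadoop.home.dir system property, exactly as the linked wiki page describes. Because Shell's static initializer runs before your main method, the variable has to be set before launching. A hedged sketch of the usual workaround (the C:\hadoop location is an assumption; the directory must contain bin\winutils.exe):

rem Download a winutils.exe matching your Hadoop build into C:\hadoop\bin first
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin
%SPARK_HOME%\bin\spark-submit --master local --class example.bd.MyApp target\scala-2.12\sparkapp_2.12-1.5.5.jar

When running inside an IDE instead of through spark-submit, setting HADOOP_HOME in the run configuration (or calling System.setProperty("hadoop.home.dir", "C:\\hadoop") as the very first line of main) should serve the same purpose.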

Increase proxy timeout in WildFly server

How can I increase the proxy timeout in WildFly server version 16?
I have tried several changes to the standalone.xml file, but none of them worked.
23:46:08,144 INFO [stdout] (default task-6) org.springframework.web.client.HttpServerErrorException$BadGateway: 502 Proxy Error
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.HttpServerErrorException.create(HttpServerErrorException.java:83) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.DefaultResponseErrorHandler.handleError(DefaultResponseErrorHandler.java:124) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.DefaultResponseErrorHandler.handleError(DefaultResponseErrorHandler.java:102) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.ResponseErrorHandler.handleError(ResponseErrorHandler.java:63) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.handleResponse(RestTemplate.java:777) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:735) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:669) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:578) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.serviceclient.RestClient.processRestCall(RestClient.java:46) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.serviceclient.EswitchClient.processRequest(EswitchClient.java:86) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl.getUserActiveCreditCardList(BillerMgtServiceImpl.java:259) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl.getUserCardDtoList(BillerMgtServiceImpl.java:241) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl.getUserBillerList(BillerMgtServiceImpl.java:209) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl$$FastClassBySpringCGLIB$$c2d245c9.invoke(<generated>) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:749) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.MethodBeforeAdviceInterceptor.invoke(MethodBeforeAdviceInterceptor.java:56) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:93) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl$$EnhancerBySpringCGLIB$$da6b8f8.getUserBillerList(<generated>) ~[IB-1.00.jar:?]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib_rest.api.billermgt.BillerMgtRestController.postGetUserBillerList(BillerMgtRestController.java:82) ~[classes:?]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib_rest.api.billermgt.BillerMgtRestController$$FastClassBySpringCGLIB$$7e2bd8a4.invoke(<generated>) ~[classes:?]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:749) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.MethodBeforeAdviceInterceptor.invoke(MethodBeforeAdviceInterceptor.java:56) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:93) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib_rest.api.billermgt.BillerMgtRestController$$EnhancerBySpringCGLIB$$ddfae7a9.postGetUserBillerList(<generated>) ~[classes:?]
23:46:08,149 INFO [stdout] (default task-6) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_211]
23:46:08,149 INFO [stdout] (default task-6) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_211]
23:46:08,149 INFO [stdout] (default task-6) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_211]
23:46:08,149 INFO [stdout] (default task-6) at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_211]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:189) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:102) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:800) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1038) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:908) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at javax.servlet.http.HttpServlet.service(HttpServlet.java:706) ~[jboss-servlet-api_4.0_spec-1.0.0.Final.jar!/:1.0.0.Final]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at javax.servlet.http.HttpServlet.service(HttpServlet.java:791) ~[jboss-servlet-api_4.0_spec-1.0.0.Final.jar!/:1.0.0.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78) ~[?:?]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:132) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.GlobalRequestControllerHandler.handleRequest(GlobalRequestControllerHandler.java:68) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:292) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:81) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:138) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:135) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.security.SecurityContextThreadSetupAction.lambda$create$0(SecurityContextThreadSetupAction.java:105) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:272) [undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:81) [undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:104) [undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.Connectors.executeRootHandler(Connectors.java:364) [undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830) [undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.ContextClassLoaderSavingRunnable.run(ContextClassLoaderSavingRunnable.java:35) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.EnhancedQueueExecutor.safeRun(EnhancedQueueExecutor.java:1982) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.doRunTask(EnhancedQueueExecutor.java:1486) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1348) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,154 INFO [stdout] (default task-6) at java.lang.Thread.run(Thread.java:748) [?:1.8.0_211]
I get the 502 Proxy Error shown above and want to increase the proxy timeout in the WildFly server. Is that possible, and can anyone explain how it is done? Thanks in advance.
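Which knob to turn depends on which proxy actually produced the 502: the "502 Proxy Error" wording is typical of Apache httpd's mod_proxy, and if httpd sits in front of the backend the timeout has to be raised there (the ProxyTimeout directive, or timeout= on the ProxyPass line), not in WildFly. If the proxying is done by WildFly's own Undertow reverse-proxy handler, a hedged jboss-cli sketch follows (the handler name my-proxy and the 300000 ms value are assumptions; verify the attribute names against your WildFly 16 management model):

# connect with jboss-cli.sh --connect (jboss-cli.bat on Windows), then:
# raise the ceiling on how long a proxied request may stay active, in milliseconds
/subsystem=undertow/configuration=handler/reverse-proxy=my-proxy:write-attribute(name=max-request-time, value=300000)
# optionally raise the idle timeout on the listener that accepts the requests
/subsystem=undertow/server=default-server/http-listener=default:write-attribute(name=no-request-timeout, value=300000)
:reload

The same attributes can be edited directly in standalone.xml under the undertow subsystem; the CLI form just makes the resource paths explicit.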

Spark - Actor not found for: ActorSelection

I just cloned the master repository of Spark from GitHub. I am running it on OS X 10.9 with Spark 1.4.1 and Scala 2.10.4.
I tried to run the SparkPi example program using IntelliJ IDEA but get the error: akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/)
I checked out a similar post on the mailing list but found no solution.
Find the complete stack trace below. Any help would be really appreciated.
2015-07-28 22:16:45,888 INFO [main] spark.SparkContext (Logging.scala:logInfo(59)) - Running Spark version 1.5.0-SNAPSHOT
2015-07-28 22:16:47,125 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-07-28 22:16:47,753 INFO [main] spark.SecurityManager (Logging.scala:logInfo(59)) - Changing view acls to: mac
2015-07-28 22:16:47,755 INFO [main] spark.SecurityManager (Logging.scala:logInfo(59)) - Changing modify acls to: mac
2015-07-28 22:16:47,756 INFO [main] spark.SecurityManager (Logging.scala:logInfo(59)) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mac); users with modify permissions: Set(mac)
2015-07-28 22:16:49,454 INFO [sparkDriver-akka.actor.default-dispatcher-2] slf4j.Slf4jLogger (Slf4jLogger.scala:applyOrElse(80)) - Slf4jLogger started
2015-07-28 22:16:49,695 INFO [sparkDriver-akka.actor.default-dispatcher-2] Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Starting remoting
2015-07-28 22:16:50,167 INFO [sparkDriver-akka.actor.default-dispatcher-2] Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.2.105:49981]
2015-07-28 22:16:50,215 INFO [main] util.Utils (Logging.scala:logInfo(59)) - Successfully started service 'sparkDriver' on port 49981.
2015-07-28 22:16:50,372 INFO [main] spark.SparkEnv (Logging.scala:logInfo(59)) - Registering MapOutputTracker
2015-07-28 22:16:50,596 INFO [main] spark.SparkEnv (Logging.scala:logInfo(59)) - Registering BlockManagerMaster
2015-07-28 22:16:50,948 INFO [main] storage.DiskBlockManager (Logging.scala:logInfo(59)) - Created local directory at /private/var/folders/8k/jfw576r50m97rlk5qpj1n4l80000gn/T/blockmgr-309db4d1-d129-43e5-a90e-12cf51ad491f
2015-07-28 22:16:51,198 INFO [main] storage.MemoryStore (Logging.scala:logInfo(59)) - MemoryStore started with capacity 491.7 MB
2015-07-28 22:16:51,707 INFO [main] spark.HttpFileServer (Logging.scala:logInfo(59)) - HTTP File server directory is /private/var/folders/8k/jfw576r50m97rlk5qpj1n4l80000gn/T/spark-f28e24e7-b798-4365-8209-409d8b27ad2f/httpd-ce32c41d-b618-49e9-bec1-f409454f3679
2015-07-28 22:16:51,777 INFO [main] spark.HttpServer (Logging.scala:logInfo(59)) - Starting HTTP Server
2015-07-28 22:16:52,091 INFO [main] server.Server (Server.java:doStart(272)) - jetty-8.1.14.v20131031
2015-07-28 22:16:52,116 INFO [main] server.AbstractConnector (AbstractConnector.java:doStart(338)) - Started SocketConnector@0.0.0.0:49982
2015-07-28 22:16:52,116 INFO [main] util.Utils (Logging.scala:logInfo(59)) - Successfully started service 'HTTP file server' on port 49982.
2015-07-28 22:16:52,249 INFO [main] spark.SparkEnv (Logging.scala:logInfo(59)) - Registering OutputCommitCoordinator
2015-07-28 22:16:54,253 INFO [main] server.Server (Server.java:doStart(272)) - jetty-8.1.14.v20131031
2015-07-28 22:16:54,315 INFO [main] server.AbstractConnector (AbstractConnector.java:doStart(338)) - Started SelectChannelConnector@0.0.0.0:4040
2015-07-28 22:16:54,317 INFO [main] util.Utils (Logging.scala:logInfo(59)) - Successfully started service 'SparkUI' on port 4040.
2015-07-28 22:16:54,386 INFO [main] ui.SparkUI (Logging.scala:logInfo(59)) - Started SparkUI at http://192.168.2.105:4040
2015-07-28 22:16:54,924 WARN [main] metrics.MetricsSystem (Logging.scala:logWarning(71)) - Using default name DAGScheduler for source because spark.app.id is not set.
2015-07-28 22:16:55,132 INFO [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077...
2015-07-28 22:16:55,392 WARN [sparkDriver-akka.actor.default-dispatcher-14] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@myhost:7077]
2015-07-28 22:16:55,412 WARN [sparkDriver-akka.actor.default-dispatcher-14] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster@myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@myhost:7077]] Caused by: [myhost: unknown error]
2015-07-28 22:16:55,447 WARN [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/), Path(/user/Master)]
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266)
at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533)
at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569)
at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559)
at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
at akka.remote.EndpointWriter.postStop(Endpoint.scala:557)
at akka.actor.Actor$class.aroundPostStop(Actor.scala:477)
at akka.remote.EndpointActor.aroundPostStop(Endpoint.scala:411)
at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
at akka.actor.ActorCell.terminate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
2015-07-28 22:17:15,459 INFO [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077...
2015-07-28 22:17:15,463 WARN [sparkDriver-akka.actor.default-dispatcher-14] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@myhost:7077]
2015-07-28 22:17:15,464 WARN [sparkDriver-akka.actor.default-dispatcher-2] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster@myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@myhost:7077]] Caused by: [myhost: unknown error]
2015-07-28 22:17:15,464 WARN [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/), Path(/user/Master)]
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266)
at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533)
at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569)
at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559)
at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
at akka.remote.EndpointWriter.postStop(Endpoint.scala:557)
at akka.actor.Actor$class.aroundPostStop(Actor.scala:477)
at akka.remote.EndpointActor.aroundPostStop(Endpoint.scala:411)
at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
at akka.actor.ActorCell.terminate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
2015-07-28 22:17:35,136 INFO [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077...
2015-07-28 22:17:35,141 WARN [sparkDriver-akka.actor.default-dispatcher-13] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@myhost:7077]
2015-07-28 22:17:35,142 WARN [sparkDriver-akka.actor.default-dispatcher-13] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster@myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@myhost:7077]] Caused by: [myhost: unknown error]
2015-07-28 22:17:35,142 WARN [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/), Path(/user/Master)]
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266)
at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533)
at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569)
at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559)
at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
at akka.remote.EndpointWriter.postStop(Endpoint.scala:557)
at akka.actor.Actor$class.aroundPostStop(Actor.scala:477)
at akka.remote.EndpointActor.aroundPostStop(Endpoint.scala:411)
at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
at akka.actor.ActorCell.terminate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
2015-07-28 22:17:35,462 INFO [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077...
2015-07-28 22:17:35,464 WARN [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster#myhost:7077/), Path(/user/Master)]
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266)
at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533)
at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569)
at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559)
at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
at akka.remote.ReliableDeliverySupervisor$$anonfun$gated$1.applyOrElse(Endpoint.scala:335)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.remote.ReliableDeliverySupervisor.aroundReceive(Endpoint.scala:188)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
2015-07-28 22:17:55,135 INFO [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077...
2015-07-28 22:17:55,140 WARN [sparkDriver-akka.actor.default-dispatcher-19] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster#myhost:7077]
2015-07-28 22:17:55,140 WARN [sparkDriver-akka.actor.default-dispatcher-3] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster#myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster#myhost:7077]] Caused by: [myhost: unknown error]
2015-07-28 22:17:55,178 ERROR [appclient-registration-retry-thread] util.SparkUncaughtExceptionHandler (Logging.scala:logError(96)) - Uncaught exception in thread Thread[appclient-registration-retry-thread,5,main]
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask#3db0c61c rejected from java.util.concurrent.ThreadPoolExecutor#33773fda[Running, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 4]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2047)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:96)
at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:95)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
at org.apache.spark.deploy.client.AppClient$ClientEndpoint.tryRegisterAllMasters(AppClient.scala:95)
at org.apache.spark.deploy.client.AppClient$ClientEndpoint.org$apache$spark$deploy$client$AppClient$ClientEndpoint$$registerWithMaster(AppClient.scala:121)
at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2$$anonfun$run$1.apply$mcV$sp(AppClient.scala:132)
at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:1218)
at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2.run(AppClient.scala:124)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2015-07-28 22:17:55,224 INFO [Thread-0] storage.DiskBlockManager (Logging.scala:logInfo(59)) - Shutdown hook called
2015-07-28 22:17:55,241 INFO [Thread-0] util.Utils (Logging.scala:logInfo(59)) - Shutdown hook called
2015-07-28 22:17:55,243 INFO [Thread-0] util.Utils (Logging.scala:logInfo(59)) - Deleting directory /private/var/folders/8k/jfw576r50m97rlk5qpj1n4l80000gn/T/spark-f28e24e7-b798-4365-8209-409d8b27ad2f/userFiles-5ccb1927-1499-4deb-b4b2-92a24d8ab7a3
The problem was that I was trying to start the example app in standalone cluster mode by passing in
-Dspark.master=spark://myhost:7077
as an argument to the JVM. I launched the example app locally using
-Dspark.master=local
and it worked.
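For reference, the same choice can be made programmatically instead of through a JVM flag. A minimal sketch using the Spark Java API (the class name, app name, and sample data are just placeholders):

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class LocalSparkExample {
    public static void main(String[] args) {
        // Equivalent to -Dspark.master=local: driver and executors run in this JVM,
        // so no standalone master needs to be reachable.
        SparkConf conf = new SparkConf()
                .setAppName("local-example")
                .setMaster("local"); // use "local[*]" to run on all available cores
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Smoke test: build a small RDD and count it.
        System.out.println(sc.parallelize(Arrays.asList(1, 2, 3)).count());
        sc.stop();
    }
}

Note that a master set directly on the SparkConf takes precedence over one passed on the command line, so this is best kept to local experiments.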
I know this is an old question, but just in case, for users who come here after installing the Spark chart on a Kubernetes cluster:
after the chart installation, open the Spark UI on localhost:8080,
figure out the Spark master name, for example: Spark Master at spark://newbie-cricket-master:7077,
then on the master run: /bin/spark-shell --master spark://newbie-cricket-master:7077

Maps Pro working in Jasper Studio but not on IBM WAS Server

I installed the TIBCO Jaspersoft Studio Professional - Visual Designer for JasperReports version 6.0.1 30 Day Trial. I was able to create simple bar and pie charts, and after compiling the jrxml into jasper I was also able to deploy them on the IBM WAS server; all of them worked like a charm. But when I created a chart using Maps Pro, I was able to view it correctly in Preview, yet when I compiled it to jasper and deployed it to the IBM WAS server it didn't work.
In my view, the following was the root cause:
Caused by: java.lang.ClassNotFoundException: com.jaspersoft.jasperreports.fusion.maps.StandardMapComponent from [Module "deployment.siperian-mrm.ear.zds-gui.war:main" from Service Module Loader].
Does this not work in the trial version, or am I doing something wrong, or do I need some classpath settings?
Following is the detailed log for the error I am getting on the server:
com.siperian.dsapp.domain.common.DomainLevelException: EXCEPTION||An unexpected error occurred.
    at com.siperian.dsapp.mde.domain.report.MDEReportService.generateReport(MDEReportService.java:70)
    at com.siperian.dsapp.mde.jsf.server.report.ReportHandler.handleRequest(ReportHandler.java:61)
    at org.springframework.web.context.support.HttpRequestHandlerServlet.service(HttpRequestHandlerServlet.java:63)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:847)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:295)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
    at com.siperian.dsapp.jsf.server.security.AbstractSecurityFilter.doFilter(AbstractSecurityFilter.java:210)
    at com.siperian.dsapp.jsf.server.security.StandardSecurityFilter.doFilter(StandardSecurityFilter.java:112)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:236)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:246)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:231)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:149)
    at org.jboss.as.web.security.SecurityContextAssociationValve.invoke(SecurityContextAssociationValve.java:169)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:145)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:97)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:102)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:344)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:856)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:653)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:926)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.siperian.dsapp.datasource.report.DBReportServiceException: net.sf.jasperreports.engine.JRException: Class not found when loading object from InputStream
    at com.siperian.dsapp.datasource.report.DBReportService.generateReport(DBReportService.java:34)
    at com.siperian.dsapp.mde.domain.report.MDEReportService.generateReport(MDEReportService.java:68)
    ... 22 more
Caused by: net.sf.jasperreports.engine.JRException: Class not found when loading object from InputStream
    at net.sf.jasperreports.engine.util.JRLoader.loadObject(JRLoader.java:253)
    at net.sf.jasperreports.engine.util.JRLoader.loadObject(JRLoader.java:229)
    at net.sf.jasperreports.engine.JasperFillManager.fill(JasperFillManager.java:405)
    at net.sf.jasperreports.engine.JasperFillManager.fillReport(JasperFillManager.java:824)
    at com.siperian.dsapp.datasource.report.DBReportService.getJasperPrint(DBReportService.java:40)
    at com.siperian.dsapp.datasource.report.DBReportService.generateReport(DBReportService.java:25)
    ... 23 more
Caused by: java.lang.ClassNotFoundException: com.jaspersoft.jasperreports.fusion.maps.StandardMapComponent from [Module "deployment.siperian-mrm.ear.zds-gui.war:main" from Service Module Loader]
    at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:213)
    at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:459)
    at org.jboss.modules.ConcurrentClassLoader.performLoadClassChecked(ConcurrentClassLoader.java:408)
    at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:389)
    at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:134)
[2015-03-04 18:17:55,846] [http-/192.168.1.14:8080-5] [ERROR] com.siperian.dsapp.mde.jsf.server.report.ReportHandler: SIP-49000
Following is the header in my jrxml:
<jasperReport xmlns="http://jasperreports.sourceforge.net/jasperreports" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://jasperreports.sourceforge.net/jasperreports http://jasperreports.sourceforge.net/xsd/jasperreport.xsd" name="SupplierStateWiseMap" pageWidth="900" pageHeight="250" orientation="Landscape" columnWidth="900" leftMargin="0" rightMargin="0" topMargin="0" bottomMargin="0" uuid="88aacb86-d339-42d7-9586-69cebc5e763f">
Any help would be highly appreciated.
To me it appears related to missing required libraries for the JasperReports PRO components. Fusion Charts/Widgets/Maps are, together with HTML5 charts, part of the Jaspersoft Professional package.
I'm not sure how your server is configured, but you should check that you have the related jars + license information (even if trial) + the jasperreports properties set.
Did you try to test your reports in a trial installation of JasperReports Server PRO?
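The stack trace in the question already shows where it breaks: JRLoader deserializes the compiled report, and that is the first point where the Pro component classes must be resolvable. A minimal sketch to reproduce the check outside the server (the .jasper resource path is an assumption based on the report name in your jrxml):

import java.io.InputStream;
import net.sf.jasperreports.engine.JasperReport;
import net.sf.jasperreports.engine.util.JRLoader;

public class LoadReportCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical path; adjust to wherever the compiled report lives.
        try (InputStream in =
                LoadReportCheck.class.getResourceAsStream("/SupplierStateWiseMap.jasper")) {
            // Fails with "Class not found when loading object from InputStream"
            // if com.jaspersoft.jasperreports.fusion.maps.StandardMapComponent
            // (from the Pro/Fusion jars) is not on the classpath.
            JasperReport report = (JasperReport) JRLoader.loadObject(in);
            System.out.println("Loaded report: " + report.getName());
        }
    }
}

Running this with only the community jasperreports jar on the classpath should fail the same way as on the server; adding the Fusion jars should make it pass.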
Regards,
Massimo.

understanding wildfly hornetq console output

I need help understanding the HornetQ console output. I am running JMS in WildFly 8.1, and in the logs I see this:
INFO [stdout] (Thread-1049 (HornetQ-client-global-threads-564338750))
INFO [stdout] (Thread-1050 (HornetQ-client-global-threads-564338750))
INFO [stdout] (Thread-1051 (HornetQ-client-global-threads-564338750))
Does the string 'Thread-1050' mean that this is the 1050th thread created?
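In case it helps: a java.lang.Thread created without an explicit name is named "Thread-" plus a JVM-wide incrementing counter, so the number tracks how many unnamed threads that JVM has created so far, not an index within the pool; whether HornetQ's client thread factory relies on this default naming is an assumption on my part. A tiny sketch of the default behaviour:

public class DefaultThreadNames {
    public static void main(String[] args) {
        // Unnamed threads get "Thread-" plus a JVM-wide incrementing counter;
        // the counter advances at construction, even if the threads never start.
        System.out.println(new Thread().getName()); // typically Thread-0
        System.out.println(new Thread().getName()); // typically Thread-1
        System.out.println(new Thread().getName()); // typically Thread-2
    }
}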