Increase proxy timeout in WildFly server (JBoss)

How can I increase the proxy timeout in WildFly server version 16?
I have tried many ways of changing the standalone.xml file, but none of them worked.
23:46:08,144 INFO [stdout] (default task-6) org.springframework.web.client.HttpServerErrorException$BadGateway: 502 Proxy Error
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.HttpServerErrorException.create(HttpServerErrorException.java:83) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.DefaultResponseErrorHandler.handleError(DefaultResponseErrorHandler.java:124) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.DefaultResponseErrorHandler.handleError(DefaultResponseErrorHandler.java:102) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.ResponseErrorHandler.handleError(ResponseErrorHandler.java:63) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.handleResponse(RestTemplate.java:777) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:735) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:669) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:578) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.serviceclient.RestClient.processRestCall(RestClient.java:46) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.serviceclient.EswitchClient.processRequest(EswitchClient.java:86) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl.getUserActiveCreditCardList(BillerMgtServiceImpl.java:259) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl.getUserCardDtoList(BillerMgtServiceImpl.java:241) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl.getUserBillerList(BillerMgtServiceImpl.java:209) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl$$FastClassBySpringCGLIB$$c2d245c9.invoke(<generated>) ~[IB-1.00.jar:?]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:749) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.MethodBeforeAdviceInterceptor.invoke(MethodBeforeAdviceInterceptor.java:56) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,144 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:93) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib.service.billermgt.impl.BillerMgtServiceImpl$$EnhancerBySpringCGLIB$$da6b8f8.getUserBillerList(<generated>) ~[IB-1.00.jar:?]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib_rest.api.billermgt.BillerMgtRestController.postGetUserBillerList(BillerMgtRestController.java:82) ~[classes:?]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib_rest.api.billermgt.BillerMgtRestController$$FastClassBySpringCGLIB$$7e2bd8a4.invoke(<generated>) ~[classes:?]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:749) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.MethodBeforeAdviceInterceptor.invoke(MethodBeforeAdviceInterceptor.java:56) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:55) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:93) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) ~[spring-aop-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,149 INFO [stdout] (default task-6) at com.epic.edbs.ib_rest.api.billermgt.BillerMgtRestController$$EnhancerBySpringCGLIB$$ddfae7a9.postGetUserBillerList(<generated>) ~[classes:?]
23:46:08,149 INFO [stdout] (default task-6) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_211]
23:46:08,149 INFO [stdout] (default task-6) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_211]
23:46:08,149 INFO [stdout] (default task-6) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_211]
23:46:08,149 INFO [stdout] (default task-6) at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_211]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:189) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138) ~[spring-web-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:102) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:800) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1038) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:908) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at javax.servlet.http.HttpServlet.service(HttpServlet.java:706) ~[jboss-servlet-api_4.0_spec-1.0.0.Final.jar!/:1.0.0.Final]
23:46:08,150 INFO [stdout] (default task-6) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882) ~[spring-webmvc-5.1.3.RELEASE.jar:5.1.3.RELEASE]
23:46:08,150 INFO [stdout] (default task-6) at javax.servlet.http.HttpServlet.service(HttpServlet.java:791) ~[jboss-servlet-api_4.0_spec-1.0.0.Final.jar!/:1.0.0.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78) ~[?:?]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:132) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,150 INFO [stdout] (default task-6) at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.GlobalRequestControllerHandler.handleRequest(GlobalRequestControllerHandler.java:68) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) ~[undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:292) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:81) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:138) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:135) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43) ~[undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.security.SecurityContextThreadSetupAction.lambda$create$0(SecurityContextThreadSetupAction.java:105) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1502) ~[?:?]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:272) [undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:81) [undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:104) [undertow-servlet-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.Connectors.executeRootHandler(Connectors.java:364) [undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830) [undertow-core-2.0.19.Final.jar!/:2.0.19.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.ContextClassLoaderSavingRunnable.run(ContextClassLoaderSavingRunnable.java:35) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.EnhancedQueueExecutor.safeRun(EnhancedQueueExecutor.java:1982) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.doRunTask(EnhancedQueueExecutor.java:1486) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,152 INFO [stdout] (default task-6) at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1348) [jboss-threads-2.3.3.Final.jar!/:2.3.3.Final]
23:46:08,154 INFO [stdout] (default task-6) at java.lang.Thread.run(Thread.java:748) [?:1.8.0_211]
I get the 502 Bad Gateway error shown above. I want to increase the proxy timeout in the WildFly server. Is that possible, and if so, how is it done? Thanks in advance.
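For reference, a 502 Proxy Error is usually produced by the proxy sitting in front of WildFly, so the timeout typically has to be raised there rather than in WildFly itself. A sketch of both common setups follows; the directive and attribute names are real, but the proxy name `my-proxy`, the host names, and the timeout values are illustrative assumptions, not taken from the question:

```shell
# Case 1: Apache httpd with mod_proxy in front of WildFly.
# ProxyTimeout (seconds) controls how long httpd waits for the backend
# before returning "502 Proxy Error".
#
#   ProxyTimeout 300
#   ProxyPass /app http://wildfly-host:8080/app timeout=300

# Case 2: an Undertow reverse-proxy handler configured inside WildFly's
# standalone.xml. Its max-request-time attribute (milliseconds) bounds how
# long a proxied request may take; 0 means unlimited. Assuming a handler
# named "my-proxy" exists, it can be raised with jboss-cli:
$JBOSS_HOME/bin/jboss-cli.sh --connect \
  '/subsystem=undertow/configuration=handler/reverse-proxy=my-proxy:write-attribute(name=max-request-time,value=300000)'
$JBOSS_HOME/bin/jboss-cli.sh --connect ':reload'
```

Whichever proxy is in use, the backend call (here the RestTemplate request) must also be allowed to run that long, so any client-side connect/read timeouts need to be at least as large.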

Related

ERROR SparkContext: Error initializing SparkContext locally

I'm new to everything related to Java/Scala/Maven/sbt/Spark, so please bear with me.
I managed to get everything up and running, or at least so it seems when running spark-shell locally: the SparkContext gets initialized properly and I can create RDDs.
However, when I call spark-submit locally, I get a SparkContext error:
%SPARK_HOME%\bin\spark-submit --master local --class example.bd.MyApp target\scala-2.12\sparkapp_2.12-1.5.5.jar
This is my code:

package example.bd

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object MyApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("My first Spark application")
    val sc = new SparkContext(conf)
    println("hi everyone")
  }
}
These are my error logs:
21/10/24 18:38:45 WARN Shell: Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814)
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791)
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
... 21 more
21/10/24 18:38:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/10/24 18:38:45 INFO SparkContext: Running Spark version 3.1.2
21/10/24 18:38:45 INFO ResourceUtils: ==============================================================
21/10/24 18:38:45 INFO ResourceUtils: No custom resources configured for spark.driver.
21/10/24 18:38:45 INFO ResourceUtils: ==============================================================
21/10/24 18:38:45 INFO SparkContext: Submitted application: My first Spark application
21/10/24 18:38:45 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
21/10/24 18:38:45 INFO ResourceProfile: Limiting resource is cpu
21/10/24 18:38:45 INFO ResourceProfileManager: Added ResourceProfile id: 0
21/10/24 18:38:45 INFO SecurityManager: Changing view acls to: User
21/10/24 18:38:45 INFO SecurityManager: Changing modify acls to: User
21/10/24 18:38:45 INFO SecurityManager: Changing view acls groups to:
21/10/24 18:38:45 INFO SecurityManager: Changing modify acls groups to:
21/10/24 18:38:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(User); groups with view permissions: Set(); users with modify permissions: Set(User); groups with modify permissions: Set()
21/10/24 18:38:46 INFO Utils: Successfully started service 'sparkDriver' on port 56899.
21/10/24 18:38:46 INFO SparkEnv: Registering MapOutputTracker
21/10/24 18:38:46 INFO SparkEnv: Registering BlockManagerMaster
21/10/24 18:38:46 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/10/24 18:38:46 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/10/24 18:38:46 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/10/24 18:38:46 INFO DiskBlockManager: Created local directory at C:\Users\User\AppData\Local\Temp\blockmgr-8df572ec-4206-48ae-b517-bd56242fca4c
21/10/24 18:38:46 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
21/10/24 18:38:46 INFO SparkEnv: Registering OutputCommitCoordinator
21/10/24 18:38:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/10/24 18:38:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://LAPTOP-RKJJV5E4:4040
21/10/24 18:38:46 INFO SparkContext: Added JAR file:/C:/[path]/sparkapp/target/scala-2.12/sparkapp_2.12-1.5.5.jar at spark://LAPTOP-RKJJV5E4:56899/jars/sparkapp_2.12-1.5.5.jar with timestamp 1635093525663
21/10/24 18:38:46 INFO Executor: Starting executor ID driver on host LAPTOP-RKJJV5E4
21/10/24 18:38:46 INFO Executor: Fetching spark://LAPTOP-RKJJV5E4:56899/jars/sparkapp_2.12-1.5.5.jar with timestamp 1635093525663
21/10/24 18:38:46 INFO TransportClientFactory: Successfully created connection to LAPTOP-RKJJV5E4/192.168.0.175:56899 after 30 ms (0 ms spent in bootstraps)
21/10/24 18:38:46 INFO Utils: Fetching spark://LAPTOP-RKJJV5E4:56899/jars/sparkapp_2.12-1.5.5.jar to C:\Users\User\AppData\Local\Temp\spark-9c13f31f-92a7-4fc5-a87d-a0ae6410e6d2\userFiles-5ac95e31-3656-4a4d-a205-e0750c041bcb\fetchFileTemp7994667570718611461.tmp
21/10/24 18:38:46 ERROR SparkContext: Error initializing SparkContext.
java.lang.RuntimeException: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:736)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:271)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1120)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1106)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:563)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13(Executor.scala:953)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13$adapted(Executor.scala:945)
at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:877)
at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:876)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:945)
at org.apache.spark.executor.Executor.<init>(Executor.scala:247)
at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
at example.bd.MyApp$.main(MyApp.scala:10)
at example.bd.MyApp.main(MyApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814)
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791)
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
... 6 more
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
... 21 more
21/10/24 18:38:46 INFO SparkUI: Stopped Spark web UI at http://LAPTOP-RKJJV5E4:4040
21/10/24 18:38:46 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:173)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:144)
at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:881)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2370)
at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2069)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1419)
at org.apache.spark.SparkContext.stop(SparkContext.scala:2069)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:671)
at example.bd.MyApp$.main(MyApp.scala:10)
at example.bd.MyApp.main(MyApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
21/10/24 18:38:46 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/10/24 18:38:46 INFO MemoryStore: MemoryStore cleared
21/10/24 18:38:46 INFO BlockManager: BlockManager stopped
21/10/24 18:38:46 INFO BlockManagerMaster: BlockManagerMaster stopped
21/10/24 18:38:46 WARN MetricsSystem: Stopping a MetricsSystem that is not running
21/10/24 18:38:46 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/10/24 18:38:46 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.RuntimeException: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:736)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:271)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1120)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:1106)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:563)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13(Executor.scala:953)
at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13$adapted(Executor.scala:945)
at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:877)
at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:876)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:945)
at org.apache.spark.executor.Executor.<init>(Executor.scala:247)
at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
at example.bd.MyApp$.main(MyApp.scala:10)
at example.bd.MyApp.main(MyApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814)
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791)
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
... 6 more
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
... 21 more
21/10/24 18:38:47 ERROR Utils: Uncaught exception in thread shutdown-hook-0
java.lang.NullPointerException
at org.apache.spark.executor.Executor.$anonfun$stop$3(Executor.scala:332)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:222)
at org.apache.spark.executor.Executor.stop(Executor.scala:332)
at org.apache.spark.executor.Executor.$anonfun$new$2(Executor.scala:76)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
21/10/24 18:38:47 INFO ShutdownHookManager: Shutdown hook called
21/10/24 18:38:47 INFO ShutdownHookManager: Deleting directory C:\Users\User\AppData\Local\Temp\spark-ac076276-e6ab-4cfb-a153-56ad35c46d56
21/10/24 18:38:47 INFO ShutdownHookManager: Deleting directory C:\Users\User\AppData\Local\Temp\spark-9c13f31f-92a7-4fc5-a87d-a0ae6410e6d2
As far as I can tell the errors are directly related to Hadoop not being set up, but that shouldn't be an issue given that I am trying to run it locally, no?
Any help will be greatly appreciated.
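The `HADOOP_HOME and hadoop.home.dir are unset` error does occur even in local mode on Windows, because Hadoop's `Shell` class looks for `%HADOOP_HOME%\bin\winutils.exe` whenever Spark touches the local filesystem. A minimal sketch of the usual workaround, assuming winutils.exe has been downloaded to a hypothetical `C:\hadoop\bin` (the path and helper name here are placeholders, not part of the question):

```java
// Sketch: point Hadoop at a local winutils install before the
// SparkContext is created. The fallback path is a placeholder --
// adjust it to wherever winutils.exe actually lives.
public class HadoopHomeFix {

    // Returns the effective Hadoop home, setting the hadoop.home.dir
    // system property from the fallback when neither it nor the
    // HADOOP_HOME environment variable is present.
    static String ensureHadoopHome(String fallback) {
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            home = System.getProperty("hadoop.home.dir");
        }
        if (home == null) {
            // Must run before any org.apache.hadoop class is loaded,
            // since Shell resolves the path in its static initializer.
            System.setProperty("hadoop.home.dir", fallback);
            home = fallback;
        }
        return home;
    }

    public static void main(String[] args) {
        System.out.println(ensureHadoopHome("C:\\hadoop"));
    }
}
```

Alternatively, set `HADOOP_HOME=C:\hadoop` as a system environment variable so spark-submit picks it up without code changes.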

Drools session will not load after Wildfly Upgrade to 19.0.0

I have a Drools implementation using Drools 7.12.0.Final that works perfectly under WildFly-10.1.0.Final. We are now trying to upgrade to WildFly-19.0.0.Final and the Drools module will not load. I see this in the logs as WildFly tries to start the Drools module:
2020-06-02 16:31:27,214 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Loading kie.conf from vfs:/C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/drools-compiler-7.12.0.Final.jar/META-INF/kie.conf in classloader ModuleClassLoader for Module "deployment.blsAccountPlusDE-1.0.0-SNAPSHOT.ear.bls-wsa-1.0.0-SNAPSHOT.war" from Service Module Loader
2020-06-02 16:31:27,220 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.drools.compiler.kie.builder.impl.KieServicesImpl
2020-06-02 16:31:27,223 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.drools.compiler.builder.impl.KnowledgeBuilderFactoryServiceImpl
2020-06-02 16:31:27,223 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Loading kie.conf from vfs:/C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/drools-core-7.12.0.Final.jar/META-INF/kie.conf in classloader ModuleClassLoader for Module "deployment.blsAccountPlusDE-1.0.0-SNAPSHOT.ear.bls-wsa-1.0.0-SNAPSHOT.war" from Service Module Loader
2020-06-02 16:31:27,225 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.drools.core.io.impl.ResourceFactoryServiceImpl
2020-06-02 16:31:27,227 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.drools.core.marshalling.impl.MarshallerProviderImpl
2020-06-02 16:31:27,228 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.drools.core.concurrent.ExecutorProviderImpl
2020-06-02 16:31:27,228 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Loading kie.conf from vfs:/C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/kie-internal-7.12.0.Final.jar/META-INF/kie.conf in classloader ModuleClassLoader for Module "deployment.blsAccountPlusDE-1.0.0-SNAPSHOT.ear.bls-wsa-1.0.0-SNAPSHOT.war" from Service Module Loader
2020-06-02 16:31:27,230 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.kie.internal.services.KieAssemblersImpl
2020-06-02 16:31:27,231 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.kie.internal.services.KieRuntimesImpl
2020-06-02 16:31:27,232 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.kie.internal.services.KieWeaversImpl
2020-06-02 16:31:27,234 INFO [org.kie.api.internal.utils.ServiceDiscoveryImpl] (default task-1) Adding Service org.kie.internal.services.KieBeliefsImpl
2020-06-02 16:31:27,253 INFO [org.drools.compiler.kie.builder.impl.ClasspathKieProject] (default task-1) Found kmodule: vfs:/C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/bls-mrv-de-1.0.0-SNAPSHOT.jar/META-INF/kmodule.xml
2020-06-02 16:31:27,260 ERROR [org.drools.compiler.kie.builder.impl.ClasspathKieProject] (default task-1) Error when reading virtual file from vfs:/C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/bls-mrv-de-1.0.0-SNAPSHOT.jar/META-INF/kmodule.xml: java.lang.IllegalArgumentException: object is not an instance of declaring class
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.drools.compiler.kie.builder.impl.ClasspathKieProject.getPathForVFS(ClasspathKieProject.java:423)
at org.drools.compiler.kie.builder.impl.ClasspathKieProject.fixURLFromKProjectPath(ClasspathKieProject.java:368)
at org.drools.compiler.kie.builder.impl.ClasspathKieProject.fetchKModule(ClasspathKieProject.java:141)
at org.drools.compiler.kie.builder.impl.ClasspathKieProject.discoverKieModules(ClasspathKieProject.java:112)
at org.drools.compiler.kie.builder.impl.ClasspathKieProject.init(ClasspathKieProject.java:84)
at org.drools.compiler.kie.builder.impl.KieContainerImpl.<init>(KieContainerImpl.java:131)
at org.drools.compiler.kie.builder.impl.KieServicesImpl.newKieClasspathContainer(KieServicesImpl.java:135)
at org.drools.compiler.kie.builder.impl.KieServicesImpl.getKieClasspathContainer(KieServicesImpl.java:101)
at org.drools.compiler.kie.builder.impl.KieServicesImpl.getKieClasspathContainer(KieServicesImpl.java:79)
at de.egf.bls.drools.manager.DroolsStatefulRuleEngine.<init>(DroolsStatefulRuleEngine.java:24)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:170)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:87)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1224)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1131)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:541)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:501)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:228)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:204)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.autowireResource(CommonAnnotationBeanPostProcessor.java:513)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.getResource(CommonAnnotationBeanPostProcessor.java:484)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor$ResourceElement.getResourceToInject(CommonAnnotationBeanPostProcessor.java:618)
at org.springframework.beans.factory.annotation.InjectionMetadata$InjectedElement.inject(InjectionMetadata.java:177)
at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:91)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.postProcessPropertyValues(CommonAnnotationBeanPostProcessor.java:318)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1344)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:578)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:501)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:228)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:204)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.autowireResource(CommonAnnotationBeanPostProcessor.java:513)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.getResource(CommonAnnotationBeanPostProcessor.java:484)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor$ResourceElement.getResourceToInject(CommonAnnotationBeanPostProcessor.java:618)
at org.springframework.beans.factory.annotation.InjectionMetadata$InjectedElement.inject(InjectionMetadata.java:177)
at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:91)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.postProcessPropertyValues(CommonAnnotationBeanPostProcessor.java:318)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1344)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:578)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:501)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:228)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.aop.framework.ProxyFactoryBean.freshTargetSource(ProxyFactoryBean.java:588)
at org.springframework.aop.framework.ProxyFactoryBean.getSingletonInstance(ProxyFactoryBean.java:319)
at org.springframework.aop.framework.ProxyFactoryBean.getObject(ProxyFactoryBean.java:254)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:163)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:101)
at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1645)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getObjectForBeanInstance(AbstractAutowireCapableBeanFactory.java:1178)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:327)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.aop.framework.ProxyFactoryBean.freshTargetSource(ProxyFactoryBean.java:588)
at org.springframework.aop.framework.ProxyFactoryBean.getSingletonInstance(ProxyFactoryBean.java:319)
at org.springframework.aop.framework.ProxyFactoryBean.getObject(ProxyFactoryBean.java:254)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:163)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:101)
at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1645)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getObjectForBeanInstance(AbstractAutowireCapableBeanFactory.java:1178)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:327)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1089)
at de.egf.bls.wsa.endpoint.BlsWsaEndpoint.getParameterTypes(BlsWsaEndpoint.java:110)
at de.egf.bls.wsa.endpoint.BlsWsaEndpoint.createCommandTO(BlsWsaEndpoint.java:87)
at de.egf.bls.wsa.endpoint.BlsWsaEndpoint.invoke(BlsWsaEndpoint.java:52)
at org.springframework.ws.server.endpoint.adapter.PayloadEndpointAdapter.invoke(PayloadEndpointAdapter.java:50)
at org.springframework.ws.server.MessageDispatcher.dispatch(MessageDispatcher.java:236)
at org.springframework.ws.server.MessageDispatcher.receive(MessageDispatcher.java:176)
at org.springframework.ws.transport.support.WebServiceMessageReceiverObjectSupport.handleConnection(WebServiceMessageReceiverObjectSupport.java:89)
at org.springframework.ws.transport.http.WebServiceMessageReceiverHandlerAdapter.handle(WebServiceMessageReceiverHandlerAdapter.java:61)
at org.springframework.ws.transport.http.MessageDispatcherServlet.doService(MessageDispatcherServlet.java:293)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:974)
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:877)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:523)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:851)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:590)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129)
at io.opentracing.contrib.jaxrs2.server.SpanFinishingFilter.doFilter(SpanFinishingFilter.java:52)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68)
at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:132)
at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46)
at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64)
at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60)
at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77)
at io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50)
at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at org.wildfly.extension.undertow.deployment.GlobalRequestControllerHandler.handleRequest(GlobalRequestControllerHandler.java:68)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:269)
at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:133)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:130)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at org.wildfly.extension.undertow.security.SecurityContextThreadSetupAction.lambda$create$0(SecurityContextThreadSetupAction.java:105)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1541)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1541)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1541)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1541)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1541)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:249)
at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:99)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:376)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830)
at org.jboss.threads.ContextClassLoaderSavingRunnable.run(ContextClassLoaderSavingRunnable.java:35)
at org.jboss.threads.EnhancedQueueExecutor.safeRun(EnhancedQueueExecutor.java:1982)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.doRunTask(EnhancedQueueExecutor.java:1486)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1377)
at java.lang.Thread.run(Thread.java:748)
2020-06-02 16:31:27,438 WARN [org.drools.compiler.kie.builder.impl.ClasspathKieProject] (default task-1) Unable to find pom.properties in /C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/bls-mrv-de-1.0.0-SNAPSHOT.jar/META-INF/kmodule.xml
2020-06-02 16:31:27,439 WARN [org.drools.compiler.kie.builder.impl.ClasspathKieProject] (default task-1) As folder project tried to fall back to pom.xml, but could not find one
2020-06-02 16:31:27,440 WARN [org.drools.compiler.kie.builder.impl.ClasspathKieProject] (default task-1) Unable to load pom.properties from/C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/bls-mrv-de-1.0.0-SNAPSHOT.jar/META-INF/kmodule.xml
2020-06-02 16:31:27,441 WARN [org.drools.compiler.kie.builder.impl.ClasspathKieProject] (default task-1) Cannot find maven pom properties for this project. Using the container's default ReleaseId
2020-06-02 16:31:27,446 ERROR [org.drools.compiler.kie.builder.impl.ClasspathKieProject] (default task-1) Unable to build index of kmodule.xml url=vfs:/C:/Users/roger.varley/Downloads/wildfly-19.0.0.Final/bin/content/blsAccountPlusDE-1.0.0-SNAPSHOT.ear/lib/bls-mrv-de-1.0.0-SNAPSHOT.jar/META-INF/kmodule.xml
Unable to get all ZipFile entries: C:\Users\roger.varley\Downloads\wildfly-19.0.0.Final\bin\content\blsAccountPlusDE-1.0.0-SNAPSHOT.ear\lib\bls-mrv-de-1.0.0-SNAPSHOT.jar\META-INF\kmodule.xml
My kmodule.xml looks like:
<?xml version="1.0" encoding="UTF-8"?>
<kmodule xmlns="http://www.drools.org/xsd/kmodule">
    <kbase packages="de.egf.bls.mrv.de.metervalidation.active.common, de.egf.bls.mrv.de.metervalidation.active.daily">
        <ksession name="DailyMeterReadValidation" />
    </kbase>
    <kbase packages="de.egf.bls.mrv.de.metervalidation.active.common, de.egf.bls.mrv.de.metervalidation.active.full">
        <ksession name="FullMeterReadValidation" />
    </kbase>
</kmodule>
I know going from 10.1.0 to 19.0.0 is a big jump, but I have no choice in the matter. Can anyone point me to what is wrong? I've seen some Stack Overflow posts suggesting that the xmlns should be changed to
<kmodule xmlns="http://jboss.org/kie/6.0.0/kmodule">
but that has made no difference. Any help would be appreciated.
The error was happening in ClasspathKieProject when Drools tried to use org.jboss.vfs.VirtualFile to read the kmodule.xml. It is caused by changes to the JBoss Virtual File System that WildFly uses (details of the fix are at https://github.com/kiegroup/drools/commit/2a36f67a29ed06d0f980a60ff1c81fa897e8147a).
Upgrading to the latest Drools (7.37.0.Final at the time of writing) fixes the problem.
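For reference, a minimal sketch of the version bump in the Maven POM. This assumes the project declares the Drools artifacts directly (as the EAR's lib folder in the logs suggests); if a kie-bom is used, bump the BOM version instead:

```xml
<!-- Bump the Drools artifacts from 7.12.0.Final to a release
     containing the VFS fix (7.37.0.Final per the answer above). -->
<dependency>
    <groupId>org.drools</groupId>
    <artifactId>drools-compiler</artifactId>
    <version>7.37.0.Final</version>
</dependency>
<dependency>
    <groupId>org.drools</groupId>
    <artifactId>drools-core</artifactId>
    <version>7.37.0.Final</version>
</dependency>
```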

Hotswap Agent / Wildfly / Weld: Exception when reloading: BeanManagerImpl.contexts not accessible

I'm using WildFly, and have properly configured Hotswap Agent 1.0 as far as I can see. It's starting up nicely with the message:
HOTSWAP AGENT: 20:01:19.893 INFO (org.hotswap.agent.HotswapAgent) - Loading Hotswap agent {1.0} - unlimited runtime class redefinition.
HOTSWAP AGENT: 20:01:21.096 INFO (org.hotswap.agent.config.PluginRegistry) - Discovered plugins: [Hotswapper, WatchResources, AnonymousClassPatch, ClassInitPlugin, Hibernate, Hibernate3JPA, Hibernate3, Spring, Jersey1, Jersey2, Jetty, Tomcat, ZK, Logback, Log4j2, MyFaces, Mojarra, Seam, ELResolver, WildFlyELResolver, OsgiEquinox, Proxy, WebObjects, Weld, JBossModules, ResteasyRegistry, Deltaspike, JavaBeans]
20:01:22,363 INFO [org.jboss.modules] (main) JBoss Modules version 1.5.1.Final
However, every time I modify a class, I get the following WARNING in the console log, and the modified classes are not properly reloaded.
19:56:07,946 INFO [stdout] (Thread-281) HOTSWAP AGENT: 19:56:07.946 WARNING (org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent) - BeanManagerImpl.contexts not accessible
19:56:07,947 INFO [stdout] (Thread-281) java.lang.NoSuchFieldException: contexts
19:56:07,948 INFO [stdout] (Thread-281) at java.lang.Class.getField(Class.java:1703)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.getContexts(BeanDeploymentArchiveAgent.java:269)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.reloadManagedBeanInContexts(BeanDeploymentArchiveAgent.java:318)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.reloadManagedBean(BeanDeploymentArchiveAgent.java:297)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.reloadBean(BeanDeploymentArchiveAgent.java:231)
19:56:07,948 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanDeploymentArchiveAgent.refreshBeanClass(BeanDeploymentArchiveAgent.java:192)
19:56:07,948 INFO [stdout] (Thread-281) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
19:56:07,949 INFO [stdout] (Thread-281) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
19:56:07,949 INFO [stdout] (Thread-281) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
19:56:07,949 INFO [stdout] (Thread-281) at java.lang.reflect.Method.invoke(Method.java:497)
19:56:07,949 INFO [stdout] (Thread-281) at org.hotswap.agent.plugin.weld.command.BeanClassRefreshCommand.executeCommand(BeanClassRefreshCommand.java:93)
19:56:07,949 INFO [stdout] (Thread-281) at org.hotswap.agent.command.impl.CommandExecutor.run(CommandExecutor.java:25)
19:56:07,950 INFO [stdout] (Thread-281)
19:56:08,030 INFO [stdout] (Thread-282) HOTSWAP AGENT: 19:56:08.030 INFO (org.hotswap.agent.plugin.wildfly.el.PurgeWildFlyBeanELResolverCacheCommand) - Cleaning BeanPropertiesCache lgc.mysynap.ui.components.SourcePopup ModuleClassLoader for Module "deployment.mysynap-0.1.war:main" from Service Module Loader.
Any idea what could be the cause?
I had the same problem. Disable all breakpoints while the application is running and the hot swap will work; afterwards you can activate the breakpoints again.
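If the Weld plugin keeps producing this warning even with breakpoints disabled, another workaround is to disable it in HotswapAgent's configuration file. The `hotswap-agent.properties` file and the `disabledPlugins` key are standard HotswapAgent configuration; whether disabling Weld is acceptable for your application (you lose CDI bean reloading, plain class hotswap still works) is an assumption you should verify:

```properties
# hotswap-agent.properties, placed on the application classpath
# (e.g. src/main/resources). Sketch: skip Weld bean reloading and
# rely on plain class redefinition only.
disabledPlugins=Weld

# Optional: raise logging to see which plugin handles each reload.
LOGGER=debug
```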

Spark Internal REST API: Unable to find dependent jars

I am trying to submit a Spark program using the Spark internal REST API.
The request I use to submit the program is below. The required supporting jars are in place.
curl -X POST http://quickstart.cloudera:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "SampleSparkProgramApp" ],
  "appResource" : "file:///home/cloudera/test_sample_example/spark-example.jar",
  "clientSparkVersion" : "1.5.0",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "com.example.SampleSparkProgram",
  "sparkProperties" : {
    "spark.jars" : "file:///home/cloudera/test_sample_example/lib/mongo-hadoop-core-1.0-snapshot.jar,file:///home/cloudera/test_sample_example/lib/mongo-java-driver-3.0.4.jar,file:///home/cloudera/test_sample_example/lib/lucene-analyzers-common-5.4.0.jar,file:///home/cloudera/test_sample_example/lib/lucene-core-5.2.1.jar",
    "spark.driver.supervise" : "false",
    "spark.app.name" : "MyJob",
    "spark.eventLog.enabled" : "true",
    "spark.submit.deployMode" : "client",
    "spark.master" : "spark://quickstart.cloudera:6066"
  }
}'
The class com.mongodb.hadoop.MongoInputFormat is available in mongo-hadoop-core-1.0-snapshot.jar, and that jar is added to the request under the "spark.jars" key.
I am getting the error below in the Spark UI logs.
1.5.0-cdh5.5.0 stderr log page for driver-20160121040910-0026
Launch Command: "/usr/java/jdk1.7.0_67-cloudera/jre/bin/java" "-cp" "/usr/lib/spark/sbin/../conf/:/usr/lib/spark/lib/spark-assembly-1.5.0-cdh5.5.0-hadoop2.6.0-cdh5.5.0.jar:/etc/hadoop/conf/:/usr/lib/spark/sbin/../lib/spark-assembly.jar:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop-mapreduce/lib/*:/usr/lib/hadoop-mapreduce/*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/*:/usr/lib/hive/lib/*:/usr/lib/flume-ng/lib/*:/usr/lib/paquet/lib/*:/usr/lib/avro/lib/*" "-Xms1024M" "-Xmx1024M" "-Dspark.eventLog.enabled=true" "-Dspark.driver.supervise=false" "-Dspark.app.name=MyJob" "-Dspark.jars=file:///home/cloudera/test_sample_example/lib/mongo-hadoop-core-1.0-snapshot.jar,file:///home/cloudera/test_sample_example/lib/mongo-java-driver-3.0.4.jar,file:///home/cloudera/test_sample_example/lib/lucene-analyzers-common-5.4.0.jar,file:///home/cloudera/test_sample_example/lib/lucene-core-5.2.1.jar" "-Dspark.master=spark://quickstart.cloudera:7077" "-Dspark.submit.deployMode=client" "-XX:MaxPermSize=256m" "org.apache.spark.deploy.worker.DriverWrapper" "akka.tcp://sparkWorker#182.162.106.131:7078/user/Worker" "/var/run/spark/work/driver-20160121040910-0026/spark-example.jar" "com.example.SampleSparkProgram" "SampleSparkProgramApp"
========================================
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/01/21 04:09:16 WARN util.Utils: Your hostname, quickstart.cloudera resolves to a loopback address: 127.0.0.1; using 182.162.106.131 instead (on interface eth1)
16/01/21 04:09:16 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/01/21 04:09:19 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/01/21 04:09:21 INFO spark.SecurityManager: Changing view acls to: root
16/01/21 04:09:21 INFO spark.SecurityManager: Changing modify acls to: root
16/01/21 04:09:21 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/01/21 04:09:26 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/01/21 04:09:26 INFO Remoting: Starting remoting
16/01/21 04:09:27 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://Driver#182.162.106.131:38181]
16/01/21 04:09:27 INFO Remoting: Remoting now listens on addresses: [akka.tcp://Driver#182.162.106.131:38181]
16/01/21 04:09:27 INFO util.Utils: Successfully started service 'Driver' on port 38181.
16/01/21 04:09:27 INFO worker.WorkerWatcher: Connecting to worker akka.tcp://sparkWorker#182.162.106.131:7078/user/Worker
16/01/21 04:09:28 INFO spark.SparkContext: Running Spark version 1.5.0-cdh5.5.0
16/01/21 04:09:28 INFO worker.WorkerWatcher: Successfully connected to akka.tcp://sparkWorker#182.162.106.131:7078/user/Worker
16/01/21 04:09:28 INFO spark.SecurityManager: Changing view acls to: root
16/01/21 04:09:28 INFO spark.SecurityManager: Changing modify acls to: root
16/01/21 04:09:28 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/01/21 04:09:29 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/01/21 04:09:29 INFO Remoting: Starting remoting
16/01/21 04:09:29 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver#182.162.106.131:35467]
16/01/21 04:09:29 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriver#182.162.106.131:35467]
16/01/21 04:09:29 INFO util.Utils: Successfully started service 'sparkDriver' on port 35467.
16/01/21 04:09:29 INFO spark.SparkEnv: Registering MapOutputTracker
16/01/21 04:09:30 INFO spark.SparkEnv: Registering BlockManagerMaster
16/01/21 04:09:30 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-6b24e210-4002-4e28-ac60-c2ecc497b914
16/01/21 04:09:30 INFO storage.MemoryStore: MemoryStore started with capacity 534.5 MB
16/01/21 04:09:31 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-bc7bc70e-af91-44cb-a764-8c6d1d9b3acc/httpd-65b7bbf1-af6d-4252-8629-95fcb60f706f
16/01/21 04:09:31 INFO spark.HttpServer: Starting HTTP Server
16/01/21 04:09:31 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/01/21 04:09:31 INFO server.AbstractConnector: Started SocketConnector#0.0.0.0:38126
16/01/21 04:09:31 INFO util.Utils: Successfully started service 'HTTP file server' on port 38126.
16/01/21 04:09:31 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/01/21 04:09:33 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/01/21 04:09:33 INFO server.AbstractConnector: Started SelectChannelConnector#0.0.0.0:4040
16/01/21 04:09:33 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/01/21 04:09:33 INFO ui.SparkUI: Started SparkUI at http://182.162.106.131:4040
16/01/21 04:09:33 INFO spark.SparkContext: Added JAR hdfs:///user/cloudera/sample_example/lib/mongo-hadoop-core-1.0-snapshot.jar at hdfs:///user/cloudera/sample_example/lib/mongo-hadoop-core-1.0-snapshot.jar with timestamp 1453378173778
16/01/21 04:09:33 INFO spark.SparkContext: Added JAR hdfs:///user/cloudera/sample_example/lib/mongo-java-driver-3.0.4.jar at hdfs:///user/cloudera/sample_example/lib/mongo-java-driver-3.0.4.jar with timestamp 1453378173782
16/01/21 04:09:33 INFO spark.SparkContext: Added JAR hdfs:///user/cloudera/sample_example/lib/lucene-analyzers-common-5.4.0.jar at hdfs:///user/cloudera/sample_example/lib/lucene-analyzers-common-5.4.0.jar with timestamp 1453378173783
16/01/21 04:09:33 INFO spark.SparkContext: Added JAR hdfs:///user/cloudera/sample_example/lib/lucene-core-5.2.1.jar at hdfs:///user/cloudera/sample_example/lib/lucene-core-5.2.1.jar with timestamp 1453378173783
16/01/21 04:09:34 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/01/21 04:09:34 INFO client.AppClient$ClientEndpoint: Connecting to master spark://quickstart.cloudera:7077...
16/01/21 04:09:35 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20160121040935-0025
16/01/21 04:09:36 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38749.
16/01/21 04:09:36 INFO netty.NettyBlockTransferService: Server created on 38749
16/01/21 04:09:36 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/01/21 04:09:36 INFO storage.BlockManagerMasterEndpoint: Registering block manager 182.162.106.131:38749 with 534.5 MB RAM, BlockManagerId(driver, 182.162.106.131, 38749)
16/01/21 04:09:36 INFO storage.BlockManagerMaster: Registered BlockManager
16/01/21 04:09:40 INFO scheduler.EventLoggingListener: Logging events to file:/tmp/spark-events/app-20160121040935-0025
16/01/21 04:09:40 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
16/01/21 04:09:40 INFO analyser.NaiveByesAnalyserFactory: ENTERING
16/01/21 04:09:40 INFO dao.MongoDataExtractor: ENTERING
16/01/21 04:09:41 INFO dao.MongoDataExtractor: EXITING
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoClassDefFoundError: com/mongodb/hadoop/MongoInputFormat
at com.examples.dao.MongoDataExtractor.getData(MongoDataExtractor.java:35)
at com.examples.analyser.NaiveByesAnalyserFactory.getNaiveByesAnalyserFactory(NaiveByesAnalyserFactory.java:27)
at com.example.SampleSparkProgram.main(SampleSparkProgram.java:24)
... 6 more
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.MongoInputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 9 more
16/01/21 04:09:41 INFO spark.SparkContext: Invoking stop() from shutdown hook
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/01/21 04:09:41 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/01/21 04:09:41 INFO ui.SparkUI: Stopped Spark web UI at http://182.162.106.131:4040
16/01/21 04:09:41 INFO scheduler.DAGScheduler: Stopping DAGScheduler
16/01/21 04:09:41 INFO cluster.SparkDeploySchedulerBackend: Shutting down all executors
16/01/21 04:09:41 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to shut down
16/01/21 04:09:41 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/01/21 04:09:41 INFO storage.MemoryStore: MemoryStore cleared
16/01/21 04:09:41 INFO storage.BlockManager: BlockManager stopped
16/01/21 04:09:41 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/01/21 04:09:41 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/01/21 04:09:41 INFO spark.SparkContext: Successfully stopped SparkContext
16/01/21 04:09:41 INFO util.ShutdownHookManager: Shutdown hook called
16/01/21 04:09:41 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-bc7bc70e-af91-44cb-a764-8c6d1d9b3acc
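The NoClassDefFoundError happens in the driver's main method: when a submission goes through the standalone REST endpoint, the driver runs on a worker via DriverWrapper, and jars listed in spark.jars are distributed to executors but are not automatically prepended to the driver JVM's own classpath. One thing worth trying (an assumption, not a verified fix for this exact setup) is to also set spark.driver.extraClassPath in sparkProperties, pointing at jar paths that exist on the worker machines:

```json
{
  "sparkProperties" : {
    "spark.jars" : "file:///home/cloudera/test_sample_example/lib/mongo-hadoop-core-1.0-snapshot.jar,file:///home/cloudera/test_sample_example/lib/mongo-java-driver-3.0.4.jar",
    "spark.driver.extraClassPath" : "/home/cloudera/test_sample_example/lib/mongo-hadoop-core-1.0-snapshot.jar:/home/cloudera/test_sample_example/lib/mongo-java-driver-3.0.4.jar"
  }
}
```

spark.driver.extraClassPath is a standard Spark configuration property; the paths above simply reuse the directory from the original request and must be readable on every worker that may host the driver.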

Maps Pro working in Jasper Studio but not on IBM WAS Server

I installed the TIBCO Jaspersoft Studio Professional - Visual Designer for JasperReports, version 6.0.1, 30-day trial. I was able to create simple bar and pie charts, and after compiling the jrxml files into .jasper files I was also able to deploy them to the IBM WAS server; all of them worked like a charm. But when I created a chart using Maps Pro, it displayed correctly in Preview, yet after compiling it and deploying it to the IBM WAS server it did not work.
In my view, the following was the root cause:
Caused by: java.lang.ClassNotFoundException: com.jaspersoft.jasperreports.fusion.maps.StandardMapComponent from [Module "deployment.siperian-mrm.ear.zds-gui.war:main" from Service Module Loader].
Does this not work in the trial version, am I doing something wrong, or do I need some classpath settings?
Following is the detailed log for the error I am getting on the server:
18:17:55,856 INFO [stdout] (AsyncAppender-Dispatcher-Thread-144) com.siperian.dsapp.domain.common.DomainLevelException: EXCEPTION||An unexpected error occurred.
	at com.siperian.dsapp.mde.domain.report.MDEReportService.generateReport(MDEReportService.java:70)
	at com.siperian.dsapp.mde.jsf.server.report.ReportHandler.handleRequest(ReportHandler.java:61)
	at org.springframework.web.context.support.HttpRequestHandlerServlet.service(HttpRequestHandlerServlet.java:63)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:847)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:295)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
	at com.siperian.dsapp.jsf.server.security.AbstractSecurityFilter.doFilter(AbstractSecurityFilter.java:210)
	at com.siperian.dsapp.jsf.server.security.StandardSecurityFilter.doFilter(StandardSecurityFilter.java:112)
	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:236)
	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:246)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:231)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:149)
	at org.jboss.as.web.security.SecurityContextAssociationValve.invoke(SecurityContextAssociationValve.java:169)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:145)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:97)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:102)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:344)
	at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:856)
	at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:653)
	at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:926)
	at java.lang.Thread.run(Thread.java:745)
Caused by: com.siperian.dsapp.datasource.report.DBReportServiceException: net.sf.jasperreports.engine.JRException: Class not found when loading object from InputStream
	at com.siperian.dsapp.datasource.report.DBReportService.generateReport(DBReportService.java:34)
	at com.siperian.dsapp.mde.domain.report.MDEReportService.generateReport(MDEReportService.java:68)
	... 22 more
Caused by: net.sf.jasperreports.engine.JRException: Class not found when loading object from InputStream
	at net.sf.jasperreports.engine.util.JRLoader.loadObject(JRLoader.java:253)
	at net.sf.jasperreports.engine.util.JRLoader.loadObject(JRLoader.java:229)
	at net.sf.jasperreports.engine.JasperFillManager.fill(JasperFillManager.java:405)
	at net.sf.jasperreports.engine.JasperFillManager.fillReport(JasperFillManager.java:824)
	at com.siperian.dsapp.datasource.report.DBReportService.getJasperPrint(DBReportService.java:40)
	at com.siperian.dsapp.datasource.report.DBReportService.generateReport(DBReportService.java:25)
	... 23 more
Caused by: java.lang.ClassNotFoundException: com.jaspersoft.jasperreports.fusion.maps.StandardMapComponent from [Module "deployment.siperian-mrm.ear.zds-gui.war:main" from Service Module Loader]
	at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:213)
	at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:459)
	at org.jboss.modules.ConcurrentClassLoader.performLoadClassChecked(ConcurrentClassLoader.java:408)
	at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:389)
	at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:134)
[2015-03-04 18:17:55,846] [http-/192.168.1.14:8080-5] [ERROR] com.siperian.dsapp.mde.jsf.server.report.ReportHandler: SIP-49000
Following is the header in my jrxml:
<jasperReport xmlns="http://jasperreports.sourceforge.net/jasperreports" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://jasperreports.sourceforge.net/jasperreports http://jasperreports.sourceforge.net/xsd/jasperreport.xsd" name="SupplierStateWiseMap" pageWidth="900" pageHeight="250" orientation="Landscape" columnWidth="900" leftMargin="0" rightMargin="0" topMargin="0" bottomMargin="0" uuid="88aacb86-d339-42d7-9586-69cebc5e763f">
Any help would be highly appreciated.
To me it appears related to missing required libraries for the JasperReports Pro components. Fusion Charts/Widgets/Maps are, together with the HTML5 charts, part of the Jaspersoft Professional package.
I'm not sure how your server is configured, but you should check that you have the related jars + the license information (even if it is a trial) + the JasperReports properties set.
Did you try to test your reports in a trial installation of JasperReports Server Pro?
Regards,
Massimo.
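To make the jar check above concrete: the class named in the ClassNotFoundException, com.jaspersoft.jasperreports.fusion.maps.StandardMapComponent, must be on the deployed web application's classpath, e.g. in the WAR's WEB-INF/lib. The directory layout and jar name pattern below are assumptions (they depend on your Studio version and install location), so treat this as a sketch rather than exact commands:

```shell
# Sketch: copy the Jaspersoft Pro Fusion component jars from the Studio
# installation into the web application before packaging/redeploying.
# STUDIO_LIB and the jar name pattern are assumptions - adjust to your install.
STUDIO_LIB=/opt/jaspersoft-studio-pro/lib
WEBAPP=./zds-gui/WEB-INF/lib

cp "$STUDIO_LIB"/jasperreports-fusion-*.jar "$WEBAPP"/
# The Pro license file must also be visible to the server JVM, as described
# in the Jaspersoft documentation for your version.
```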