JAX-WS Apache CXF Connection Error - soap

Apache CXF and JAX-WS always seem to cause issues. I'm not sure if I'm using them incorrectly!
Here is the exception that I get:
20151203-15:14:06.836+0100 [play-akka.actor.default-dispatcher-9] ERROR my.proj - Asset E911S (SOAP) -- unknown error occurred at 2015-12-03T14:14:05.778Z while reading tag getSlave
javax.xml.ws.WebServiceException: Could not send Message.
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:150) ~[cxf-rt-frontend-jaxws-3.1.4.jar:3.1.4]
at com.sun.proxy.$Proxy57.getSlave(Unknown Source) ~[na:na]
at my.proj.connectors.soap.MyServiceClient$$anonfun$getSlave$1.apply(MyServiceClient.scala:174) ~[classes/:na]
at my.proj.connectors.soap.MyServiceClient$$anonfun$getSlave$1.apply(MyServiceClient.scala:97) ~[classes/:na]
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) ~[scala-library-2.11.7.jar:na]
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) ~[scala-library-2.11.7.jar:na]
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41) [akka-actor_2.11-2.3.4.jar:na]
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393) [akka-actor_2.11-2.3.4.jar:na]
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [scala-library-2.11.7.jar:na]
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [scala-library-2.11.7.jar:na]
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [scala-library-2.11.7.jar:na]
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [scala-library-2.11.7.jar:na]
Caused by: java.net.SocketTimeoutException: SocketTimeoutException invoking http://1/c.2.3.4/gi-bin/cgi.cgi?WebService: Read timed out
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_60]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_60]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_60]
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_60]
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.mapException(HTTPConduit.java:1376) ~[cxf-rt-transports-http-3.1.4.jar:3.1.4]
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.close(HTTPConduit.java:1360) ~[cxf-rt-transports-http-3.1.4.jar:3.1.4]
at org.apache.cxf.transport.AbstractConduit.close(AbstractConduit.java:56) ~[cxf-core-3.1.4.jar:3.1.4]
at org.apache.cxf.transport.http.HTTPConduit.close(HTTPConduit.java:651) ~[cxf-rt-transports-http-3.1.4.jar:3.1.4]
at org.apache.cxf.interceptor.MessageSenderInterceptor$MessageSenderEndingInterceptor.handleMessage(MessageSenderInterceptor.java:62) ~[cxf-core-3.1.4.jar:3.1.4]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:308) ~[cxf-core-3.1.4.jar:3.1.4]
at org.apache.cxf.endpoint.ClientImpl.doInvoke(ClientImpl.java:514) ~[cxf-core-3.1.4.jar:3.1.4]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:423) ~[cxf-core-3.1.4.jar:3.1.4]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:324) ~[cxf-core-3.1.4.jar:3.1.4]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:277) ~[cxf-core-3.1.4.jar:3.1.4]
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:96) ~[cxf-rt-frontend-simple-3.1.4.jar:3.1.4]
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:139) ~[cxf-rt-frontend-jaxws-3.1.4.jar:3.1.4]
... 11 common frames omitted
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method) ~[na:1.8.0_60]
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116) ~[na:1.8.0_60]
at java.net.SocketInputStream.read(SocketInputStream.java:170) ~[na:1.8.0_60]
at java.net.SocketInputStream.read(SocketInputStream.java:141) ~[na:1.8.0_60]
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246) ~[na:1.8.0_60]
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286) ~[na:1.8.0_60]
Here is what I do to set the request context:
private def setRequestContext(proxy: Client, withCredentials: Boolean = false): Unit = {
  // set the time out!
  val httpConduit = proxy.getConduit.asInstanceOf[HTTPConduit]
  val httpClientPolicy = new HTTPClientPolicy()
  httpClientPolicy.setConnectionTimeout(mySoapClientConfig.requestTimeout.toSeconds)
  httpClientPolicy.setReceiveTimeout(mySoapClientConfig.requestTimeout.toSeconds)
  httpConduit.setClient(httpClientPolicy)

  if (withCredentials) {
    val userCredentials = new UserCredentials
    userCredentials.setPassword(mySoapClientConfig.password.orNull)
    userCredentials.setUserName(mySoapClientConfig.userName.orNull)
    val headerList = Seq(
      new Header(
        new QName(mySoapClientConfig.targetNamespace, "UserCredentials"),
        userCredentials,
        new JAXBDataBinding(classOf[UserCredentials])
      )
    )
    import scala.collection.JavaConverters._
    proxy.getRequestContext.put(Header.HEADER_LIST, headerList.asJava)
  }
}
Could anyone please help me out? I'm able to connect to the web service using SoapUI!

I figured out what the problem was. Pay close attention to the following lines:
httpClientPolicy.setConnectionTimeout(mySoapClientConfig.requestTimeout.toSeconds)
httpClientPolicy.setReceiveTimeout(mySoapClientConfig.requestTimeout.toSeconds)
The setXXXTimeout methods take a long in milliseconds, and what I had been passing in was 4 (seconds), which was interpreted as 4 milliseconds and caused the read timeouts!
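For reference, a minimal sketch of the corrected calls (assuming requestTimeout is a scala.concurrent.duration.FiniteDuration, as the toSeconds call above suggests):

import scala.concurrent.duration._
import org.apache.cxf.transports.http.configuration.HTTPClientPolicy

// HTTPClientPolicy timeouts are long values in milliseconds, so convert
// the configured duration with toMillis instead of toSeconds.
val requestTimeout: FiniteDuration = 4.seconds // the 4-second timeout mentioned above
val httpClientPolicy = new HTTPClientPolicy()
httpClientPolicy.setConnectionTimeout(requestTimeout.toMillis) // 4000 ms, not 4 ms
httpClientPolicy.setReceiveTimeout(requestTimeout.toMillis)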

Related

spark-shell not working after installing Apache Spark. Error: The system cannot find the path specified

I installed Apache Spark and have Java and Python installed as well. I set up the environment variables as per this article: https://phoenixnap.com/kb/install-spark-on-windows-10
I have also installed winutils.exe.
Initially I was getting an error like:
missing Python executable
defaulting to 'C:\spark\bin..' for SPARK_HOME environment variable.
Please install Python or specify the correct Python executable in PYSPARK_DRIVER_PYTHON or PYSPARK_PYTHON environment variable to detect SPARK_HOME safely.
To fix this I added an environment variable:
Name: PYSPARK_DRIVER_PYTHON
Value: C:\Program Files\Python310
Now, on the terminal, C:\spark\bin>spark-shell gives the following output:
The system cannot find the path specified.
The system cannot find the path specified.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/spark/jars/spark-unsafe_2.12-3.2.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/06/03 14:18:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/06/03 14:18:17 ERROR SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:909)
at org.apache.spark.executor.Executor.<init>(Executor.scala:160)
at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:581)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
at $line3.$read$$iw$$iw.<init>(<console>:15)
at $line3.$read$$iw.<init>(<console>:42)
at $line3.$read.<init>(<console>:44)
at $line3.$read$.<init>(<console>:48)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.$print$lzycompute(<console>:7)
at $line3.$eval$.$print(<console>:6)
at $line3.$eval.$print(<console>)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun$1(IMain.scala:216)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$2(SparkILoop.scala:83)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$1(SparkILoop.scala:83)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:83)
at org.apache.spark.repl.SparkILoop.$anonfun$process$4(SparkILoop.scala:165)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly$1(ILoop.scala:166)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
at org.apache.spark.repl.SparkILoop.loopPostInit$1(SparkILoop.scala:153)
at org.apache.spark.repl.SparkILoop.$anonfun$process$10(SparkILoop.scala:221)
at org.apache.spark.repl.SparkILoop.withSuppressedSettings$1(SparkILoop.scala:189)
at org.apache.spark.repl.SparkILoop.startup$1(SparkILoop.scala:201)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
at org.apache.spark.repl.Main$.doMain(Main.scala:78)
at org.apache.spark.repl.Main$.main(Main.scala:58)
at org.apache.spark.repl.Main.main(Main.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.URISyntaxException: Illegal character in path at index 32: spark://DESKTOP-E76IK3L:52769/C:\classes
at java.base/java.net.URI$Parser.fail(URI.java:2913)
at java.base/java.net.URI$Parser.checkChars(URI.java:3084)
at java.base/java.net.URI$Parser.parseHierarchical(URI.java:3166)
at java.base/java.net.URI$Parser.parse(URI.java:3114)
at java.base/java.net.URI.<init>(URI.java:600)
at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:57)
... 70 more
22/06/03 14:18:17 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:173)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:144)
at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:927)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2567)
at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2086)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1442)
at org.apache.spark.SparkContext.stop(SparkContext.scala:2086)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:677)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
at $line3.$read$$iw$$iw.<init>(<console>:15)
at $line3.$read$$iw.<init>(<console>:42)
at $line3.$read.<init>(<console>:44)
at $line3.$read$.<init>(<console>:48)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.$print$lzycompute(<console>:7)
at $line3.$eval$.$print(<console>:6)
at $line3.$eval.$print(<console>)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun$1(IMain.scala:216)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$2(SparkILoop.scala:83)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$1(SparkILoop.scala:83)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:83)
at org.apache.spark.repl.SparkILoop.$anonfun$process$4(SparkILoop.scala:165)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly$1(ILoop.scala:166)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
at org.apache.spark.repl.SparkILoop.loopPostInit$1(SparkILoop.scala:153)
at org.apache.spark.repl.SparkILoop.$anonfun$process$10(SparkILoop.scala:221)
at org.apache.spark.repl.SparkILoop.withSuppressedSettings$1(SparkILoop.scala:189)
at org.apache.spark.repl.SparkILoop.startup$1(SparkILoop.scala:201)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
at org.apache.spark.repl.Main$.doMain(Main.scala:78)
at org.apache.spark.repl.Main$.main(Main.scala:58)
at org.apache.spark.repl.Main.main(Main.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/06/03 14:18:17 WARN MetricsSystem: Stopping a MetricsSystem that is not running
22/06/03 14:18:17 ERROR Main: Failed to initialize Spark session.
java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:909)
at org.apache.spark.executor.Executor.<init>(Executor.scala:160)
at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:581)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
at $line3.$read$$iw$$iw.<init>(<console>:15)
at $line3.$read$$iw.<init>(<console>:42)
at $line3.$read.<init>(<console>:44)
at $line3.$read$.<init>(<console>:48)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.$print$lzycompute(<console>:7)
at $line3.$eval$.$print(<console>:6)
at $line3.$eval.$print(<console>)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun$1(IMain.scala:216)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$2(SparkILoop.scala:83)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$1(SparkILoop.scala:83)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:83)
at org.apache.spark.repl.SparkILoop.$anonfun$process$4(SparkILoop.scala:165)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly$1(ILoop.scala:166)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
at org.apache.spark.repl.SparkILoop.loopPostInit$1(SparkILoop.scala:153)
at org.apache.spark.repl.SparkILoop.$anonfun$process$10(SparkILoop.scala:221)
at org.apache.spark.repl.SparkILoop.withSuppressedSettings$1(SparkILoop.scala:189)
at org.apache.spark.repl.SparkILoop.startup$1(SparkILoop.scala:201)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
at org.apache.spark.repl.Main$.doMain(Main.scala:78)
at org.apache.spark.repl.Main$.main(Main.scala:58)
at org.apache.spark.repl.Main.main(Main.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.URISyntaxException: Illegal character in path at index 32: spark://DESKTOP-E76IK3L:52769/C:\classes
at java.base/java.net.URI$Parser.fail(URI.java:2913)
at java.base/java.net.URI$Parser.checkChars(URI.java:3084)
at java.base/java.net.URI$Parser.parseHierarchical(URI.java:3166)
at java.base/java.net.URI$Parser.parse(URI.java:3114)
at java.base/java.net.URI.<init>(URI.java:600)
at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:57)
... 70 more
22/06/03 14:18:17 ERROR Utils: Uncaught exception in thread shutdown-hook-0
java.lang.ExceptionInInitializerError
at org.apache.spark.executor.Executor.stop(Executor.scala:333)
at org.apache.spark.executor.Executor.$anonfun$stopHookReference$1(Executor.scala:76)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.NullPointerException
at org.apache.spark.shuffle.ShuffleBlockPusher$.<init>(ShuffleBlockPusher.scala:465)
at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala)
... 16 more
22/06/03 14:18:17 WARN ShutdownHookManager: ShutdownHook '' failed, java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:205)
at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
Caused by: java.lang.ExceptionInInitializerError
at org.apache.spark.executor.Executor.stop(Executor.scala:333)
at org.apache.spark.executor.Executor.$anonfun$stopHookReference$1(Executor.scala:76)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019)
at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.NullPointerException
at org.apache.spark.shuffle.ShuffleBlockPusher$.<init>(ShuffleBlockPusher.scala:465)
at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala)
... 16 more
How do I fix this?
I think there is an issue with your computer name; it contains characters that the Spark libs/scripts are not able to understand:
spark://DESKTOP-E76IK3L:52769/
Can you change your computer/laptop name to a plain string (no numbers or colons in it),
something like MyDesktop,
and then try running spark-shell again?
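As a quick sanity check before renaming anything, you can print the machine name that typically ends up in that spark://<host>:<port> URL; a minimal sketch to run in spark-shell or any Scala REPL:

// Prints the local hostname, e.g. DESKTOP-E76IK3L in the trace above.
println(java.net.InetAddress.getLocalHost.getHostName)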

Error using reactivemongo 0.12.1 with play 2.5.X

I tried upgrading to ReactiveMongo 0.12.1 with Play 2.5.12, but when I run the app the JVM quits on me and I get the following stack trace:
Uncaught error from thread [application-akka.actor.default-dispatcher-2] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[application]
java.lang.NoClassDefFoundError: play/api/libs/concurrent/StateMachine
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at play.api.libs.streams.impl.EnumeratorSubscriptionFactory$class.createSubscription(EnumeratorPublisher.scala:25)
at play.api.libs.streams.impl.EnumeratorPublisher.createSubscription(EnumeratorPublisher.scala:33)
at play.api.libs.streams.impl.EnumeratorPublisher.createSubscription(EnumeratorPublisher.scala:33)
at play.api.libs.streams.impl.RelaxedPublisher.subscribe(RelaxedPublisher.scala:19)
at akka.stream.impl.MaterializerSession.akka$stream$impl$MaterializerSession$$doSubscribe(StreamLayout.scala:1033)
at akka.stream.impl.MaterializerSession.assignPort(StreamLayout.scala:1025)
at akka.stream.impl.MaterializerSession$$anonfun$exitScope$2.apply(StreamLayout.scala:907)
at akka.stream.impl.MaterializerSession$$anonfun$exitScope$2.apply(StreamLayout.scala:906)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at akka.stream.impl.MaterializerSession.exitScope(StreamLayout.scala:906)
at akka.stream.impl.MaterializerSession$$anonfun$materializeModule$1.apply(StreamLayout.scala:958)
at akka.stream.impl.MaterializerSession$$anonfun$materializeModule$1.apply(StreamLayout.scala:950)
at scala.collection.immutable.Set$Set3.foreach(Set.scala:163)
at akka.stream.impl.MaterializerSession.materializeModule(StreamLayout.scala:950)
at akka.stream.impl.MaterializerSession.materialize(StreamLayout.scala:917)
at akka.stream.impl.ActorMaterializerImpl.materialize(ActorMaterializerImpl.scala:256)
at akka.stream.impl.ActorMaterializerImpl.materialize(ActorMaterializerImpl.scala:146)
at akka.stream.scaladsl.RunnableGraph.run(Flow.scala:350)
at akka.stream.scaladsl.Source.runWith(Source.scala:81)
at play.core.server.netty.NettyModelConversion.play$core$server$netty$NettyModelConversion$$createChunkedResponse(NettyModelConversion.scala:256)
at play.core.server.netty.NettyModelConversion$$anonfun$convertResult$1.apply(NettyModelConversion.scala:189)
at play.core.server.netty.NettyModelConversion$$anonfun$convertResult$1.apply(NettyModelConversion.scala:166)
at play.core.server.common.ServerResultUtils$.resultConversionWithErrorHandling(ServerResultUtils.scala:127)
at play.core.server.netty.NettyModelConversion.convertResult(NettyModelConversion.scala:235)
at play.core.server.netty.PlayRequestHandler$$anonfun$play$core$server$netty$PlayRequestHandler$$handleAction$2$$anonfun$apply$3.apply(PlayRequestHandler.scala:273)
at play.core.server.netty.PlayRequestHandler$$anonfun$play$core$server$netty$PlayRequestHandler$$handleAction$2$$anonfun$apply$3.apply(PlayRequestHandler.scala:267)
at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:253)
at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:251)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at play.api.libs.iteratee.Execution$trampoline$.executeScheduled(Execution.scala:110)
at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:70)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at scala.concurrent.Promise$class.complete(Promise.scala:55)
at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
at scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:436)
at scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:435)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.ClassNotFoundException: play.api.libs.concurrent.StateMachine
Any help will be appreciated!
After examining the dependency graph (https://github.com/jrudolph/sbt-dependency-graph), what ended up working for me was to exclude play-iteratees from ReactiveMongo.
In my build.sbt, I changed the dependency to:
"org.reactivemongo" %% "play2-reactivemongo" % reactiveMongoVersion excludeAll( ExclusionRule("com.typesafe.play", "play-iteratees_2.11")

Flink: WordCount example throws ClassNotFoundException

I'm currently trying to embed Flink in a Play Framework project, but when I try to execute the WordCount example from the Flink documentation I get this error:
play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[JobExecutionException: Cannot initialize task 'DataSink (collect())': Deserializing the OutputFormat (org.apache.flink.api.java.Utils$CollectHelper#1f158fd6) failed: Could not read the user code wrapper: io.fabrick.aperture.doctor.stream.WordCount$$anon$2$$anon$1]]
at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:265) ~[play_2.11-2.4.2.jar:2.4.2]
at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:191) ~[play_2.11-2.4.2.jar:2.4.2]
at play.api.GlobalSettings$class.onError(GlobalSettings.scala:179) [play_2.11-2.4.2.jar:2.4.2]
at play.api.DefaultGlobal$.onError(GlobalSettings.scala:212) [play_2.11-2.4.2.jar:2.4.2]
at play.api.http.GlobalSettingsHttpErrorHandler.onServerError(HttpErrorHandler.scala:94) [play_2.11-2.4.2.jar:2.4.2]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$9$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:158) [play-netty-server_2.11-2.4.2.jar:2.4.2]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$9$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:155) [play-netty-server_2.11-2.4.2.jar:2.4.2]
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) [scala-library-2.11.7.jar:na]
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:216) [scala-library-2.11.7.jar:na]
at scala.util.Try$.apply(Try.scala:192) [scala-library-2.11.7.jar:na]
Caused by: org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 'DataSink (collect())': Deserializing the OutputFormat (org.apache.flink.api.java.Utils$CollectHelper#1f158fd6) failed: Could not read the user code wrapper: io.fabrick.aperture.doctor.stream.WordCount$$anon$2$$anon$1
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$7.apply(JobManager.scala:1012) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$7.apply(JobManager.scala:996) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at scala.collection.Iterator$class.foreach(Iterator.scala:742) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1194) ~[scala-library-2.11.7.jar:na]
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library-2.11.7.jar:na]
at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:996) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:380) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) [scala-library-2.11.7.jar:na]
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
Caused by: java.lang.Exception: Deserializing the OutputFormat (org.apache.flink.api.java.Utils$CollectHelper#1f158fd6) failed: Could not read the user code wrapper: io.fabrick.aperture.doctor.stream.WordCount$$anon$2$$anon$1
at org.apache.flink.runtime.jobgraph.OutputFormatVertex.initializeOnMaster(OutputFormatVertex.java:63) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$7.apply(JobManager.scala:1008) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$7.apply(JobManager.scala:996) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at scala.collection.Iterator$class.foreach(Iterator.scala:742) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1194) ~[scala-library-2.11.7.jar:na]
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library-2.11.7.jar:na]
at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:996) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:380) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) [scala-library-2.11.7.jar:na]
Caused by: org.apache.flink.runtime.operators.util.CorruptConfigurationException: Could not read the user code wrapper: io.fabrick.aperture.doctor.stream.WordCount$$anon$2$$anon$1
at org.apache.flink.runtime.operators.util.TaskConfig.getStubWrapper(TaskConfig.java:284) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobgraph.OutputFormatVertex.initializeOnMaster(OutputFormatVertex.java:60) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$7.apply(JobManager.scala:1008) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$7.apply(JobManager.scala:996) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at scala.collection.Iterator$class.foreach(Iterator.scala:742) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1194) ~[scala-library-2.11.7.jar:na]
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library-2.11.7.jar:na]
at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:996) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:380) ~[flink-runtime_2.11-1.0.0.jar:1.0.0]
Caused by: java.lang.ClassNotFoundException: io.fabrick.aperture.doctor.stream.WordCount$$anon$2$$anon$1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_60]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_60]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_60]
at java.lang.Class.forName0(Native Method) ~[na:1.8.0_60]
at java.lang.Class.forName(Class.java:348) ~[na:1.8.0_60]
at org.apache.flink.util.InstantiationUtil$ClassLoaderObjectInputStream.resolveClass(InstantiationUtil.java:64) ~[flink-core-1.0.0.jar:1.0.0]
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613) ~[na:1.8.0_60]
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518) ~[na:1.8.0_60]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774) ~[na:1.8.0_60]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_60]
I launch my project directly from IntelliJ (I don't know if that could be the root cause).
Here is the code being executed:
object WordCount {
  def test(): Unit = {
    // set up the execution environment
    val test = ExecutionEnvironment.getExecutionEnvironment
    // get input data
    val text = test.fromElements(
      "To be, or not to be,--that is the question:--",
      "Whether 'tis nobler in the mind to suffer",
      "The slings and arrows of outrageous fortune",
      "Or to take arms against a sea of troubles,")
    val counts = text.flatMap { _.toLowerCase.split("\\W+") }
      .map { (_, 1) }
      .groupBy(0)
      .sum(1)
    // emit result
    counts.print()
    // execute program
    test.execute("WordCount Example")
  }
}
Any help is welcome, thank you all!
Edit: Just to add the dependency list:
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala" % "1.0.0",
  "org.apache.flink" %% "flink-clients" % "1.0.0",
  "org.apache.flink" %% "flink-streaming-scala" % "1.0.0"
)
Make sure you have deployed your jar to all Flink instances (jobmanager + all taskmanager nodes).
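If the job is submitted to a separate Flink cluster rather than run in-process, one way to ship the application jar explicitly is a remote execution environment; a minimal sketch where the host, port, and jar path are placeholders, not values from the question:

import org.apache.flink.api.scala._

object WordCountRemote {
  def main(args: Array[String]): Unit = {
    // createRemoteEnvironment ships the listed jar(s) to the cluster, so the
    // anonymous classes generated from this program can be deserialized there.
    // "jobmanager-host", 6123 and the jar path below are placeholders.
    val env = ExecutionEnvironment.createRemoteEnvironment(
      "jobmanager-host", 6123, "target/scala-2.11/my-play-flink-job.jar")

    val text = env.fromElements(
      "To be, or not to be,--that is the question:--",
      "Whether 'tis nobler in the mind to suffer")

    val counts = text.flatMap { _.toLowerCase.split("\\W+") }
      .map { (_, 1) }
      .groupBy(0)
      .sum(1)

    counts.print()
  }
}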

installation of kie-drools-wb-distribution-wars-6.3.0.Final-wildfly8

I am trying to deploy "kie-drools-wb-distribution-wars-6.3.0.Final-wildfly8" on WildFly 9.1 in standalone mode. While starting (enabling) it, I am getting the following stack trace. Please find attached a screenshot of the error and the server log.
ERROR [org.uberfire.java.nio.file.api.FileSystemProviders] (MSC service thread 1-4) Can't initialize FileSystemProviders: java.util.ServiceConfigurationError: org.uberfire.java.nio.file.spi.FileSystemProvider: Provider org.uberfire.java.nio.fs.jgit.JGitFileSystemProvider could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.uberfire.java.nio.file.api.FileSystemProviders.buildProviders(FileSystemProviders.java:65)
at org.uberfire.java.nio.file.api.FileSystemProviders.setup(FileSystemProviders.java:48)
at org.uberfire.java.nio.file.api.FileSystemProviders.resolveProvider(FileSystemProviders.java:104)
at org.uberfire.java.nio.file.FileSystems.newFileSystem(FileSystems.java:117)
at org.uberfire.java.nio.file.FileSystems.newFileSystem(FileSystems.java:83)
at org.uberfire.io.impl.AbstractIOService.newFileSystem(AbstractIOService.java:241)
at org.uberfire.backend.server.cdi.SystemConfigProducer$2.create(SystemConfigProducer.java:252)
at org.uberfire.backend.server.cdi.SystemConfigProducer$2.create(SystemConfigProducer.java:187)
at org.jboss.weld.context.AbstractContext.get(AbstractContext.java:96)
at org.jboss.weld.bean.ContextualInstanceStrategy$DefaultContextualInstanceStrategy.get(ContextualInstanceStrategy.java:101)
at org.jboss.weld.bean.ContextualInstance.get(ContextualInstance.java:50)
at org.jboss.weld.bean.proxy.ContextBeanInstance.getInstance(ContextBeanInstance.java:99)
at org.jboss.weld.bean.proxy.ProxyMethodHandler.invoke(ProxyMethodHandler.java:99)
at org.jboss.weld.proxies.FileSystem$1366014920$Proxy$_$$_WeldClientProxy.getPath(Unknown Source)
at org.kie.uberfire.social.activities.server.SocialUserServicesExtendedBackEndImpl.buildPath(SocialUserServicesExtendedBackEndImpl.java:33)
at org.kie.uberfire.social.activities.server.SocialUserServicesExtendedBackEndImpl$Proxy$_$$_WeldClientProxy.buildPath(Unknown Source)
at org.kie.uberfire.social.activities.persistence.SocialUserCachePersistence.<init>(SocialUserCachePersistence.java:42)
at org.kie.uberfire.social.activities.persistence.SocialUserInstancePersistence.<init>(SocialUserInstancePersistence.java:16)
at org.kie.uberfire.social.activities.server.SocialUserPersistenceProducer.setup(SocialUserPersistenceProducer.java:76)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.invokeMethods(DefaultLifecycleCallbackInvoker.java:98)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.postConstruct(DefaultLifecycleCallbackInvoker.java:81)
at org.jboss.weld.injection.producer.BasicInjectionTarget.postConstruct(BasicInjectionTarget.java:126)
at org.jboss.weld.bean.ManagedBean.create(ManagedBean.java:171)
at org.jboss.weld.context.AbstractContext.get(AbstractContext.java:96)
at org.jboss.weld.bean.ContextualInstanceStrategy$DefaultContextualInstanceStrategy.get(ContextualInstanceStrategy.java:101)
at org.jboss.weld.bean.ContextualInstanceStrategy$ApplicationScopedContextualInstanceStrategy.get(ContextualInstanceStrategy.java:141)
at org.jboss.weld.bean.ContextualInstance.get(ContextualInstance.java:50)
at org.jboss.weld.bean.proxy.ContextBeanInstance.getInstance(ContextBeanInstance.java:99)
at org.jboss.weld.bean.proxy.ProxyMethodHandler.getInstance(ProxyMethodHandler.java:125)
at org.kie.uberfire.social.activities.server.SocialUserPersistenceProducer$Proxy$_$$_WeldClientProxy.toString(Unknown Source)
at org.uberfire.backend.server.cdi.SystemConfigProducer.runPostConstruct(SystemConfigProducer.java:143)
at org.uberfire.backend.server.cdi.SystemConfigProducer.afterDeploymentValidation(SystemConfigProducer.java:126)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.jboss.weld.injection.StaticMethodInjectionPoint.invoke(StaticMethodInjectionPoint.java:88)
at org.jboss.weld.injection.MethodInvocationStrategy$SpecialParamPlusBeanManagerStrategy.invoke(MethodInvocationStrategy.java:144)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:306)
at org.jboss.weld.event.ExtensionObserverMethodImpl.sendEvent(ExtensionObserverMethodImpl.java:121)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:284)
at org.jboss.weld.event.ObserverMethodImpl.notify(ObserverMethodImpl.java:262)
at org.jboss.weld.event.ObserverNotifier.notifySyncObservers(ObserverNotifier.java:271)
at org.jboss.weld.event.ObserverNotifier.notify(ObserverNotifier.java:260)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:154)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:148)
at org.jboss.weld.bootstrap.events.AbstractContainerEvent.fire(AbstractContainerEvent.java:54)
at org.jboss.weld.bootstrap.events.AbstractDeploymentContainerEvent.fire(AbstractDeploymentContainerEvent.java:35)
at org.jboss.weld.bootstrap.events.AfterDeploymentValidationImpl.fire(AfterDeploymentValidationImpl.java:28)
at org.jboss.weld.bootstrap.WeldStartup.validateBeans(WeldStartup.java:447)
at org.jboss.weld.bootstrap.WeldBootstrap.validateBeans(WeldBootstrap.java:90)
at org.jboss.as.weld.WeldStartService.start(WeldStartService.java:94)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.startService(ServiceControllerImpl.java:1948)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.run(ServiceControllerImpl.java:1881)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.uberfire.java.nio.IOException: java.io.IOException: Failed to open server socket for /127.0.0.1:9418
at org.uberfire.java.nio.fs.jgit.JGitFileSystemProvider.buildAndStartDaemon(JGitFileSystemProvider.java:508)
at org.uberfire.java.nio.fs.jgit.JGitFileSystemProvider.<init>(JGitFileSystemProvider.java:374)
at org.uberfire.java.nio.fs.jgit.JGitFileSystemProvider.<init>(JGitFileSystemProvider.java:343)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at java.lang.Class.newInstance(Class.java:433)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 63 more
Caused by: java.io.IOException: Failed to open server socket for /127.0.0.1:9418
at org.uberfire.java.nio.fs.jgit.daemon.git.Daemon.start(Daemon.java:197)
at org.uberfire.java.nio.fs.jgit.JGitFileSystemProvider.buildAndStartDaemon(JGitFileSystemProvider.java:506)
... 71 more
Caused by: java.net.BindException: Address already in use: JVM_Bind
at java.net.TwoStacksPlainSocketImpl.socketBind(Native Method)
at java.net.TwoStacksPlainSocketImpl.socketBind(TwoStacksPlainSocketImpl.java:137)
at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:382)
at java.net.TwoStacksPlainSocketImpl.bind(TwoStacksPlainSocketImpl.java:110)
at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:190)
at java.net.ServerSocket.bind(ServerSocket.java:375)
at java.net.ServerSocket.<init>(ServerSocket.java:237)
at org.uberfire.java.nio.fs.jgit.daemon.git.Daemon.start(Daemon.java:195)
... 72 more
2015-10-10 12:22:55,911 ERROR [org.jboss.as.server] (XNIO-1 task-7) WFLYSRV0021: Deploy of deployment "kie-drools-wb-distribution-wars-6.3.0.Final-wildfly8.war" was rolled back with the following failure message:
{"WFLYCTL0080: Failed services" => {"jboss.deployment.unit.\"kie-drools-wb-distribution-wars-6.3.0.Final-wildfly8.war\".WeldStartService" => "org.jboss.msc.service.StartException in service jboss.deployment.unit.\"kie-drools-wb-distribution-wars-6.3.0.Final-wildfly8.war\".WeldStartService: Failed to start service
Caused by: org.jboss.weld.exceptions.DeploymentException: Exception List with 1 exceptions:
Exception 0 :
org.jboss.weld.exceptions.WeldException: WELD-000049: Unable to invoke public void org.kie.uberfire.social.activities.server.SocialUserPersistenceProducer.setup() on org.kie.uberfire.social.activities.server.SocialUserPersistenceProducer#432ba935
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.invokeMethods(DefaultLifecycleCallbackInvoker.java:100)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.postConstruct(DefaultLifecycleCallbackInvoker.java:81)
at org.jboss.weld.injection.producer.BasicInjectionTarget.postConstruct(BasicInjectionTarget.java:126)
at org.jboss.weld.bean.ManagedBean.create(ManagedBean.java:171)
at org.jboss.weld.context.AbstractContext.get(AbstractContext.java:96)
at org.jboss.weld.bean.ContextualInstanceStrategy$DefaultContextualInstanceStrategy.get(ContextualInstanceStrategy.java:101)
at org.jboss.weld.bean.ContextualInstanceStrategy$ApplicationScopedContextualInstanceStrategy.get(ContextualInstanceStrategy.java:141)
at org.jboss.weld.bean.ContextualInstance.get(ContextualInstance.java:50)
at org.jboss.weld.bean.proxy.ContextBeanInstance.getInstance(ContextBeanInstance.java:99)
at org.jboss.weld.bean.proxy.ProxyMethodHandler.getInstance(ProxyMethodHandler.java:125)
at org.kie.uberfire.social.activities.server.SocialUserPersistenceProducer$Proxy$_$$_WeldClientProxy.toString(Unknown Source)
at org.uberfire.backend.server.cdi.SystemConfigProducer.runPostConstruct(SystemConfigProducer.java:143)
at org.uberfire.backend.server.cdi.SystemConfigProducer.afterDeploymentValidation(SystemConfigProducer.java:126)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.jboss.weld.injection.StaticMethodInjectionPoint.invoke(StaticMethodInjectionPoint.java:88)
at org.jboss.weld.injection.MethodInvocationStrategy$SpecialParamPlusBeanManagerStrategy.invoke(MethodInvocationStrategy.java:144)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:306)
at org.jboss.weld.event.ExtensionObserverMethodImpl.sendEvent(ExtensionObserverMethodImpl.java:121)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:284)
at org.jboss.weld.event.ObserverMethodImpl.notify(ObserverMethodImpl.java:262)
at org.jboss.weld.event.ObserverNotifier.notifySyncObservers(ObserverNotifier.java:271)
at org.jboss.weld.event.ObserverNotifier.notify(ObserverNotifier.java:260)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:154)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:148)
at org.jboss.weld.bootstrap.events.AbstractContainerEvent.fire(AbstractContainerEvent.java:54)
at org.jboss.weld.bootstrap.events.AbstractDeploymentContainerEvent.fire(AbstractDeploymentContainerEvent.java:35)
at org.jboss.weld.bootstrap.events.AfterDeploymentValidationImpl.fire(AfterDeploymentValidationImpl.java:28)
at org.jboss.weld.bootstrap.WeldStartup.validateBeans(WeldStartup.java:447)
at org.jboss.weld.bootstrap.WeldBootstrap.validateBeans(WeldBootstrap.java:90)
at org.jboss.as.weld.WeldStartService.start(WeldStartService.java:94)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.startService(ServiceControllerImpl.java:1948)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.run(ServiceControllerImpl.java:1881)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.invokeMethods(DefaultLifecycleCallbackInvoker.java:98)
... 37 more
Caused by: java.lang.NullPointerException
at org.uberfire.java.nio.file.api.FileSystemProviders.getProvider(FileSystemProviders.java:114)
at org.uberfire.java.nio.file.api.FileSystemProviders.resolveProvider(FileSystemProviders.java:107)
at org.uberfire.java.nio.file.FileSystems.newFileSystem(FileSystems.java:117)
at org.uberfire.java.nio.file.FileSystems.newFileSystem(FileSystems.java:83)
at org.uberfire.io.impl.AbstractIOService.newFileSystem(AbstractIOService.java:241)
at org.uberfire.backend.server.cdi.SystemConfigProducer$2.create(SystemConfigProducer.java:252)
at org.uberfire.backend.server.cdi.SystemConfigProducer$2.create(SystemConfigProducer.java:187)
at org.jboss.weld.context.AbstractContext.get(AbstractContext.java:96)
at org.jboss.weld.bean.ContextualInstanceStrategy$DefaultContextualInstanceStrategy.get(ContextualInstanceStrategy.java:101)
at org.jboss.weld.bean.ContextualInstance.get(ContextualInstance.java:50)
at org.jboss.weld.bean.proxy.ContextBeanInstance.getInstance(ContextBeanInstance.java:99)
at org.jboss.weld.bean.proxy.ProxyMethodHandler.invoke(ProxyMethodHandler.java:99)
at org.jboss.weld.proxies.FileSystem$1366014920$Proxy$_$$_WeldClientProxy.getPath(Unknown Source)
at org.kie.uberfire.social.activities.server.SocialUserServicesExtendedBackEndImpl.buildPath(SocialUserServicesExtendedBackEndImpl.java:33)
at org.kie.uberfire.social.activities.server.SocialUserServicesExtendedBackEndImpl$Proxy$_$$_WeldClientProxy.buildPath(Unknown Source)
at org.kie.uberfire.social.activities.persistence.SocialUserCachePersistence.<init>(SocialUserCachePersistence.java:42)
at org.kie.uberfire.social.activities.persistence.SocialUserInstancePersistence.<init>(SocialUserInstancePersistence.java:16)
at org.kie.uberfire.social.activities.server.SocialUserPersistenceProducer.setup(SocialUserPersistenceProducer.java:76)
... 42 more

EclipseLink throws ArrayIndexOutOfBoundsException when trying to weave a class

I've got a Play Framework application (2.3.8) to which I provide the eclipselink-2.5.1.jar agent. During startup I see the following in the logs:
Weaver encountered an exception while trying to weave class
[one of my JPA entity classes]. The exception was:
java.lang.ArrayIndexOutOfBoundsException: 30053
How can I investigate what's causing the problem? What might the problem be?
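One way to surface more detail is to raise EclipseLink's logging level. A minimal standalone sketch, assuming the standard eclipselink.logging.level* properties; the persistence unit name below is a placeholder, and in a Play-managed setup the same properties would normally go into conf/META-INF/persistence.xml:

import javax.persistence.Persistence

object WeaverLogging {
  def main(args: Array[String]): Unit = {
    // "defaultPersistenceUnit" is a placeholder -- use the unit name from your own persistence.xml
    val props = new java.util.HashMap[String, String]()
    props.put("eclipselink.logging.level", "FINEST")        // raise all EclipseLink logging
    props.put("eclipselink.logging.level.weaver", "FINEST") // or target just the weaver category

    val emf = Persistence.createEntityManagerFactory("defaultPersistenceUnit", props)
    emf.close()
  }
}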
UPDATE:
Changing the logging level to FINEST gave me a stack trace:
[EL Finest]: weaver: 2015-07-16 20:52:31.163--ServerSession(1547425104)--Thread(Thread[application-akka.actor.default-dispatcher-2,5,main])--java.lang.ArrayIndexOutOfBoundsException: 25970
at org.eclipse.persistence.internal.libraries.asm.ClassReader.readClass(Unknown Source)
at org.eclipse.persistence.internal.libraries.asm.ClassReader.getInterfaces(Unknown Source)
at org.eclipse.persistence.internal.jpa.weaving.ComputeClassWriter.typeImplements(ComputeClassWriter.java:143)
at org.eclipse.persistence.internal.jpa.weaving.ComputeClassWriter.typeImplements(ComputeClassWriter.java:150)
at org.eclipse.persistence.internal.jpa.weaving.ComputeClassWriter.getCommonSuperClass(ComputeClassWriter.java:62)
at org.eclipse.persistence.internal.libraries.asm.ClassWriter.getMergedType(Unknown Source)
at org.eclipse.persistence.internal.libraries.asm.Frame.merge(Unknown Source)
at org.eclipse.persistence.internal.libraries.asm.Frame.merge(Unknown Source)
at org.eclipse.persistence.internal.libraries.asm.MethodWriter.visitMaxs(Unknown Source)
at org.eclipse.persistence.internal.libraries.asm.MethodAdapter.visitMaxs(Unknown Source)
at org.eclipse.persistence.internal.jpa.weaving.MethodWeaver.visitMaxs(MethodWeaver.java:152)
at org.eclipse.persistence.internal.libraries.asm.ClassReader.accept(Unknown Source)
at org.eclipse.persistence.internal.libraries.asm.ClassReader.accept(Unknown Source)
at org.eclipse.persistence.internal.jpa.weaving.PersistenceWeaver.transform(PersistenceWeaver.java:93)
at org.eclipse.persistence.internal.jpa.deployment.JavaSECMPInitializer$1.transform(JavaSECMPInitializer.java:228)
at sun.instrument.TransformerManager.transform(TransformerManager.java:188)
at sun.instrument.InstrumentationImpl.transform(InstrumentationImpl.java:428)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.eclipse.persistence.internal.security.PrivilegedAccessHelper.getClassForName(PrivilegedAccessHelper.java:124)
at org.eclipse.persistence.internal.jpa.metadata.listeners.EntityListenerMetadata.getClass(EntityListenerMetadata.java:224)
at org.eclipse.persistence.internal.jpa.metadata.listeners.EntityClassListenerMetadata.process(EntityClassListenerMetadata.java:81)
at org.eclipse.persistence.internal.jpa.metadata.accessors.classes.EntityAccessor.processListeners(EntityAccessor.java:1220)
at org.eclipse.persistence.internal.jpa.metadata.MetadataProcessor.addEntityListeners(MetadataProcessor.java:138)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:591)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.getAbstractSession(EntityManagerFactoryDelegate.java:204)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.createEntityManagerImpl(EntityManagerFactoryDelegate.java:304)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManagerImpl(EntityManagerFactoryImpl.java:336)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManager(EntityManagerFactoryImpl.java:302)
at play.db.jpa.DefaultJPAApi.em(DefaultJPAApi.java:71)
at play.db.jpa.DefaultJPAApi.withTransaction(DefaultJPAApi.java:123)
at play.db.jpa.JPA.withTransaction(JPA.java:159)
at play.db.jpa.TransactionalAction.call(TransactionalAction.java:16)
at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94)
at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:40)
at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:70)
at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:32)
at scala.concurrent.impl.Future$.apply(Future.scala:31)
at scala.concurrent.Future$.apply(Future.scala:485)
at play.core.j.JavaAction.apply(JavaAction.scala:94)
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105)
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:104)
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:103)
at scala.Option.map(Option.scala:145)
at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:103)
at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:96)
at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524)
at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524)
at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560)
at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560)
at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536)
at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
But it isn't much help to me.
This looks like a bug in the ASM processor that EclipseLink uses for weaving when it encounters classes compiled with lambda expressions. Bug https://bugs.eclipse.org/bugs/show_bug.cgi?id=429992 is fixed in EclipseLink 2.6, so upgrading the agent to 2.6 or later should resolve it.
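If you do upgrade, here is a minimal build.sbt sketch (sbt 0.13-style keys) for pulling in the fixed release and passing it as an agent to a forked JVM. The 2.6.0 version pin, the lib/ path, and the forked-run setup are assumptions, not details from the original post:

// build.sbt
libraryDependencies += "org.eclipse.persistence" % "eclipselink" % "2.6.0"

// javaOptions only takes effect when the application runs in a forked JVM
fork in run := true
javaOptions in run += "-javaagent:lib/eclipselink-2.6.0.jar"

Note that Play's own run task does not fork, so in dev mode the agent typically has to be passed to the JVM that runs sbt itself (for example via the sbt launcher's -J-javaagent:... option).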