PlayFramework 2.3.1 NoClassDefFoundError: org/fusesource/jansi/AnsiOutputStream - scala

I'm studying Play Framework 2.x and have installed the framework, with Activator 1.2.3, on my Fedora FC20. Now I'm facing a strange error when launching activator new that never happened before. I've tried with java-1.7.0-openjdk-1.7.0.65-2.5.1.2.fc20.i386 and also with OpenJDK 1.8.0, but the error is still here.
java.lang.NoClassDefFoundError: org/fusesource/jansi/AnsiOutputStream
at jline.console.ConsoleReader.stripAnsi(ConsoleReader.java:479)
at jline.console.ConsoleReader.setPrompt(ConsoleReader.java:398)
at jline.console.ConsoleReader.readLine(ConsoleReader.java:2172)
at jline.console.ConsoleReader.readLine(ConsoleReader.java:2126)
at sbt.JLine.sbt$JLine$$readLineDirectRaw(LineReader.scala:45)
at sbt.JLine$$anonfun$readLineDirect$2.apply(LineReader.scala:37)
at sbt.JLine$$anonfun$readLineDirect$2.apply(LineReader.scala:37)
at sbt.Signals0.withHandler(Signal.scala:87)
at sbt.Signals$.withHandler(Signal.scala:13)
at sbt.JLine.readLineDirect(LineReader.scala:37)
at sbt.JLine.readLineWithHistory(LineReader.scala:32)
at sbt.JLine.sbt$JLine$$unsynchronizedReadLine(LineReader.scala:20)
at sbt.JLine$$anonfun$readLine$1.apply(LineReader.scala:17)
at sbt.JLine$$anonfun$readLine$1.apply(LineReader.scala:17)
at sbt.JLine$$anonfun$withJLine$1.apply(LineReader.scala:118)
at sbt.JLine$$anonfun$withJLine$1.apply(LineReader.scala:116)
at sbt.JLine$.withTerminal(LineReader.scala:92)
at sbt.JLine$.withJLine(LineReader.scala:116)
at sbt.JLine.readLine(LineReader.scala:17)
at activator.ActivatorCliHelper$class.readLine(ActivatorCliHelper.scala:19)
at activator.TemplateHandler$.readLine(TemplateHandler.scala:16)
at activator.TemplateHandler$.getTemplateName(TemplateHandler.scala:81)
at activator.ActivatorCli$$anonfun$apply$1.getTemplateName$1(ActivatorCli.scala:55)
at activator.ActivatorCli$$anonfun$apply$1.apply$mcI$sp(ActivatorCli.scala:89)
at activator.ActivatorCli$$anonfun$apply$1.apply(ActivatorCli.scala:19)
at activator.ActivatorCli$$anonfun$apply$1.apply(ActivatorCli.scala:19)
at activator.ActivatorCli$.withContextClassloader(ActivatorCli.scala:179)
at activator.ActivatorCli$.apply(ActivatorCli.scala:19)
at activator.ActivatorLauncher.run(ActivatorLauncher.scala:28)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:129)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:36)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:19)
at xsbt.boot.Boot$.runImpl(Boot.scala:44)
at xsbt.boot.Boot$.main(Boot.scala:20)
at xsbt.boot.Boot.main(Boot.scala)
Caused by: java.lang.ClassNotFoundException: org.fusesource.jansi.AnsiOutputStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 38 more
Error during sbt execution: java.lang.NoClassDefFoundError: org/fusesource/jansi/AnsiOutputStream

You can try to delete and recreate your local repo (not quite sure which one applies to you):
~/.m2/repository
~/.ivy2/cache
~/.ivy/cache
Also, there was a play clean command before they switched to the Activator; there should be something like activator clean now. After that you can try activator compile.
Edit: as @sentenza pointed out, removing ~/.sbt was the step that solved the problem. I will still leave the other options above, as they might work for somebody else.
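Summing up, a minimal sequence of shell commands (a sketch; the cache paths shown are the defaults and may differ on your machine):
rm -rf ~/.sbt
rm -rf ~/.ivy2/cache   # optional: forces a re-download of all dependencies
activator clean
activator compile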

Related

java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider trying to pull from kafka topic in scala

I'm using a spark-shell instance to test pulling data from a client's Kafka source. To launch the instance I am using the command spark-shell --jars spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar (all jars are present in the working dir).
However, when I run the command val df = spark.read.format("kafka")..........., after a few seconds it crashes with the below:
java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:533)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:304)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.StreamingWriteSupportProvider
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 79 more
HOWEVER, if I change the order of the jars in the spark-shell command to spark-shell --jars kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar, spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, it instead crashes with:
java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<init>(KafkaSourceProvider.scala:376)
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<clinit>(KafkaSourceProvider.scala)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.validateBatchOptions(KafkaSourceProvider.scala:330)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.createRelation(KafkaSourceProvider.scala:113)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:309)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 55 more
I am developing behind a very strict proxy managed by our client and am unable to use --packages instead. I am at a bit of a loss here: am I unable to load all 3 dependencies at the launch of the shell? Am I missing another step somewhere?
In the Structured Streaming + Kafka Integration Guide it says:
For experimenting on spark-shell, you need to add this above library and its dependencies too when invoking spark-shell.
The library you are using seems to be customized and not publicly available in the Maven Central repository, which means I cannot look into its dependencies.
However, looking at the latest stable version, 2.4.5, the dependency according to Maven Central is kafka-clients version 2.0.0.
You are mixing libraries built for different Scala versions (2.11 and 2.12).
Use libraries built for the same Scala version; see below for how to load them into spark-shell:
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.5,org.apache.kafka:kafka_2.11:2.4.1,org.apache.kafka:kafka-clients:2.4.1
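If --packages is blocked by the proxy, the same artifacts can be downloaded manually and passed with --jars instead. Note that --jars takes a single comma-separated list with no spaces; with spaces, the shell passes the later jars as separate arguments rather than to --jars. A sketch, assuming the matching 2.11 artifacts sit in the working directory:
spark-shell --jars spark-sql-kafka-0-10_2.11-2.4.5.jar,kafka_2.11-2.4.1.jar,kafka-clients-2.4.1.jar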
One occasionally disruptive issue is dealing with dependency conflicts in cases where a user application and Spark itself both depend on the same library. This comes up relatively rarely, but when it does, it can be vexing for users. Typically, this will manifest itself when a NoSuchMethodError, a ClassNotFoundException, or some other JVM exception related to class loading is thrown during the execution of a Spark job.
There are two solutions to this problem. The first is to modify your application to depend on the same version of the third-party library that Spark does. The second is to modify the packaging of your application using a procedure that is often called "shading." The Maven build tool supports shading through advanced configuration of the plug-in shown in Example 7-5 (in fact, the shading capability is why the plugin is named maven-shade-plugin).
Shading allows you to make a second copy of the conflicting package under a different namespace and rewrites your application's code to use the renamed version. This somewhat brute-force technique is quite effective at resolving runtime dependency conflicts. For specific instructions on how to shade dependencies, see the documentation for your build tool.
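For an sbt build, the sbt-assembly plugin provides the same renaming mechanism as maven-shade-plugin. A minimal sketch (the com.google.common pattern is only an illustrative example of a conflicting package, and this assumes sbt-assembly is added in project/plugins.sbt):
// in build.sbt: copy the conflicting package under a new namespace
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "myshaded.guava.@1").inAll
)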
I would check the Scala version of the spark-shell, because it can be a Scala version issue:
scala> util.Properties.versionString
res3: String = version 2.11.8
If not, then check which Spark version you are using and which third-party library versions you depend on, because there is likely one that is too new or too old for your Spark version.
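The Spark version can be checked the same way (spark is the SparkSession pre-defined in Spark 2.x shells; the output below is just an example):
scala> spark.version
res4: String = 2.4.5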
I hope it helps.

Solving Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream

I'm using JetBrains IntelliJ IDEA with the Scala plugin and I'm trying to execute some code that uses Apache Spark. However, whenever I try to run it, the code fails to execute properly because of the exception
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
at KMeans$.main(kmeans.scala:71)
at KMeans.main(kmeans.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Running spark-shell from the terminal doesn't give me any problems, and the warning "unable to load native-hadoop library for your platform" doesn't appear.
I've read some questions similar to mine, but in those cases they had problems with spark-shell or with the cluster configuration.
I was using spark-core_2.12-2.4.3.jar without its dependencies. I solved the issue by adding the spark-core library through Maven, which automatically pulled in all the dependencies.
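For reference, the sbt equivalent of that Maven dependency is a single line (a sketch; match the version to your setup), which likewise lets the build tool resolve Hadoop and the other transitive dependencies:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"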

ClassNotFoundException: scala.Function1$mcLL$sp

Hope someone can help me :)
I'm playing around with Rowz, which I'm updating to use the newest Scala and sbt (to help me evaluate Rowz in my environment and also as a learning exercise). Now I'm getting the following error, which I'm struggling to resolve:
Starting rowz (it's kinda quiet at the moment)
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Function1$mcLL$sp
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:787)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:447)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476$$anon$2$$anon$11.<init>((inline):48)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476$$anon$2.<init>((inline):48)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476.apply((inline):38)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476.apply((inline):1)
at com.twitter.util.Eval.applyProcessed(Eval.scala:197)
at com.twitter.util.Eval.applyProcessed(Eval.scala:189)
at com.twitter.util.Eval.apply(Eval.scala:135)
at com.twitter.util.Eval.apply(Eval.scala:169)
at com.twitter.rowz.Main$.main(Main.scala:16)
at com.twitter.rowz.Main.main(Main.scala)
Caused by: java.lang.ClassNotFoundException: scala.Function1$mcLL$sp
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 22 more
Any ideas?
I'm using the latest Scala (currently 2.10.1) and sbt (currently 0.12.3).
The stacktrace mentions Twitter's util-eval project. I have the latest version. In my sbt file:
libraryDependencies += "com.twitter" %% "util-eval" % "[6.2.4,)"
And this then retrieves:
/lib_managed/jars/com.twitter/util-core_2.10/util-core_2.10-6.3.0.jar
/lib_managed/jars/com.twitter/util-eval_2.10/util-eval_2.10-6.3.0.jar
/lib_managed/jars/org.scala-lang/scala-reflect/scala-reflect-2.10.1.jar
The file in question: Eval class on Twitter's github
As the latter is subject to change, the code in question is as follows, with the error occurring on the last line:
/**
 * same as apply[T], but does not run preprocessors.
 */
def applyProcessed[T](className: String, code: String, resetState: Boolean): T = {
  val cls = compiler(wrapCodeInClass(className, code), className, resetState)
  cls.getConstructor().newInstance().asInstanceOf[() => Any].apply().asInstanceOf[T]
}
Any insights appreciated.
Grepping the scala-library jar, it seems that Function1$mcLL$sp existed in 2.8.2 and disappeared in 2.9.x.
It is an internal class representing Function1 with the apply method specialized for Long.
More importantly, this means you have some code that was compiled against 2.8.x in your dependencies.
You should go through all your dependencies and make sure they all target 2.10.x.
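Using %% in sbt helps here: it makes the resolver pick the artifact cross-built for your scalaVersion, so a mismatched binary version fails at resolution time instead of at runtime. A sketch:
scalaVersion := "2.10.1"
libraryDependencies += "com.twitter" %% "util-eval" % "6.3.0"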
Do not use specialization. It's broken. I have seen these kinds of errors repeatedly, after I had already published libraries that were successfully compiled with specialization (which is enabled by default).
Until specialization is thoroughly fixed, I recommend compiling all projects with
-no-specialization
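In an sbt build the flag can be added like this:
scalacOptions += "-no-specialization"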

Intellij 12 and internal compilation error when building scala project

I have just downloaded IntelliJ 12.01 (build #IC-123.94), and when I try to build a Scala project I get the following stacktrace:
Internal error: (java.lang.ClassNotFoundException) org.jetbrains.jps.incremental.BinaryContent
java.lang.ClassNotFoundException: org.jetbrains.jps.incremental.BinaryContent
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at org.jetbrains.jps.incremental.scala.LazyCompiledClass.<init>(ScalaBuilder.scala:239)
at org.jetbrains.jps.incremental.scala.IdeClient.generated(ScalaBuilder.scala:230)
at org.jetbrains.jps.incremental.scala.remote.ClientEventProcessor.process(ClientEventProcessor.scala:17)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$.liftedTree1$1(RemoteServer.scala:76)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$.org$jetbrains$jps$incremental$scala$remote$RemoteServer$$handle(RemoteServer.scala:74)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$$anonfun$send$1$$anonfun$apply$1$$anonfun$apply$3.apply(RemoteServer.scala:44)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$$anonfun$send$1$$anonfun$apply$1$$anonfun$apply$3.apply(RemoteServer.scala:43)
at org.jetbrains.jps.incremental.scala.package$.using(package.scala:15)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$$anonfun$send$1$$anonfun$apply$1.apply(RemoteServer.scala:43)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$$anonfun$send$1$$anonfun$apply$1.apply(RemoteServer.scala:40)
at org.jetbrains.jps.incremental.scala.package$.using(package.scala:15)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$$anonfun$send$1.apply(RemoteServer.scala:40)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer$$anonfun$send$1.apply(RemoteServer.scala:39)
at org.jetbrains.jps.incremental.scala.package$.using(package.scala:15)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer.send(RemoteServer.scala:39)
at org.jetbrains.jps.incremental.scala.remote.RemoteServer.compile(RemoteServer.scala:24)
at org.jetbrains.jps.incremental.scala.ScalaBuilder$$anonfun$5$$anonfun$apply$3$$anonfun$apply$4.apply(ScalaBuilder.scala:110)
at org.jetbrains.jps.incremental.scala.ScalaBuilder$$anonfun$5$$anonfun$apply$3$$anonfun$apply$4.apply(ScalaBuilder.scala:100)
at scala.util.Either$RightProjection.map(Either.scala:536)
at org.jetbrains.jps.incremental.scala.ScalaBuilder$$anonfun$5$$anonfun$apply$3.apply(ScalaBuilder.scala:100)
at org.jetbrains.jps.incremental.scala.ScalaBuilder$$anonfun$5$$anonfun$apply$3.apply(ScalaBuilder.scala:99)
at scala.util.Either$RightProjection.flatMap(Either.scala:523)
at org.jetbrains.jps.incremental.scala.ScalaBuilder$$anonfun$5.apply(ScalaBuilder.scala:99)
at org.jetbrains.jps.incremental.scala.ScalaBuilder$$anonfun$5.apply(ScalaBuilder.scala:98)
at scala.util.Either$RightProjection.flatMap(Either.scala:523)
at org.jetbrains.jps.incremental.scala.ScalaBuilder.doBuild(ScalaBuilder.scala:98)
at org.jetbrains.jps.incremental.scala.ScalaBuilder.build(ScalaBuilder.scala:67)
at org.jetbrains.jps.incremental.scala.ScalaBuilderService$ScalaBuilderDecorator.build(ScalaBuilderService.java:42)
at org.jetbrains.jps.incremental.IncProjectBuilder.runModuleLevelBuilders(IncProjectBuilder.java:963)
at org.jetbrains.jps.incremental.IncProjectBuilder.runBuildersForChunk(IncProjectBuilder.java:710)
at org.jetbrains.jps.incremental.IncProjectBuilder.buildTargetsChunk(IncProjectBuilder.java:740)
at org.jetbrains.jps.incremental.IncProjectBuilder.buildChunkIfAffected(IncProjectBuilder.java:673)
at org.jetbrains.jps.incremental.IncProjectBuilder.buildChunks(IncProjectBuilder.java:494)
at org.jetbrains.jps.incremental.IncProjectBuilder.runBuild(IncProjectBuilder.java:274)
at org.jetbrains.jps.incremental.IncProjectBuilder.build(IncProjectBuilder.java:164)
at org.jetbrains.jps.cmdline.BuildRunner.runBuild(BuildRunner.java:114)
at org.jetbrains.jps.cmdline.BuildSession.runBuild(BuildSession.java:205)
at org.jetbrains.jps.cmdline.BuildSession.run(BuildSession.java:102)
at org.jetbrains.jps.cmdline.BuildMain$MyMessageHandler$1.run(BuildMain.java:107)
at org.jetbrains.jps.service.impl.SharedThreadPoolImpl$1.run(SharedThreadPoolImpl.java:26)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Apparently, the problem comes from the openapi.jar inside IntelliJ's lib folder, which is supposed to contain that class.
Any pointers?
Regards.
I had the same issue with IDEA 12, but after updating the Scala plugin to version 0.7.62 the problem was solved for me.
However, you can try to follow IDEA's advice:
In case of any compilation problems you may enable the previous (internal) compiler by clearing:
Project Settings / Compiler / Use external build
Same issue for me too; it was fixed by updating to the nightly builds (0.7.82). Instructions are available here: http://confluence.jetbrains.net/display/SCA/Scala+Plugin+Nightly+Builds+for+Leda

NoClassDefFoundError => ClassPath$JavaContext when using play start

I've made a little Scala Play 2.0.2 application.
It works fine when I use the play run command, but when I use play start or play clean compile stage + target/start, trying to do a MongoDB insertion with Casbah/Salat gives me the following stack:
[info] application - Can't create user
java.lang.NoClassDefFoundError: scala/tools/nsc/util/ClassPath$JavaContext
at scala.tools.scalap.scalax.rules.scalasig.ScalaSigParser$.scalaSigFromAttribute(ScalaSig.scala:35) ~[scalap-2.9.1.jar:na]
at scala.tools.scalap.scalax.rules.scalasig.ScalaSigParser$.parse(ScalaSig.scala:38) ~[scalap-2.9.1.jar:na]
at com.novus.salat.util.ScalaSigUtil$$anonfun$parseScalaSig0$2.apply(ScalaSigUtil.scala:73) ~[salat-util_2.9.1-0.0.8-SNAPSHOT.jar:0.0.8-SNAPSHOT]
at com.novus.salat.util.ScalaSigUtil$$anonfun$parseScalaSig0$2.apply(ScalaSigUtil.scala:73) ~[salat-util_2.9.1-0.0.8-SNAPSHOT.jar:0.0.8-SNAPSHOT]
at scala.Option.map(Option.scala:133) ~[scala-library.jar:na]
at com.novus.salat.util.ScalaSigUtil$.parseScalaSig0(ScalaSigUtil.scala:73) ~[salat-util_2.9.1-0.0.8-SNAPSHOT.jar:0.0.8-SNAPSHOT]
Caused by: java.lang.ClassNotFoundException: scala.tools.nsc.util.ClassPath$JavaContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:366) ~[na:1.7.0_01]
at java.net.URLClassLoader$1.run(URLClassLoader.java:355) ~[na:1.7.0_01]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_01]
at java.net.URLClassLoader.findClass(URLClassLoader.java:354) ~[na:1.7.0_01]
at java.lang.ClassLoader.loadClass(ClassLoader.java:423) ~[na:1.7.0_01]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) ~[na:1.7.0_01]
Any idea?
It seems the Scala compiler is needed by scalap, so I added the dependency:
"org.scala-lang" % "scala-compiler" % "2.9.1"
And it works fine.
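For reference, in a Play 2.0-style project/Build.scala this goes into the generated appDependencies sequence (a sketch of the default layout):
val appDependencies = Seq(
  "org.scala-lang" % "scala-compiler" % "2.9.1"
)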
Which leads me to another question:
Why do I need the Scala compiler at runtime? (Play2/Salat with scalap dependency)
The problem can also occur if you are missing the Async support configuration in the context.xml when using Spring MVC.