ClassNotFoundException: scala.Function1$mcLL$sp - scala

Hope someone can help me :)
I'm playing around with Rowz, which I'm busy changing to use the newest Scala and sbt (to help me evaluate Rowz in my environment, and also just as a learning exercise). Now I'm getting the following error, which I'm struggling to resolve:
Starting rowz (it's kinda quiet at the moment)
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Function1$mcLL$sp
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:787)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:447)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476$$anon$2$$anon$11.<init>((inline):48)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476$$anon$2.<init>((inline):48)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476.apply((inline):38)
at Evaluator__92455c9cf893b1375b64dc2cae2905dd9718fe77_409351476.apply((inline):1)
at com.twitter.util.Eval.applyProcessed(Eval.scala:197)
at com.twitter.util.Eval.applyProcessed(Eval.scala:189)
at com.twitter.util.Eval.apply(Eval.scala:135)
at com.twitter.util.Eval.apply(Eval.scala:169)
at com.twitter.rowz.Main$.main(Main.scala:16)
at com.twitter.rowz.Main.main(Main.scala)
Caused by: java.lang.ClassNotFoundException: scala.Function1$mcLL$sp
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 22 more
Any ideas?
I'm using the latest Scala (currently 2.10.1) and sbt (currently 0.12.3).
The stack trace mentions Twitter's util-eval project. I have the latest version. In my sbt build file:
libraryDependencies += "com.twitter" %% "util-eval" % "[6.2.4,)"
And this then retrieves:
/lib_managed/jars/com.twitter/util-core_2.10/util-core_2.10-6.3.0.jar
/lib_managed/jars/com.twitter/util-eval_2.10/util-eval_2.10-6.3.0.jar
/lib_managed/jars/org.scala-lang/scala-reflect/scala-reflect-2.10.1.jar
The file in question: Eval class on Twitter's github
As the latter is subject to change, the code in question is as follows, with the error occurring on the last line:
/**
 * same as apply[T], but does not run preprocessors.
 */
def applyProcessed[T](className: String, code: String, resetState: Boolean): T = {
  val cls = compiler(wrapCodeInClass(className, code), className, resetState)
  cls.getConstructor().newInstance().asInstanceOf[() => Any].apply().asInstanceOf[T]
}
Any insights appreciated.

Grepping the scala-library jar, it seems that Function1$mcLL$sp existed in 2.8.2 and disappeared in 2.9.x.
It is an internal class representing a Function1 whose apply method is specialized for Long (for both the argument and the result).
More importantly, this means you have some code that was compiled against 2.8.x in your dependencies.
You should go through all your dependencies and make sure they all target 2.10.x.
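To track down the offender, you can scan every managed jar for bytecode references to the missing class. A rough sketch (assuming the lib_managed layout from your output above):

for j in $(find lib_managed -name '*.jar'); do
  # extract each class file to stdout and look for the 2.8-only class name
  unzip -p "$j" '*.class' 2>/dev/null | strings | grep -q 'Function1$mcLL$sp' && echo "$j"
done

Any jar this prints was compiled against Scala 2.8.x and needs a 2.10 build.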

Do not use specialization. It's broken. I have seen this kind of error repeatedly, after I had already published libraries that compiled successfully with specialization (which is enabled by default).
Until specialization is thoroughly fixed, I recommend compiling all projects with
-no-specialization
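In an sbt build, that means adding the flag to your build definition (a minimal sketch):

// build.sbt: pass -no-specialization to scalac for every compile
scalacOptions += "-no-specialization"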

Related

java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider trying to pull from kafka topic in scala

I'm using a spark-shell instance to test pulling data from a client's Kafka source. To launch the instance I am using the command spark-shell --jars spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar (all jars are present in the working dir).
However, when I run the command val df = spark.read.format("kafka")..........., after a few seconds it crashes with the following:
java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:533)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:304)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.StreamingWriteSupportProvider
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 79 more
HOWEVER - if I change the order of the jars in the spark-shell command to spark-shell --jars kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar, spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, it instead crashes with:
java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<init>(KafkaSourceProvider.scala:376)
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<clinit>(KafkaSourceProvider.scala)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.validateBatchOptions(KafkaSourceProvider.scala:330)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.createRelation(KafkaSourceProvider.scala:113)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:309)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 55 more
I am developing behind a very strict proxy managed by our client and am unable to use --packages instead, so I am at a bit of a loss here: am I unable to load all three dependencies at the launch of the shell? Am I missing another step somewhere?
In the Structured Streaming + Kafka Integration Guide it says:
For experimenting on spark-shell, you need to add this above library and its dependencies too when invoking spark-shell.
The library you are using seems to be customized and not publicly available in the Maven Central repository, which means I cannot look into its dependencies.
However, looking at the latest stable version, 2.4.5, the dependency according to Maven Central is kafka-clients version 2.0.0.
You are trying to mix Scala versions: some of your libraries target 2.11 and others 2.12.
Please use the same Scala version for all libraries; see below for how to load them into spark-shell.
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.5,org.apache.kafka:kafka_2.11:2.4.1,org.apache.kafka:kafka-clients:2.4.1
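Since --packages is blocked by your proxy, note that the --jars form must be a single comma-separated list with no spaces, and every artifact must target the same Scala binary version. A sketch with hypothetical 2.11 jar file names (download them into the working dir first):

spark-shell --jars spark-sql-kafka-0-10_2.11-2.4.5.jar,kafka_2.11-2.4.1.jar,kafka-clients-2.4.1.jar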
One occasionally disruptive issue is dealing with dependency conflicts in cases where a user application and Spark itself both depend on the same library. This comes up relatively rarely, but when it does, it can be vexing for users. Typically, this will manifest itself when a NoSuchMethodError, a ClassNotFoundException, or some other JVM exception related to class loading is thrown during the execution of a Spark job. There are two solutions to this problem. The first is to modify your application to depend on the same version of the third-party library that Spark does. The second is to modify the packaging of your application using a procedure that is often called “shading.” The Maven build tool supports shading through advanced configuration of the plug-in shown in Example 7-5 (in fact, the shading capability is why the plugin is named maven-shade-plugin). Shading allows you to make a second copy of the conflicting package under a different namespace and rewrites your application’s code to use the renamed version. This somewhat brute-force technique is quite effective at resolving runtime dependency conflicts. For specific instructions on how to shade dependencies, see the documentation for your build tool.
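The paragraph above describes Maven's shade plugin; for an sbt build the equivalent is sbt-assembly's shading rules. A minimal sketch, assuming the sbt-assembly plugin is installed:

// build.sbt: make a renamed copy of the kafka-clients packages so our
// copy cannot clash with the version Spark itself ships
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("org.apache.kafka.**" -> "shaded.kafka.@1").inAll
)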
I would first check the Scala version of your spark-shell, because this can be a Scala version issue:
scala> util.Properties.versionString
res3: String = version 2.11.8
If not, then check which Spark version you are using and which third-party library versions you depend on, because there may well be one that your Spark version doesn't support.
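In a Spark 2.x shell you can check the Spark version directly from the same prompt:

scala> spark.version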
I hope it helps.

Mapreduce using Scala Error: java.lang.ClassNotFoundException: scala.Predef$

I tried to implement a simple MapReduce job in Scala. However, when I run the packaged jar using the command
hadoop jar hadoop.jar mapreduce.MaxTemperature hdfs://sandbox/user/ajay/input hdfs://sandbox/user/ajay/output
I get the error,
16/09/06 16:06:12 INFO mapreduce.Job: Task Id : attempt_1473177830264_0002_m_000001_2, Status : FAILED Error: java.lang.ClassNotFoundException: scala.Predef$
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at mapreduce.MaxTemperatureMapper.map(MaxTemperatureMapper.scala:17)
at mapreduce.MaxTemperatureMapper.map(MaxTemperatureMapper.scala:9)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Even though I've added the scala-library jar to my classpath, I get the above error.
Hadoop version: 2.7.1.2.3.0.0-255
Scala version: 2.11.8
Java version: 1.7.0_85
Any suggestion is appreciated.
Apart from adding the Scala library to the client's classpath, it has to be added to all the nodes where the task gets executed. This can be achieved with -libjars, which is honored when the job is launched through the ToolRunner:
hadoop jar scala-2.11/hadoop_2.11-0.1.0.jar mapreduce.WordCount -libjars /usr/lib/scala-2.11.8/lib/scala-library.jar
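A minimal driver wired through ToolRunner might look like the sketch below; the object name and the commented-out job setup are assumptions, not the asker's actual code:

import org.apache.hadoop.conf.{Configuration, Configured}
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat
import org.apache.hadoop.util.{Tool, ToolRunner}

object MaxTemperatureDriver extends Configured with Tool {
  override def run(args: Array[String]): Int = {
    val job = Job.getInstance(getConf, "max temperature")
    job.setJarByClass(getClass)
    // job.setMapperClass(classOf[MaxTemperatureMapper]), output types, etc.
    FileInputFormat.addInputPath(job, new Path(args(0)))
    FileOutputFormat.setOutputPath(job, new Path(args(1)))
    if (job.waitForCompletion(true)) 0 else 1
  }

  // ToolRunner parses generic options such as -libjars before calling run()
  def main(args: Array[String]): Unit =
    System.exit(ToolRunner.run(new Configuration(), this, args))
}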

Creating EPStatement in Esper by using Eclipse

I have the Maven plugin installed in my Eclipse and have added the Esper jar to my library.
I tried to run a simple example with Esper, but it fails on the following code:
EPServiceProvider epService = EPServiceProviderManager.getDefaultProvider();
String expression = "select avg(price) from org.myapp.event.OrderEvent.win:time(30 sec)";
EPStatement statement = epService.getEPAdministrator().createEPL(expression);
And I got exception at createEPL(expression):
Exception in thread "main" java.lang.NoClassDefFoundError: org/antlr/v4/runtime/tree/Tree
at com.espertech.esper.core.service.EPAdministratorHelper.<clinit>(EPAdministratorHelper.java:43)
at com.espertech.esper.core.service.EPAdministratorImpl.createEPLStmt(EPAdministratorImpl.java:116)
at com.espertech.esper.core.service.EPAdministratorImpl.createEPL(EPAdministratorImpl.java:66)
at Main.main(Main.java:57)
Caused by: java.lang.ClassNotFoundException: org.antlr.v4.runtime.tree.Tree
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 4 more
It seems that Esper failed to find org/antlr/v4/runtime, but I am pretty sure that this package is in the library.
Did I miss anything needed to run the code?
The jar file that is not in your classpath is antlr-runtime-4.1.jar, a required Esper dependency.
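Since you are building with Maven, declaring it explicitly would look something like the following (the 4.1 version matches the jar named above; adjust it to your Esper release):

<dependency>
  <groupId>org.antlr</groupId>
  <artifactId>antlr4-runtime</artifactId>
  <version>4.1</version>
</dependency>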

ClassNotFoundException when running integration tests with SBT when a library tries to access a class from another one

We have a complex Scala project using many different libraries from the Scala/Java world. We use lift-json for generating JSON output from our data source. Sometimes we need to cache some part of it to Memcached; we use spymemcached for this. In some cases, we cache sequences of lift-json JObjects in Memcached directly, without using any other abstraction over them.
This works great:
In production
When running integration tests from our IDE (IntelliJ)
On our staging environment
However, when running our integration tests using sbt it:test, we get the following exception from spymemcached:
2013-03-07 13:26:04.393 WARN net.spy.memcached.transcoders.SerializingTranscoder: Caught CNFE decoding 4074 bytes of data
java.lang.ClassNotFoundException: net.liftweb.json.JsonAST$JObject
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:266)
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:622)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1593)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1514)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1750)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
at scala.collection.immutable.$colon$colon.readObject(List.scala:404)
at sun.reflect.GeneratedMethodAccessor44.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1872)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
at net.spy.memcached.transcoders.BaseSerializingTranscoder.deserialize(BaseSerializingTranscoder.java:100)
at net.spy.memcached.transcoders.SerializingTranscoder.decode(SerializingTranscoder.java:66)
at net.spy.memcached.transcoders.TranscodeService$1.call(TranscodeService.java:42)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at net.spy.memcached.transcoders.TranscodeService$Task.run(TranscodeService.java:89)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Our tests still run, so this is a low-priority bug, but we want our tests to run in an environment as close as possible to production (where this actually works). I would like spymemcached to know about the existence of lift-json every time.
This looks like a classpath issue, but I am having a hard time pinpointing exactly where it comes from.
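One workaround worth trying, though not verified for this project: fork the JVM for integration tests, so that spymemcached's deserialization thread sees a plain application classpath instead of sbt's layered test classloader. In sbt 0.12 syntax:

fork in IntegrationTest := true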

NoClassDefFoundError => ClassPath$JavaContext when using play start

I've made a little Scala Play 2.0.2 application.
It works fine when I use the play run command, but when I use play start or play clean compile stage + target/start and then try to do a MongoDB insertion with Casbah/Salat, I get the following stack:
[info] application - Can't create user
java.lang.NoClassDefFoundError: scala/tools/nsc/util/ClassPath$JavaContext
at scala.tools.scalap.scalax.rules.scalasig.ScalaSigParser$.scalaSigFromAttribute(ScalaSig.scala:35) ~[scalap-2.9.1.jar:na]
at scala.tools.scalap.scalax.rules.scalasig.ScalaSigParser$.parse(ScalaSig.scala:38) ~[scalap-2.9.1.jar:na]
at com.novus.salat.util.ScalaSigUtil$$anonfun$parseScalaSig0$2.apply(ScalaSigUtil.scala:73) ~[salat-util_2.9.1-0.0.8-SNAPSHOT.jar:0.0.8-SNAPSHOT]
at com.novus.salat.util.ScalaSigUtil$$anonfun$parseScalaSig0$2.apply(ScalaSigUtil.scala:73) ~[salat-util_2.9.1-0.0.8-SNAPSHOT.jar:0.0.8-SNAPSHOT]
at scala.Option.map(Option.scala:133) ~[scala-library.jar:na]
at com.novus.salat.util.ScalaSigUtil$.parseScalaSig0(ScalaSigUtil.scala:73) ~[salat-util_2.9.1-0.0.8-SNAPSHOT.jar:0.0.8-SNAPSHOT]
Caused by: java.lang.ClassNotFoundException: scala.tools.nsc.util.ClassPath$JavaContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:366) ~[na:1.7.0_01]
at java.net.URLClassLoader$1.run(URLClassLoader.java:355) ~[na:1.7.0_01]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_01]
at java.net.URLClassLoader.findClass(URLClassLoader.java:354) ~[na:1.7.0_01]
at java.lang.ClassLoader.loadClass(ClassLoader.java:423) ~[na:1.7.0_01]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) ~[na:1.7.0_01]
Any idea?
It seems the Scala compiler is needed by scalap, so I added the dependency:
"org.scala-lang" % "scala-compiler" % "2.9.1"
And it works fine.
Which leads me to another question:
Why do I need the Scala compiler at runtime? (Play 2/Salat with a scalap dependency)
The problem could also occur if you are missing the async support configuration in context.xml when using Spring MVC.