I was using phantom to connect to a Cassandra database from my Scala code. It worked before, but today, after I upgraded to the latest version (1.26.1), it throws the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at scala.reflect.internal.util.WeakHashSet.<init>(WeakHashSet.scala:19)
at scala.reflect.internal.util.WeakHashSet$.apply(WeakHashSet.scala:429)
at scala.reflect.internal.SymbolTable$perRunCaches$.<init>(SymbolTable.scala:310)
at scala.reflect.internal.SymbolTable.perRunCaches$lzycompute(SymbolTable.scala:304)
at scala.reflect.internal.SymbolTable.perRunCaches(SymbolTable.scala:304)
at scala.reflect.internal.Symbols$class.$init$(Symbols.scala:71)
at scala.reflect.internal.SymbolTable.<init>(SymbolTable.scala:13)
at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:12)
at scala.reflect.runtime.package$.universe$lzycompute(package.scala:16)
at scala.reflect.runtime.package$.universe(package.scala:16)
at com.websudos.diesel.engine.reflection.EarlyInit$class.$init$(EarlyInit.scala:13)
at com.websudos.phantom.db.DatabaseImpl.<init>(DatabaseImpl.scala:42)
at data.MyDatabase.<init>(Users.scala:87)
I'm using Scala 2.11.7. Strangely enough, even after downgrading to the older version the issue remains, so I know it must be something else, but I couldn't figure it out. Any help?
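An error like this usually means a jar compiled against a different Scala binary version has landed on the classpath. As a sketch, assuming an sbt build (the phantom coordinates below are what I believe apply to 1.26.1, so double-check them), make sure every Scala dependency uses `%%` so it matches your `scalaVersion`:

```scala
// build.sbt (sketch): one scalaVersion, and %% everywhere so every
// Scala artifact gets the matching _2.11 binary suffix. A hard-coded
// _2.10 or _2.12 suffix here produces exactly this NoClassDefFoundError.
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "com.websudos" %% "phantom-dsl" % "1.26.1"
)
```

Running `sbt evicted` afterwards shows which transitive versions were replaced, which often reveals the jar that still targets the old Scala version.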
Related
I hit this problem when running my spark-jdbc job to connect to another database, but the error occurs before the connection is even attempted:
Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
My logger wasn't able to be initialized by Scala.
I'm using Scala 2.11, and my Spark build targets the same Scala version.
I can't debug this issue in the IDE, because everything works fine there; the error only happens when I run spark-submit.
Got the same error while using Spark 2.3 when I should have been using Spark 2.2. Apparently that method was made abstract in the later version, so I was getting this error.
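The general fix for this AbstractMethodError is to compile against exactly the Spark version the cluster runs. A minimal sbt sketch, assuming a 2.2.0 cluster on Scala 2.11 (substitute your own versions):

```scala
// build.sbt (sketch): the compile-time Spark version must match the
// spark-submit runtime; "provided" keeps Spark's own jars out of the
// assembly so the cluster's copies are the only ones on the classpath.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.2.0" % "provided"
)
```

This also explains why the IDE looks fine: it runs against your compile-time jars, while spark-submit runs against the cluster's.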
After upgrading to version 0.2.2 of google-maps-services, I am trying to do the following:
GeoApiContext context = new GeoApiContext.Builder().apiKey("MY KEY").build();
I am getting the error detailed below:
Exception in thread "main" java.lang.NoClassDefFoundError: okhttp3/OkHttpClient$Builder
Hope someone can help.
Thanks,
Amir
There was a reference to an older version of okhttp in another project I had included. Took two hours, but I got this to work.
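If anyone hits the same conflict, the general fix is to force the okhttp version that google-maps-services expects. A sketch for an sbt build (the 3.9.0 version number is only an assumption; use whatever your dependency tree or `sbt evicted` says google-maps-services actually wants):

```scala
// build.sbt (illustrative): override the okhttp version pinned by an older
// sibling dependency, so okhttp3.OkHttpClient$Builder is actually present
// on the runtime classpath.
dependencyOverrides += "com.squareup.okhttp3" % "okhttp" % "3.9.0"
```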
I am using Cassandra 3.2.1 with Spark and have included all the required jars. When I try to connect to Cassandra from Java through Spark, I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Lscala/collection/immutable/StringOps;
at akka.util.Duration$.<init>(Duration.scala:76)
at akka.util.Duration$.<clinit>(Duration.scala)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:120)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:426)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:103)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:98)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:142)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at spark.Sample.run(Sample.java:13)
at spark.Sample.main(Sample.java:23)
Any idea what I am missing?
See the jars and my sample code in the image below. I don't know where I am making a mistake.
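A NoSuchMethodError on `scala.Predef$.augmentString` almost always means two different Scala versions are mixed on the classpath, which is easy to do when jars are copied in by hand. A sketch of letting one build tool resolve a consistent set instead (sbt here; the Spark and connector versions are placeholders, so pick the pair that matches your cluster):

```scala
// build.sbt (sketch): one scalaVersion and sbt-resolved dependencies,
// so a single scala-library and a matching Akka end up on the classpath
// instead of a hand-assembled mix of jars built for different Scala versions.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.4.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0"
)
```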
Sometimes when updating a dependency version for my project, when I would run the project I would get the following error:
ERROR 09:31:34:241 apply$mcV$sp - Class could not be loaded and/or registered: scala.Enumeration$Val
ERROR 09:31:34:247 apply$mcV$sp - exception caught during akka-kryo-serialization startup: java.lang.ClassNotFoundException: scala.Enumeration$Val
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
I usually got lucky and was able to fix this by tweaking the versions of my dependencies. That stopped working when I updated my Akka version from 2.3.6 to 2.3.12, specifically akka-contrib, which contains akka-cluster, akka-remote and akka-persistence.
It turns out that the problem was actually introduced in the transition between Akka 2.3.8 and 2.3.9. Looking at the change log, I found that between these two versions the Scala version was upgraded from 2.11.4 to 2.11.5.
My project was building with Scala 2.11.4. Upgrading to 2.11.5 fixed the problem.
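In sbt terms the whole fix amounts to one line, plus keeping the Akka artifacts on the version whose change log you checked (a sketch; verify the exact Scala version against your own Akka release notes):

```scala
// build.sbt (sketch): Akka 2.3.9+ was built with Scala 2.11.5, so the
// project must compile with at least that Scala patch release.
scalaVersion := "2.11.5"

libraryDependencies += "com.typesafe.akka" %% "akka-contrib" % "2.3.12"
```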
I couldn't find any help online for this, and lost a couple days trying to figure this out. I'm hoping that this will help someone else out.
I recently downgraded to Scala 2.8, and now whenever I try to initialise an actor, I get the following error message:
java.lang.NoSuchMethodError: scala.actors.ReactorCanReply$class.$init$(Lscala/actors/ReactorCanReply;)V
Apparently this guy had the same problem, but no solution was found. Has anyone else encountered this and solved it?
I thought that maybe there was some confusion going on in the background between Scala 2.8 and 2.9 files, so I've tried uninstalling and reinstalling both Scala and Eclipse, deleting all my binaries and rebuilding, and even creating a new Eclipse project and copying my source files in, but the problem persists.
My stack trace:
Exception in thread "main" java.lang.NoSuchMethodError: scala.actors.ReactorCanReply$class.$init$(Lscala/actors/ReactorCanReply;)V
at uk.mike.blackjack.PlayerReceiver.<init>(PlayerReceiver.scala:11)
at uk.mike.blackjack.Blackjack$.main(Blackjack.scala:141)
at uk.mike.blackjack.Blackjack.main(Blackjack.scala)
The JVM bytecode compiled from Scala is not binary-compatible across Scala versions. You must recompile all your Scala files and their dependencies whenever you upgrade to a new Scala version prior to the Scala 2.10 releases.
By the way, Scala 2.10 promises to maintain binary compatibility between all 2.10.x versions going forward.
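In practice that means pinning one Scala version in the build and doing a full clean rebuild, so no class files compiled against another release survive. A minimal sbt sketch (the version number is just an example):

```scala
// build.sbt (sketch): pin the Scala version explicitly; stale classes
// compiled against a different release are what trigger these
// NoSuchMethodError failures at runtime.
scalaVersion := "2.8.2"
```

Then run `sbt clean compile`. In Eclipse, the equivalent is a full Project > Clean after switching the Scala installation.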