Why do Scala 2.11 and Spark with scallop lead to "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror"?

I am using Scala 2.11, Spark, and Scallop (https://github.com/scallop/scallop). I used sbt to build an application fat jar without the Spark "provided" dependencies (the jar is at: analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar).
I am able to run the program fine in sbt.
I tried to run it from the command line as follows:
time ADD_JARS=analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar java -cp /Applications/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar com.dtex.analysis.transform.GenUserSummaryView -d /Users/arun/DataSets/LME -p output -s txt -o /Users/arun/tmp/LME/LME
I get the following error message:
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
    at org.rogach.scallop.package$.<init>(package.scala:37)
    at org.rogach.scallop.package$.<clinit>(package.scala)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf.delayedEndpoint$com$dtex$analysis$transform$GenUserSummaryView$Conf$1(GenUserSummaryView.scala:27)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf$delayedInit$body.apply(GenUserSummaryView.scala:26)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at org.rogach.scallop.AfterInit$class.delayedInit(AfterInit.scala:12)
    at org.rogach.scallop.ScallopConf.delayedInit(ScallopConf.scala:26)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf.<init>(GenUserSummaryView.scala:26)
    at com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala:54)
    at com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)

The issue is that you've mixed incompatible Scala versions: Spark was compiled with Scala 2.10, and you were trying to use Scala 2.11.
Move everything to Scala 2.10 and make sure your sbt build is updated accordingly (a sketch is below).
You may also try to compile the Spark sources for Scala 2.11.7 and use those instead.
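For illustration, a minimal build.sbt sketch assuming a Spark distribution built against Scala 2.10; the artifact versions here are illustrative, not taken from the question:

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.10), so the Spark and Scallop artifacts
  // are resolved for the same Scala version the project is compiled with
  "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
  "org.rogach"       %% "scallop"    % "0.9.5"
)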

I also encountered the same issue with spark-submit. In my case:
Spark job was compiled with: Scala 2.10.8
Scala version Spark was compiled with on the cluster: Scala 2.11.8
To check the Spark and Scala versions on the cluster, use the "spark-shell" command (see the snippet below).
After recompiling the Spark job source with Scala 2.11.8 and resubmitting, the job worked!
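For example, from inside spark-shell on the cluster you can print both versions directly (sc is the SparkContext the shell creates for you):

println(sc.version)                          // Spark version of the cluster installation
println(scala.util.Properties.versionString) // Scala version the shell runs on, e.g. "version 2.11.8"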

Related

How to enable Partial Unification in Spark REPL with Scala 2.11.8?

I have Scala code written in Scala 2.11.12 using the partial-unification compiler option, which I would like to run in a Spark 2.2.2 REPL.
With a Spark version compiled against Scala 2.11.12 (i.e. 2.3+), this is possible in the Spark REPL via :settings -Ypartial-unification, and the code executes.
I want to run this on Spark 2.2.2, which is compiled against Scala 2.11.8.
To do this, I have downloaded the jar with the partial unification compiler plugin (source from: https://github.com/milessabin/si2712fix-plugin), which backports this setting.
I've experimented with a plain Scala 2.11.8 REPL (adding the jar to the classpath, which seems too rudimentary) and haven't managed to get it working there before trying it in Spark. Does anyone know how to do this, or whether adding a compiler setting to a REPL via a jar is simply not possible? A minimal illustration of the kind of code involved follows below.
Any other advice appreciated!
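For reference, this is the kind of code the flag affects; a minimal SI-2712 illustration (not from the asker's code base):

def foo[F[_], A](fa: F[A]): String = fa.toString

foo { x: Int => x * 2 }
// Without -Ypartial-unification this fails to compile ("no type parameters for method foo
// exist so that it can be applied to (Int => Int)"). With the flag enabled, e.g. via
// :settings -Ypartial-unification in a 2.11.9+ REPL, the compiler infers F[_] = Int => ?
// and the call compiles. On 2.11.8 the flag does not exist, which is why the si2712fix
// plugin would have to supply the equivalent behaviour.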

Upgrading from Spark 1.6 to 2.1 - Incompatible Class Change Error

I'm upgrading from Spark 1.6 to the 2.1 version (HortonWorks Distribution). Below I describe the Stage 1 and Stage 2 scenarios: Stage 1 executes successfully, Stage 2 fails.
Stage 1
POM XML dependencies for Spark 1.6.3 (which works absolutely fine) are:
scala tools version 2.10
scala version 2.10.5
scala compiler version 2.10.6
spark-core_2.10
spark-sql_2.10
spark version 1.6.3
There is a common set of libraries used:
commons-csv-1.4.jar
commons-configuration2-2.1.1.jar
commons-beanutils-1.9.2.jar
commons-email-1.4.jar
javax.mail-1.5.2.jar
sqoop-1.4.6.2.3.0.12-7.jar
avro-mapred-1.8.2.jar
avro-1.8.2.jar
guava-14.0.jar
commons-logging-1.1.3.jar
jackson-module-scala_2.10-2.4.4.jar
jackson-databind-2.4.4.jar
jackson-core-2.4.4.jar
xdb6.jar
jackson-mapper-asl-1.9.13.jar
ojdbc7-12.1.0.2.jar
Stage 2
When I change the dependencies and Spark version in the POM.xml to:
scala tools version 2.11
scala version 2.11.8
scala compiler version 2.11.8
spark-core_2.11
spark-sql_2.11
spark version 2.1.0
Also, from the common set of libraries, I'm only changing:
jackson-module-scala_2.11-2.6.5.jar
jackson-core-2.6.5.jar
jackson-databind-2.6.5.jar
When I build and try to run it on the cluster with the Spark configuration as 2.1 and Scala as 2.11.8, it fails with the error below.
INFO: Exception in thread "main" java.lang.IncompatibleClassChangeError: class com.xxx.xxx.xxx.DQListener has interface org.apache.spark.scheduler.SparkListener as super class
I'm sure the issue is with importing the correct jars, but I'm not able to find out which one. I would appreciate it if anybody could help resolve this.
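For context, this particular IncompatibleClassChangeError usually stems from the fact that org.apache.spark.scheduler.SparkListener was a trait in Spark 1.x and became an abstract class in Spark 2.x, so a listener compiled against one major version and run against the other breaks at class-loading time. A minimal sketch of a listener compiled and run consistently against the Spark 2.1 artifacts (reusing the DQListener name from the error purely for illustration):

import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd}

// In Spark 2.x, SparkListener is an abstract class with no-op defaults,
// so only the callbacks of interest need to be overridden.
class DQListener extends SparkListener {
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    println(s"Job ${jobEnd.jobId} finished with result ${jobEnd.jobResult}")
  }
}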

spark-submit on standalone cluster complain about scala-2.10 jars not exist

I'm new to Spark and downloaded pre-compiled Spark binaries from Apache (spark-2.1.0-bin-hadoop2.7).
When submitting my Scala (2.11.8) uber jar, the cluster throws an error:
java.lang.IllegalStateException: Library directory '/root/spark/assembly/target/scala-2.10/jars' does not exist; make sure Spark is built
I'm not running Scala 2.10, and Spark isn't compiled (as far as I know) with Scala 2.10.
Could it be that one of my dependencies is based on Scala 2.10 ?
Any suggestions what can be wrong ?
Not sure what is wrong with the pre-built spark-2.1.0, but I've just downloaded Spark 2.2.0 and it is working great.
Try setting SPARK_HOME="location of your Spark installation" on your system or in your IDE.

Scala Runtime errors calling program on Spark Job Server

I used Spark 1.6.2 and Scala 2.11.8 to compile my project. The generated uber jar with dependencies is placed inside Spark Job Server, which seems to use Scala 2.10.4 (SCALA_VERSION=2.10.4 is specified in its .sh file).
There is no problem starting the server or uploading the context/app jars, but at runtime the following error occurs:
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror
Why do Scala 2.11 and Spark with scallop lead to "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror"? talks about using Scala 2.10 to compile the sources. Is that true?
Any suggestions please...
Use Scala 2.10.4 to compile your project. Otherwise you need to compile Spark with 2.11 too.

Akka cluster error

Is it mandatory to use sbt for cluster usage in Akka? I have tried adding a few jars to the classpath. While compilation goes well, running the relevant class generates an error.
scala -cp
../akka-2.2.1/lib/akka/akka-cluster_2.10-2.2.1.jar:../akka-2.2.1/lib/akka/netty-3.6.6.Final.jar:../akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:../akka-2.2.1/lib/akka/protobuf-java-2.4.1.jar:./
TransformationFrontend 2551
Here's the issue encountered:
java.lang.NoSuchMethodException: akka.cluster.ClusterActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
    at java.lang.Class.getConstructor0(Class.java:2800)
    at java.lang.Class.getDeclaredConstructor(Class.java:2043)
This is the official Akka cluster example. Can someone shed some light on this?
The issue here is probably that the akka-actor.jar in your Scala distribution is Akka 2.1.x while you're trying to use Akka 2.2.x.
You'll have to run your code with the java command and add the scala-library.jar, the correct akka-actor.jar, and typesafe-config.jar to the classpath.
Are you using Scala 2.10? That is the Scala version you need for Akka 2.2.
What does the following yield?
scala -version
It should show something like
$ scala -version
Scala code runner version 2.10.3 -- Copyright 2002-2013, LAMP/EPFL