My Scala application, built with Scala 2.11.12, throws the following error while executing a certain piece of code.
The environment configuration is as follows:
Scala IDE with Eclipse: version 4.7
Eclipse Version: 2019-06 (4.12.0)
Spark Version: 2.4.4
Java Version: "1.8.0_221"
However, the same configuration works fine in the Eclipse IDE with Scala version 2.11.11.
Exception in thread "main" java.lang.NumberFormatException: Not a version: 9
at scala.util.PropertiesTrait$class.parts$1(Properties.scala:184)
at scala.util.PropertiesTrait$class.isJavaAtLeast(Properties.scala:187)
at scala.util.Properties$.isJavaAtLeast(Properties.scala:17)
at scala.tools.util.PathResolverBase$Calculated$.javaBootClasspath(PathResolver.scala:276)
at scala.tools.util.PathResolverBase$Calculated$.basis(PathResolver.scala:283)
at scala.tools.util.PathResolverBase$Calculated$.containers$lzycompute(PathResolver.scala:293)
at scala.tools.util.PathResolverBase$Calculated$.containers(PathResolver.scala:293)
at scala.tools.util.PathResolverBase.containers(PathResolver.scala:309)
at scala.tools.util.PathResolver.computeResult(PathResolver.scala:341)
at scala.tools.util.PathResolver.computeResult(PathResolver.scala:332)
at scala.tools.util.PathResolverBase.result(PathResolver.scala:314)
at scala.tools.nsc.backend.JavaPlatform$class.classPath(JavaPlatform.scala:28)
at scala.tools.nsc.Global$GlobalPlatform.classPath(Global.scala:115)
at scala.tools.nsc.Global.scala$tools$nsc$Global$$recursiveClassPath(Global.scala:131)
at scala.tools.nsc.Global.classPath(Global.scala:128)
at scala.tools.nsc.backend.jvm.BTypesFromSymbols.<init>(BTypesFromSymbols.scala:39)
at scala.tools.nsc.backend.jvm.BCodeIdiomatic.<init>(BCodeIdiomatic.scala:24)
at scala.tools.nsc.backend.jvm.BCodeHelpers.<init>(BCodeHelpers.scala:23)
at scala.tools.nsc.backend.jvm.BCodeSkelBuilder.<init>(BCodeSkelBuilder.scala:25)
at scala.tools.nsc.backend.jvm.BCodeBodyBuilder.<init>(BCodeBodyBuilder.scala:25)
at scala.tools.nsc.backend.jvm.BCodeSyncAndTry.<init>(BCodeSyncAndTry.scala:21)
at scala.tools.nsc.backend.jvm.GenBCode.<init>(GenBCode.scala:47)
at scala.tools.nsc.Global$genBCode$.<init>(Global.scala:675)
at scala.tools.nsc.Global.genBCode$lzycompute(Global.scala:671)
at scala.tools.nsc.Global.genBCode(Global.scala:671)
at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.serialVUID(GenASM.scala:1240)
at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1329)
at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.emitFor$1(GenASM.scala:198)
at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:204)
at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1528)
at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1513)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal.wrapInPackageAndCompile(ToolBoxFactory.scala:197)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal.compile(ToolBoxFactory.scala:252)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$$anonfun$compile$2.apply(ToolBoxFactory.scala:429)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$$anonfun$compile$2.apply(ToolBoxFactory.scala:422)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$.liftedTree2$1(ToolBoxFactory.scala:355)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$.apply(ToolBoxFactory.scala:355)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl.compile(ToolBoxFactory.scala:422)
at com.slb.itdataplatform.dq.DataQualityValidation$$anonfun$compile$1.apply(DataQualityValidation.scala:112)
at scala.util.Try$.apply(Try.scala:192)
at com.slb.itdataplatform.dq.DataQualityValidation$.compile(DataQualityValidation.scala:109)
at com.slb.itdataplatform.dq.DataQualityValidation$.generateVerifier(DataQualityValidation.scala:104)
at com.slb.itdataplatform.dq.DataQualityValidation$.main(DataQualityValidation.scala:49)
at com.slb.itdataplatform.dq.DataQualityValidation.main(DataQualityValidation.scala)
I could keep working with that environment configuration, but Spark 2.4.4's underlying Scala version is 2.11.12, so I want to use the same version in my application to avoid conflicts (my Spark apps otherwise fail to initialize with "Unable to initialize Spark job").
What could be the root cause of this error, and how can it be resolved?
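A quick way to see which Scala library and Java version the failing JVM actually picks up (the exception is thrown from scala.util.Properties.isJavaAtLeast, visible at the top of the trace) is a small check like the one below. This is only a diagnostic sketch, meant to be run on the same classpath as the application:

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Scala library actually on the runtime classpath
    println(s"Scala library: ${scala.util.Properties.versionNumberString}")
    // JVM version properties that isJavaAtLeast has to parse
    println(s"Java version:  ${scala.util.Properties.javaVersion}")
    println(s"Java spec:     ${scala.util.Properties.javaSpecVersion}")
    // Older Scala libraries throw "NumberFormatException: Not a version: 9"
    // here when run on Java 9+, which matches the symptom in the trace above
    println(s"Java >= 1.8:   ${scala.util.Properties.isJavaAtLeast("1.8")}")
  }
}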
Related
I am trying to run a Flink (v 1.13.1) application on EMR (v 5.34.0).
My Flink application uses Scallop (v 4.1.0) to parse the arguments passed.
The Scala version used for the Flink application is 2.12.7.
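For context, the arguments are parsed with a ScallopConf along the following lines (a minimal sketch; the option names here are illustrative, not the real ones):

import org.rogach.scallop._

// Minimal Scallop 4.x configuration: declare the options, then call verify()
class JobConf(arguments: Seq[String]) extends ScallopConf(arguments) {
  val inputPath = opt[String](required = true, descr = "input path")
  val parallelism = opt[Int](default = Some(1))
  verify()
}
// Usage: val conf = new JobConf(args); conf.inputPath()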
I keep getting the error below when I submit the Flink application to the cluster. Any clue or help is highly appreciated.
java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at org.rogach.scallop.Scallop.<init>(Scallop.scala:63)
at org.rogach.scallop.Scallop$.apply(Scallop.scala:13)
Resolved the issue by downgrading Scala to 2.11. The Flink 1.13.1 Scala shell REPL on EMR reported Scala version 2.11.12, so I downgraded to that version of Scala and the problem disappeared.
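A sketch of the corresponding build change (shown in sbt form; the point is that the %% cross-versioned artifacts resolve to the _2.11 builds matching the EMR Flink runtime):

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // Flink 1.13.x still publishes Scala 2.11 artifacts
  "org.apache.flink" %% "flink-scala"           % "1.13.1" % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % "1.13.1" % "provided",
  // assumes a 2.11 build of Scallop is published for this version
  "org.rogach"       %% "scallop"               % "4.1.0"
)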
I'm upgrading from Spark 1.6 to 2.1 (Hortonworks distribution). The Stage 1 and Stage 2 scenarios are explained below: Stage 1 executes successfully, while Stage 2 fails.
Stage 1
The POM XML dependencies for Spark 1.6.3 (which works absolutely fine) are:
scala tools version 2.10
scala version 2.10.5
scala compiler version 2.10.6
spark-core_2.10
spark-sql_2.10
spark version 1.6.3
There is a common set of libraries used, which are:
commons-csv-1.4.jar
commons-configuration2-2.1.1.jar
commons-beanutils-1.9.2.jar
commons-email-1.4.jar
javax.mail-1.5.2.jar
sqoop-1.4.6.2.3.0.12-7.jar
avro-mapred-1.8.2.jar
avro-1.8.2.jar
guava-14.0.jar
commons-logging-1.1.3.jar
jackson-module-scala_2.10-2.4.4.jar
jackson-databind-2.4.4.jar
jackson-core-2.4.4.jar
xdb6.jar
jackson-mapper-asl-1.9.13.jar
ojdbc7-12.1.0.2.jar
Stage 2
I change the dependencies and Spark version in POM.xml to:
scala tools version 2.11
scala version 2.11.8
scala compiler version 2.11.8
spark-core_2.11
spark-sql_2.11
spark version 2.1.0
Also, from the common set of libraries, I'm only changing the following (see the sketch after this list):
jackson-module-scala_2.11-2.6.5.jar
jackson-core-2.6.5.jar
jackson-databind-2.6.5.jar
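Rendered as sbt coordinates for comparison (the project actually uses Maven; this sketch only shows how the Scala suffix and library versions are meant to line up in Stage 2):

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark"             %% "spark-core"           % "2.1.0" % "provided",
  "org.apache.spark"             %% "spark-sql"            % "2.1.0" % "provided",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.5",
  "com.fasterxml.jackson.core"   %  "jackson-core"         % "2.6.5",
  "com.fasterxml.jackson.core"   %  "jackson-databind"     % "2.6.5"
)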
When I take a build and try to run it on the cluster with the Spark configuration as 2.1 and Scala as 2.11.8, it fails with the error below.
INFO: Exception in thread "main" java.lang.IncompatibleClassChangeError: class com.xxx.xxx.xxx.DQListener has interface org.apache.spark.scheduler.SparkListener as super class
I'm sure the issue is with importing the correct jars, but I'm not able to find out which one. I would appreciate it if anybody could help resolve this.
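For reference, the listener is presumably declared along these lines when compiled against the Spark 2.x API (a sketch only; the real DQListener body is not shown in the question):

import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd}

// In Spark 2.x, SparkListener is an abstract class, so a subclass compiled
// against one major Spark version must run against a compatible version.
class DQListener extends SparkListener {
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    println(s"Job ${jobEnd.jobId} finished")
}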
I got the error below. I'm not sure why this is the case, because there is a coalesce method in org.apache.spark.rdd.RDD.
Any ideas?
Am I running an incompatible version of Spark and org.apache.spark.rdd.RDD?
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.coalesce$default$3(IZ)Lscala/math/Ordering;
It was because some part of your code or project dependencies was compiled against the old (pre-2.0.0) Spark 'coalesce' API, whose binary signature changed in newer Spark versions, so the compiled call no longer matches the method available at runtime.
To fix this, you can either downgrade your Spark runtime environment to a version below 2.0.0, or upgrade your SDK's Spark version to 2.0.0 or above and update your project dependencies to be compatible with Spark 2.0.0 or above.
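As an illustration of why the missing method is a synthetic default-argument accessor (a sketch assuming spark-core is on the compile classpath):

import org.apache.spark.rdd.RDD

object CoalesceExample {
  // Relying on coalesce's default arguments makes the compiler emit calls to
  // synthetic methods such as coalesce$default$3, so the Spark version on the
  // runtime classpath must be binary-compatible with the compile-time one.
  def shrink(rdd: RDD[String]): RDD[String] = rdd.coalesce(4)
}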
For more details, please see these threads:
https://github.com/twitter/algebird/issues/549
https://github.com/EugenCepoi/algebird/commit/0dc7d314cba3be588897915c8dcfb14964933c31
As I suspected, this is a library compatibility issue. Everything works (no code change) after downgrading Spark alone.
Before:
scala 2.11.8
spark 2.0.1
Java 1.8.0_92
After:
scala 2.11.8
spark 1.6.2
Java 1.8.0_92
OS: OSX 10.11.6
I used Spark 1.6.2 and Scala 2.11.8 to compile my project. The generated uber jar with dependencies is placed inside Spark Job Server, which seems to use Scala 2.10.4 (SCALA_VERSION=2.10.4 is specified in the .sh file).
There is no problem starting the server or uploading context/app jars, but at runtime the following error occurs:
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror
The linked question "Why do Scala 2.11 and Spark with scallop lead to 'java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror'?" talks about using Scala 2.10 to compile the sources. Is that true?
Any suggestions please...
Use Scala 2.10.4 to compile your project. Otherwise you need to compile Spark with 2.11 too.
I am using Scala 2.11, Spark, and Scallop (https://github.com/scallop/scallop). I used sbt to build an application fat jar without Spark provided dependencies (this is at: analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar)
I am able to run the program fine in sbt.
I tried to run it from the command line as follows:
time ADD_JARS=analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar java -cp /Applications/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar com.dtex.analysis.transform.GenUserSummaryView -d /Users/arun/DataSets/LME -p output -s txt -o /Users/arun/tmp/LME/LME
I get the following error message:
Exception in thread "main" java.lang.NoSuchMethodError:
scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
at org.rogach.scallop.package$.(package.scala:37) at
org.rogach.scallop.package$.(package.scala) at
com.dtex.analysis.transform.GenUserSummaryView$Conf.delayedEndpoint$com$dtex$analysis$transform$GenUserSummaryView$Conf$1(GenUserSummaryView.scala:27)
at
com.dtex.analysis.transform.GenUserSummaryView$Conf$delayedInit$body.apply(GenUserSummaryView.scala:26)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40) at
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at org.rogach.scallop.AfterInit$class.delayedInit(AfterInit.scala:12)
at org.rogach.scallop.ScallopConf.delayedInit(ScallopConf.scala:26)
at
com.dtex.analysis.transform.GenUserSummaryView$Conf.(GenUserSummaryView.scala:26)
at
com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala:54)
at
com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)
The issue is that you've used incompatible Scala versions, i.e. Spark was compiled with Scala 2.10 and you were trying to use Scala 2.11.
Move everything to Scala 2.10 and make sure you update your SBT build as well.
You may also try to compile the Spark sources for Scala 2.11.7 and use that instead.
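A sketch of the first option in sbt terms (the Scallop version below is an assumption, chosen only to illustrate a Scala 2.10 build):

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
  // assumed 2.10-compatible Scallop release, shown for illustration only
  "org.rogach"       %% "scallop"    % "0.9.5"
)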
I also encountered the same issue with spark-submit. In my case:
The Spark job was compiled with: Scala 2.10.8
Scala version with which Spark was compiled on the cluster: Scala 2.11.8
To check the Spark and Scala versions on the cluster, use the spark-shell command.
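For example, inside spark-shell both versions can be read directly (sc is the SparkContext that the shell provides):

scala> sc.version                        // Spark version on the cluster
scala> util.Properties.versionString     // Scala version the shell runs on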
After compiling the Spark job source with Scala 2.11.8 and submitting the job again, it worked!