Getting exception java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;) while using DataFrames - Scala

I am receiving a "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)" error while using DataFrames in a Scala app run on Spark. However, if I work only with RDDs and not DataFrames, no such error comes up with the same pom and settings. Also, going through other posts with the same error, it is mentioned that the Scala version has to be 2.10 because Spark is not compatible with Scala 2.11, and I am using Scala 2.10 with Spark 2.0.0.
Below is the snip from pom:
<properties>
<spark-assembly>/usr/lib/spark/lib/spark-assembly.jar</spark-assembly>
<encoding>UTF-8</encoding>
<hadoop.version>2.7.1</hadoop.version>
<hbase.version>1.1.1</hbase.version>
<scala.version>2.10.5</scala.version>
<scala.tools.version>2.10</scala.tools.version>
<spark.version>2.0.0</spark.version>
<phoenix.version>4.7.0-HBase-1.1</phoenix.version>
</properties>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-server</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
</dependencies>
Error:
16/10/19 02:57:26 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
at com.abc.xyz.Compare$.main(Compare.scala:64)
at com.abc.xyz.Compare.main(Compare.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
16/10/19 02:57:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;)
16/10/19 02:57:26 INFO spark.SparkContext: Invoking stop() from shutdown hook

Change the Scala version:
<scala.version>2.11.8</scala.version>
<scala.tools.version>2.11</scala.tools.version>
and add:
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>${scala.version}</version>
</dependency>
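Putting the two suggestions together, the relevant pom sections would look roughly like this (a sketch only; the versions are the ones suggested above, and every other Spark artifact in the pom must keep the matching _2.11 suffix via scala.tools.version):

```xml
<properties>
    <scala.version>2.11.8</scala.version>
    <scala.tools.version>2.11</scala.tools.version>
    <spark.version>2.0.0</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- scala-reflect provides JavaUniverse.runtimeMirror, the method named in the error -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-reflect</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- Spark artifacts pick up the _2.11 suffix through scala.tools.version -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

The point of aligning all of these is that Scala binary versions (2.10 vs 2.11) are not compatible with each other, so mixing a 2.10 scala-library with Spark jars built for 2.11 produces exactly this kind of NoSuchMethodError at runtime.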

I also faced this error, and it is purely a version issue.
Your Scala version is not compatible, or perhaps you are using the correct version but the IntelliJ libraries still hold the old one.
Quick fix:
I was using Spark 2.2.0 and Scala 2.10.4, which I then changed to Scala 2.11.8. After that, do the following:
1) Right-click on the IntelliJ module
2) Open module settings
3) Go to Libraries and clear all of them
4) Rebuild
After doing the above, the issue was resolved for me.

Related

Embedded Kafka & Spark 2.3 version mismatch issue

When I use this dependency:
<dependency>
<groupId>net.manub</groupId>
<artifactId>scalatest-embedded-kafka_2.11</artifactId>
<version>2.0.0</version>
<scope>test</scope>
</dependency>
With
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.11</artifactId>
<version>2.3.0</version>
</dependency>
I run into this error:
Cause: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.reader.SupportsScanUnsafeRow
Trying to figure out which version of 'scalatest-embedded-kafka' will work with Spark 2.3.
Any ideas?

Using OrientDB 3.0.0m1 java API with TinkerPop 3

I am attempting to issue TinkerPop 3 (OrientDB 3.0 snapshot) requests via the Java API. I am using OCommandGremlin as follows [should match 2 Vs]:
OGremlinHelper.global().create();
OCommandRequest req = graph.command(new OCommandGremlin("g.V().has('name', 'fast').both()"));
Iterable<Vertex> result2 = req.execute();
....
It seems to be looking for TinkerPop 2.x class com.tinkerpop.gremlin.groovy.jsr223.GremlinGroovyScriptEngine
I get the following error:
WARNING: $ANSI{green {db=demodb}} GREMLIN language not available (not in classpath)
Exception in thread "main" java.lang.NoClassDefFoundError: com/tinkerpop/gremlin/groovy/jsr223/GremlinGroovyScriptEngine
at com.orientechnologies.orient.graph.gremlin.OGremlinEngineThreadLocal.get(OGremlinEngineThreadLocal.java:61)
at com.orientechnologies.orient.graph.gremlin.OGremlinHelper.getGremlinEngine(OGremlinHelper.java:165)
at com.orientechnologies.orient.graph.gremlin.OGremlinHelper.execute(OGremlinHelper.java:83)
at com.orientechnologies.orient.graph.gremlin.OGremlinHelper.execute(OGremlinHelper.java:75)
at com.orientechnologies.orient.graph.gremlin.OCommandGremlinExecutor.execute(OCommandGremlinExecutor.java:59)
at com.orientechnologies.orient.core.storage.impl.local.OAbstractPaginatedStorage.executeCommand(OAbstractPaginatedStorage.java:2480)
at com.orientechnologies.orient.core.storage.impl.local.OAbstractPaginatedStorage.command(OAbstractPaginatedStorage.java:2425)
at com.orientechnologies.orient.core.command.OCommandRequestTextAbstract.execute(OCommandRequestTextAbstract.java:68)
at com.tinkerpop.blueprints.impls.orient.OrientGraphCommand.execute(OrientGraphCommand.java:49)
at BasicGremlinDriver.main(BasicGremlinDriver.java:202)
Caused by: java.lang.ClassNotFoundException: com.tinkerpop.gremlin.groovy.jsr223.GremlinGroovyScriptEngine
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
My maven file is as follows:
<dependencies>
<dependency>
<groupId>com.orientechnologies</groupId>
<artifactId>orientdb-graphdb</artifactId>
<version>3.0.0m1</version>
</dependency>
<dependency>
<groupId>com.orientechnologies</groupId>
<artifactId>orientdb-spatial</artifactId>
<version>3.0.0m1</version>
</dependency>
<dependency>
<groupId>com.orientechnologies</groupId>
<artifactId>orientdb-lucene</artifactId>
<version>3.0.0m1</version>
</dependency>
<dependency>
<groupId>org.apache.tinkerpop</groupId>
<artifactId>gremlin-core</artifactId>
<version>3.2.4</version>
</dependency>
<dependency>
<groupId>org.apache.tinkerpop</groupId>
<artifactId>tinkergraph-gremlin</artifactId>
<version>3.2.4</version>
</dependency>
<dependency>
<groupId>org.apache.tinkerpop</groupId>
<artifactId>gremlin-groovy</artifactId>
<version>3.2.4</version>
</dependency>
....
I also set META-INF/services/javax.script.ScriptEngineFactory to org.apache.tinkerpop.gremlin.groovy.jsr223.GremlinGroovyScriptEngineFactory, which is the TinkerPop 3 version, and I can find it on my classpath.
Feedback appreciated.
Thanks
JGZ
If you want to use Apache Gremlin, get rid of:
<dependency>
<groupId>com.orientechnologies</groupId>
<artifactId>orientdb-graphdb</artifactId>
<version>3.0.0m1</version>
</dependency>
and add:
<dependency>
<groupId>com.orientechnologies</groupId>
<artifactId>orientdb-gremlin</artifactId>
<version>3.0.0m1</version>
</dependency>
orientdb-graphdb binds to TinkerPop 2.6; it is kept for backward compatibility. Support for Apache Gremlin 3.x is provided by the new artifact. Pay attention to the package name; the new package is:
org.apache.tinkerpop.gremlin.orientdb
Note that in 3.0 we provide a native multi-model API that allows you to work with graphs without additional modules:
http://orientdb.com/docs/3.0.x/java/Java-MultiModel-API.html
Try this:
<dependency>
<groupId>com.tinkerpop.gremlin</groupId>
<artifactId>gremlin-groovy</artifactId>
<version>2.6.0</version>
</dependency>
Hope it helps.
Regards

NullPointerException in Salat

While making any type of call to Mongo from my Scala application, I get this NullPointerException. Can somebody please help?
I am using Mongo 3.0.1 and my Scala version is 2.9.0. The other dependencies are as follows:
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>casbah_2.9.1</artifactId>
<type>pom</type>
<version>2.6.0</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>2.11.1</version>
</dependency>
<dependency>
<groupId>com.novus</groupId>
<artifactId>salat-core_2.9.1</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>com.google.code.morphia</groupId>
<artifactId>morphia</artifactId>
<version>0.99</version>
</dependency>
Error :
Caused by: java.lang.NullPointerException
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2$$anonfun$apply$1.apply(PrettyPrinters.scala:74)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2$$anonfun$apply$1.apply(PrettyPrinters.scala:74)
at scala.Option.map(Option.scala:134)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2.apply(PrettyPrinters.scala:74)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2.apply(PrettyPrinters.scala:74)
at scala.Option.flatMap(Option.scala:147)
at com.novus.salat.util.GraterPrettyPrinter$class.safeDefault(PrettyPrinters.scala:74)
at com.novus.salat.util.ConstructorInputPrettyPrinter$.safeDefault(PrettyPrinters.scala:108)
at com.novus.salat.util.ConstructorInputPrettyPrinter$$anonfun$apply$3.apply(PrettyPrinters.scala:134)
at com.novus.salat.util.ConstructorInputPrettyPrinter$$anonfun$apply$3.apply(PrettyPrinters.scala:128)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
at scala.collection.mutable.ArrayOps.foreach(ArrayOps.scala:38)
at com.novus.salat.util.ConstructorInputPrettyPrinter$.apply(PrettyPrinters.scala:128)
at com.novus.salat.util.ToObjectGlitch.<init>(ToObjectGlitch.scala:44)
at com.novus.salat.ConcreteGrater.feedArgsToConstructor(Grater.scala:294)
at com.novus.salat.ConcreteGrater.asObject(Grater.scala:263)
at com.novus.salat.ConcreteGrater.asObject(Grater.scala:105)
at com.novus.salat.dao.SalatMongoCursorBase$class.next(SalatMongoCursor.scala:47)
at com.novus.salat.dao.SalatMongoCursor.next(SalatMongoCursor.scala:149)
at scala.collection.Iterator$class.foreach(Iterator.scala:652)
at com.novus.salat.dao.SalatMongoCursor.foreach(SalatMongoCursor.scala:149)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:128)
at scala.collection.TraversableOnce$class.toList(TraversableOnce.scala:242)
at com.novus.salat.dao.SalatMongoCursor.toList(SalatMongoCursor.scala:149)
This issue was due to corrupt data in the db. After clearing that, it worked.

Mysterious X509HostnameVerifier dependency

While extending a previously working project, I seem to have muffed a Maven dependency.
JUnit snippet:
Client interimClient = ClientBuilder.newClient();
WebTarget interim = interimClient.target(REST_TARGET_URL);
Result persistedResult = interim.request()
.post(Entity.entity(testResult, MediaType.APPLICATION_JSON), Result.class);
Assert.assertEquals("A result should be persisted ", "TEST", persistedResult.getId());
Error:
java.lang.NoClassDefFoundError: org/apache/http/conn/ssl/X509HostnameVerifier
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.newInstance(Class.java:412)
at javax.ws.rs.client.FactoryFinder.newInstance(FactoryFinder.java:116)
at javax.ws.rs.client.FactoryFinder.find(FactoryFinder.java:164)
at javax.ws.rs.client.ClientBuilder.newBuilder(ClientBuilder.java:86)
I tried adding the dependency
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.0-alpha4</version>
</dependency>
...but then got
java.lang.NoClassDefFoundError: org/apache/http/conn/scheme/SchemeSocketFactory
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
Given that this was working just a couple of hours earlier, following each successive dependency error feels like sliding down the rabbit hole. Hopefully this is a known jumping-off point that someone can help direct me on. Thanks.
We were getting a similar error message when running JUnit tests. Our dependency versions are coming from org.wildfly:wildfly-parent:10.0.0.Final.
The initial error was:
java.lang.RuntimeException: java.lang.ClassNotFoundException: org.glassfish.jersey.client.JerseyClientBuilder
Adding the following dependency resolved the initial error
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-client</artifactId>
<scope>provided</scope>
</dependency>
We then received this second error:
java.lang.NoClassDefFoundError: org/apache/http/conn/ssl/X509HostnameVerifier
Adding the following dependency resolved the second (X509HostnameVerifier) error
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<scope>provided</scope>
</dependency>
Once these two dependencies were added the problem was resolved. The resteasy-client resolves to version 3.0.14.Final and the httpclient resolves to version 4.5 for org.wildfly:wildfly-parent:10.0.0.Final.
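Combining the two additions described above into a single fragment (versions are deliberately omitted because they are managed by the org.wildfly:wildfly-parent:10.0.0.Final parent, as noted above):

```xml
<dependencies>
    <!-- Supplies a JAX-RS client implementation; resolves to 3.0.14.Final via the parent -->
    <dependency>
        <groupId>org.jboss.resteasy</groupId>
        <artifactId>resteasy-client</artifactId>
        <scope>provided</scope>
    </dependency>
    <!-- Supplies org.apache.http.conn.ssl.X509HostnameVerifier; resolves to 4.5 via the parent -->
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <scope>provided</scope>
    </dependency>
</dependencies>
```

The provided scope keeps both jars on the compile/test classpath without bundling them into the deployment, which is appropriate when the container already ships them.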
It appears that my smattering of JAX-RS related dependencies somehow mutated into causing this error. I was able to get back into good standing after whittling them down to just the following:
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>javax.ws.rs-api</artifactId>
</dependency>
<dependency>
<groupId>javax.ejb</groupId>
<artifactId>javax.ejb-api</artifactId>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>javax.annotation-api</artifactId>
</dependency>
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-client</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-jackson-provider</artifactId>
<scope>test</scope>
</dependency>

Spark MLLIB error: java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.treeAggregate

I am trying to run the linear regression example as a Spark job. I got the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.treeAggregate$default$4(Ljava/lang/Object;)I
at org.apache.spark.mllib.optimization.GradientDescent$$anonfun$runMiniBatchSGD$1.apply$mcVI$sp(GradientDescent.scala:189)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.mllib.optimization.GradientDescent$.runMiniBatchSGD(GradientDescent.scala:184)
at org.apache.spark.mllib.optimization.GradientDescent.optimize(GradientDescent.scala:107)
at org.apache.spark.mllib.regression.GeneralizedLinearAlgorithm.run(GeneralizedLinearAlgorithm.scala:267)
at org.apache.spark.mllib.regression.GeneralizedLinearAlgorithm.run(GeneralizedLinearAlgorithm.scala:190)
at com.myproject.sample.LinearRegression$.run(LinearRegressionExample.scala:105)
at com.myproject.sample.LinearRegression$$anonfun$main$1.apply(LinearRegressionExample.scala:67)
at com.myproject.sample.LinearRegression$$anonfun$main$1.apply(LinearRegressionExample.scala:66)
at scala.Option.map(Option.scala:145)
at com.myproject.sample.LinearRegression$.main(LinearRegressionExample.scala:66)
at com.myproject.sample.LinearRegression.main(LinearRegressionExample.scala)
Here are my dependencies in the pom:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.2.0</version>
<exclusions>
:
:
</dependency>
Why did I get java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.treeAggregate ? Did I use the wrong dependency? Thanks!
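For what it's worth, the pom above mixes spark-mllib 1.3.1 with spark-core 1.2.0, and Spark modules generally need to be on the same release; a sketch with both aligned (assuming 1.3.1 is the intended version):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <!-- Must match spark-mllib: the 1.2.0 core jar does not provide the
         RDD.treeAggregate overload that mllib 1.3.1 compiles against,
         which is exactly the NoSuchMethodError reported above. -->
    <version>1.3.1</version>
</dependency>
```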