Getting java.lang.NoClassDefFoundError: scala/xml/MetaData error in Scala program

I am trying to save data to an Avro file using Spark in Scala, but I am getting the exception below:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/xml/MetaData
at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:33)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
at tavant.user.UserData$.main(UserData.scala:18)
at tavant.user.UserData.main(UserData.scala)
Caused by: java.lang.ClassNotFoundException: scala.xml.MetaData
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 7 more
Below are my pom.xml dependencies:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.7</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>
I have also tried the solution suggested elsewhere of adding a scala-xml dependency to my pom.xml file. Below is the dependency I tried:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-xml</artifactId>
    <version>2.11.0-M4</version>
</dependency>
But then I get the exception below:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
at tavant.user.UserData$.main(UserData.scala:18)
at tavant.user.UserData.main(UserData.scala)
Any idea? Thanks in advance.
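For what it's worth, both stack traces point at a Scala binary-version mismatch: the pom mixes scala-library 2.11.7 with Spark artifacts built for Scala 2.10 (the _2.10 suffix), and in Scala 2.11 the scala.xml package was moved out of the standard library into a separate module, which is why the Spark UI code cannot find scala.xml.MetaData. A minimal sketch of a consistent dependency set, assuming Spark 2.2.0 on Scala 2.11 is acceptable for your cluster (Spark normally pulls scala-xml_2.11 in transitively, so the explicit entry is just a safeguard):

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.7</version>
</dependency>
<!-- scala.xml lives in a separate module on Scala 2.11+ -->
<dependency>
    <groupId>org.scala-lang.modules</groupId>
    <artifactId>scala-xml_2.11</artifactId>
    <version>1.0.6</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
</dependency>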

Related

java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder - Scala Spark

I am trying to execute a WordCount example with Scala and Spark, but I am having the following problem:
java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder
at org.apache.hadoop.fs.FsTracer.get(FsTracer.java:42)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3232)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3286)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3254)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:478)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
at es.santander.prueba.pruebas.WordCount.count(WordCount.scala:123)
at es.santander.prueba.pruebas.WordCount.main(WordCount.scala:50)
at es.santander.prueba.pruebas.Main$.main(WordCount.scala:27)
at es.santander.prueba.pruebas.Main.main(WordCount.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.core.Tracer$Builder
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:582)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 11 more
I have been searching for a fix for this problem, and I added the following dependencies to my pom.xml:
<!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>2.4.5</version>
    <type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.htrace/htrace-core -->
<dependency>
    <groupId>org.htrace</groupId>
    <artifactId>htrace-core</artifactId>
    <version>3.0.4</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.htrace/htrace-hbase -->
<dependency>
    <groupId>org.apache.htrace</groupId>
    <artifactId>htrace-hbase</artifactId>
    <version>4.1.0-incubating</version>
</dependency>
But the problem persists. Am I doing something wrong, or do I need something more?
EDIT: This is where I am having the error:
FileSystem.get(spark.sparkContext.hadoopConfiguration).delete(new Path(outputFile), true)
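The missing class org.apache.htrace.core.Tracer$Builder lives in the htrace-core4 artifact (whose classes are under the org.apache.htrace.core package), not in the older org.htrace:htrace-core, whose classes are under org.htrace, so the dependencies above never supply it. A sketch of the dependency that matches the class name in the trace; the version here is an assumption and should follow whatever your Hadoop distribution ships:

<!-- provides org.apache.htrace.core.Tracer$Builder -->
<dependency>
    <groupId>org.apache.htrace</groupId>
    <artifactId>htrace-core4</artifactId>
    <version>4.1.0-incubating</version>
</dependency>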

ClassNotFoundException: xsbt/WeakLog scala intellij

I'm having trouble running Scala code in IntelliJ on Windows 7.
I have installed IntelliJ with Scala 2.12, and I am creating a simple Maven application with a hello-world example as below:
object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("hello")
  }
}
While running this in IntelliJ I get the error below. It would be great if someone could help me with this.
Error:scalac: Error: xsbt/WeakLog
java.lang.NoClassDefFoundError: xsbt/WeakLog
at xsbt.CompilerInterface.newCompiler(CompilerInterface.scala:19)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Caused by: java.lang.ClassNotFoundException: xsbt.WeakLog
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 20 more
Below is the pom:
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.12.2</version>
    </dependency>
    <dependency>
        <groupId>com.sksamuel.elastic4s</groupId>
        <artifactId>elastic4s-core_2.12</artifactId>
        <version>5.3.2</version>
    </dependency>
    <dependency>
        <groupId>com.sksamuel.elastic4s</groupId>
        <artifactId>elastic4s-tcp_2.12</artifactId>
        <version>5.3.2</version>
    </dependency>
    <dependency>
        <groupId>com.sksamuel.elastic4s</groupId>
        <artifactId>elastic4s-http_2.12</artifactId>
        <version>5.3.2</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.5</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.5</version>
    </dependency>
</dependencies>
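The xsbt/WeakLog class belongs to the sbt compiler interface that IntelliJ's Scala compile server loads, so this error usually indicates a mismatch between the IDE's Scala plugin and the project's Scala compiler rather than anything in the dependencies above; updating the Scala plugin and reimporting the project is the usual first step. Independently of the IDE, declaring the compiler in the pom lets Maven build the Scala sources itself; a sketch using the scala-maven-plugin (the plugin version here is an assumption):

<build>
    <plugins>
        <!-- compile Scala sources as part of the Maven build -->
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>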

Getting exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;) while using DataFrames

I am getting the "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)" error while using DataFrames in a Scala app run on Spark. However, if I work only with RDDs and not DataFrames, no such error comes up with the same pom and settings. Other posts with the same error say the Scala version has to be 2.10 because Spark is not compatible with Scala 2.11, and I am using Scala 2.10 with Spark 2.0.0.
Below is a snippet from the pom:
<properties>
    <spark-assembly>/usr/lib/spark/lib/spark-assembly.jar</spark-assembly>
    <encoding>UTF-8</encoding>
    <hadoop.version>2.7.1</hadoop.version>
    <hbase.version>1.1.1</hbase.version>
    <scala.version>2.10.5</scala.version>
    <scala.tools.version>2.10</scala.tools.version>
    <spark.version>2.0.0</spark.version>
    <phoenix.version>4.7.0-HBase-1.1</phoenix.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-server</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
Error:
16/10/19 02:57:26 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
at com.abc.xyz.Compare$.main(Compare.scala:64)
at com.abc.xyz.Compare.main(Compare.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
16/10/19 02:57:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;)
16/10/19 02:57:26 INFO spark.SparkContext: Invoking stop() from shutdown hook
Change the Scala version:
<scala.version>2.11.8</scala.version>
<scala.tools.version>2.11</scala.tools.version>
and add:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>${scala.version}</version>
</dependency>
Spark 2.0.0 is built against Scala 2.11 by default, so code compiled against Scala 2.10's scala-reflect hits exactly this binary incompatibility at runtime.
I also faced this error, and it is purely a version issue.
Your Scala version is not compatible, or maybe you are using the correct version but the IntelliJ libraries have the old version.
Quick fix:
I was using Spark 2.2.0 and Scala 2.10.4, which I then changed to Scala 2.11.8. After that, do the following:
1) Right-click on the IntelliJ module.
2) Open Module Settings.
3) Go to Libraries and clear all of them.
4) Rebuild.
After doing the above, the issue was resolved for me.

Spark MLLIB error: java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.treeAggregate

I am trying to run the linear regression example in a Spark job. I got the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.treeAggregate$default$4(Ljava/lang/Object;)I
at org.apache.spark.mllib.optimization.GradientDescent$$anonfun$runMiniBatchSGD$1.apply$mcVI$sp(GradientDescent.scala:189)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.mllib.optimization.GradientDescent$.runMiniBatchSGD(GradientDescent.scala:184)
at org.apache.spark.mllib.optimization.GradientDescent.optimize(GradientDescent.scala:107)
at org.apache.spark.mllib.regression.GeneralizedLinearAlgorithm.run(GeneralizedLinearAlgorithm.scala:267)
at org.apache.spark.mllib.regression.GeneralizedLinearAlgorithm.run(GeneralizedLinearAlgorithm.scala:190)
at com.myproject.sample.LinearRegression$.run(LinearRegressionExample.scala:105)
at com.myproject.sample.LinearRegression$$anonfun$main$1.apply(LinearRegressionExample.scala:67)
at com.myproject.sample.LinearRegression$$anonfun$main$1.apply(LinearRegressionExample.scala:66)
at scala.Option.map(Option.scala:145)
at com.myproject.sample.LinearRegression$.main(LinearRegressionExample.scala:66)
at com.myproject.sample.LinearRegression.main(LinearRegressionExample.scala)
Here are my dependencies in the pom:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.2.0</version>
    <exclusions>
    :
    :
</dependency>
Why did I get java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.treeAggregate? Did I use the wrong dependency? Thanks!
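The version mix is the likely culprit: spark-mllib 1.3.1 was compiled against a treeAggregate signature that the spark-core 1.2.0 jar on the classpath does not provide, hence the NoSuchMethodError at runtime. A sketch with both artifacts aligned on the same version:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.1</version>
</dependency>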

java.lang.NoClassDefFoundError: grizzled/slf4j/Logger

I am doing a project in Scala and getting the following error when I run the built jar. It works fine in IntelliJ IDEA.
This is a Maven project, and I have included the following dependencies for logging.
First I tried scala-logging and ended up with the same sort of error; then I tried grizzled, and the same error occurred.
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>io.spray</groupId>
        <artifactId>spray-io</artifactId>
        <version>${spray.version}</version>
    </dependency>
    <dependency>
        <groupId>io.spray</groupId>
        <artifactId>spray-routing</artifactId>
        <version>${spray.version}</version>
    </dependency>
    <dependency>
        <groupId>io.spray</groupId>
        <artifactId>spray-httpx</artifactId>
        <version>${spray.version}</version>
    </dependency>
    <dependency>
        <groupId>io.spray</groupId>
        <artifactId>spray-json_${scala.base}</artifactId>
        <version>${spray.version}</version>
    </dependency>
    <dependency>
        <groupId>io.spray</groupId>
        <artifactId>spray-client</artifactId>
        <version>${spray.version}</version>
    </dependency>
    <dependency>
        <groupId>org.json4s</groupId>
        <artifactId>json4s-native_${scala.base}</artifactId>
        <version>3.2.11</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-actor_${scala.base}</artifactId>
        <version>${akka.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <version>4.1.2</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.slick</groupId>
        <artifactId>slick_${scala.base}</artifactId>
        <version>${slick.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-core</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>c3p0</groupId>
        <artifactId>c3p0</artifactId>
        <version>0.9.1.2</version>
    </dependency>
    <dependency>
        <groupId>postgresql</groupId>
        <artifactId>postgresql</artifactId>
        <version>9.1-901.jdbc4</version>
    </dependency>
    <dependency>
        <groupId>org.clapper</groupId>
        <artifactId>grizzled-slf4j_${scala.base}</artifactId>
        <version>1.0.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.camel</groupId>
        <artifactId>camel-core</artifactId>
        <version>2.9.4</version>
    </dependency>
</dependencies>
The error that occurred is as follows:
Exception in thread "main" java.lang.NoClassDefFoundError: grizzled/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Unknown Source)
at java.lang.Class.privateGetMethodRecursive(Unknown Source)
at java.lang.Class.getMethod0(Unknown Source)
at java.lang.Class.getMethod(Unknown Source)
at sun.launcher.LauncherHelper.validateMainClass(Unknown Source)
at sun.launcher.LauncherHelper.checkAndLoadMain(Unknown Source)
Caused by: java.lang.ClassNotFoundException: grizzled.slf4j.Logger
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
Please help in sorting out the issue.
Thanks,
Rahul
Errors like this can occur when the resulting dependency tree contains different versions of the same artifact.
In your case:
<groupId>org.clapper</groupId>
<artifactId>grizzled-slf4j_2.10</artifactId>
<version>1.0.2</version>
depends on
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.7</version>
but you, in addition, use 1.7.10.
So try changing the slf4j-api version from 1.7.10 to 1.7.7.
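A quick way to confirm which versions actually end up on the classpath is mvn dependency:tree -Dincludes=org.slf4j:slf4j-api. Pinning the version in dependencyManagement then forces a single slf4j-api across all transitive paths; a sketch of the pin, following the 1.7.7 suggestion above:

<dependencyManagement>
    <dependencies>
        <!-- force one slf4j-api version for direct and transitive uses -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.7</version>
        </dependency>
    </dependencies>
</dependencyManagement>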