NoClassDefFoundError when compiling Scala with Java sources

I'm trying to compile a project with Scala and Java sources from the command line, using the following two commands:
scalac -d . *.scala *.java
javac -d . -cp /usr/local/Cellar/scala/2.11.6/libexec/lib/scala-library.jar:. *.java
The compile goes fine; however, when I try to run the code I get the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Seq
at Agent.main(Agent.java:62)
Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
For what it's worth, compiling and running through IntelliJ works fine. Any ideas on what's happening?
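The exception points at the runtime classpath rather than at compilation: scala-library.jar was on the compile classpath but not on the run classpath. A likely fix, assuming the main class is Agent as in the stack trace, is to include the Scala library when running:
java -cp /usr/local/Cellar/scala/2.11.6/libexec/lib/scala-library.jar:. Agent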

Related

Solving Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream

I'm using JetBrains IntelliJ IDEA with the Scala plugin and I'm trying to execute some code that uses Apache Spark. However, whenever I try to run it, the code fails with the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
at KMeans$.main(kmeans.scala:71)
at KMeans.main(kmeans.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Running spark-shell from the terminal doesn't give me any problems; the warning "unable to load native-hadoop library for your platform" doesn't appear either.
I've read some questions similar to mine, but in those cases they had problems with spark-shell or with cluster configuration.
I was using spark-core_2.12-2.4.3.jar without the dependencies. I solved the issue by adding spark-core library through Maven, which automatically added all the dependencies.
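For reference, a Maven snippet matching the jar named above (spark-core for Scala 2.12, version 2.4.3) that pulls in the transitive dependencies automatically:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>2.4.3</version>
</dependency>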

NoSuchMethodError from dependencies when using spark-submit

I'm trying to submit a JAR to my Apache Spark 2.2.1 cluster using Scala 2.11. I included some extra dependencies in my JAR, namely Apache Commons CLI, and packaged it all into a fat JAR. However, I'm getting a NoSuchMethodError when I submit my Spark application. I'm quite sure that it's not due to inconsistency in Scala versions, but rather something weird with dependencies.
The command is simply spark-submit myjar.jar [arguments]
This is the error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.Options.addRequiredOption(Ljava/lang/String;Ljava/lang/String;ZLjava/lang/String;)Lorg/apache/commons/cli/Options;
at xyz.plenglin.aurum.spark.RunOnSpark$.main(RunOnSpark.scala:46)
at xyz.plenglin.aurum.spark.RunOnSpark.main(RunOnSpark.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Running java -jar myjar.jar [arguments] works without any issues. Peeking inside the JAR, I see org.apache.commons.cli.Options where it should be.
It looks like I fixed it by adding the --driver-class-path argument to my spark-submit command. Presumably Spark's driver classpath already carries an older commons-cli that predates Options.addRequiredOption, and --driver-class-path puts the fat JAR's copy ahead of it.
So the command looks like:
$ spark-submit --driver-class-path myjar.jar myjar.jar [arguments]
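An alternative sketch, if you'd rather not prepend the whole fat JAR: Spark has an experimental spark.driver.userClassPathFirst setting that makes the driver prefer classes from the user JAR over Spark's bundled copies:
$ spark-submit --conf spark.driver.userClassPathFirst=true myjar.jar [arguments]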

How to configure Titan over HBase in Java (Eclipse)?

I'm using the following code, as given in this tutorial:
http://s3.thinkaurelius.com/docs/titan/0.5.0/hbase.html
TitanGraph graph = TitanFactory.build()
    .set("storage.backend", "hbase")
    .open();
and I used the Maven dependency:
<dependency>
  <groupId>com.thinkaurelius.titan</groupId>
  <artifactId>titan-hbase</artifactId>
  <version>${titan.version}</version>
</dependency>
The following error is shown:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MasterNotRunningException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:42)
at com.thinkaurelius.titan.diskstorage.Backend.getImplementationClass(Backend.java:479)
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:413)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1320)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:94)
at com.thinkaurelius.titan.core.TitanFactory$Builder.open(TitanFactory.java:135)
at pluradj.titan.tinkerpop3.example.JavaExample2.main(JavaExample2.java:26)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.MasterNotRunningException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 9 more
If possible, could you explain the same for Cassandra as well?
Look here at how to set your runtime classpath in Eclipse: How do I set the runtime classpath in Eclipse 4.2?
The NoClassDefFoundError indicates that your runtime classpath is missing jars. For this specific error, locate the lib directory of your HBase install and add it to your classpath.
If you use Cassandra, you'll need to set your classpath appropriately for Cassandra.
Titan's HBase client config accepts arbitrary keys from hbase-site.xml (if it's on the CLASSPATH), and I recommend putting that on your path as well.
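As for the Cassandra part of the question, a minimal sketch, assuming the titan-cassandra module is on the classpath and a Cassandra instance is listening locally:
// Same builder API as the HBase example above; only the backend config changes.
TitanGraph graph = TitanFactory.build()
    .set("storage.backend", "cassandra")
    .set("storage.hostname", "127.0.0.1")
    .open();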

PlayFramework 2.3.1 NoClassDefFoundError: org/fusesource/jansi/AnsiOutputStream

I'm studying Play Framework 2.x and I have installed the framework, with Activator 1.2.3, on my Fedora FC20. Now I'm facing a strange error when launching activator new, which never happened before. I've tried with java-1.7.0-openjdk-1.7.0.65-2.5.1.2.fc20.i386 and also with OpenJDK 1.8.0, but the error is still there.
java.lang.NoClassDefFoundError: org/fusesource/jansi/AnsiOutputStream
at jline.console.ConsoleReader.stripAnsi(ConsoleReader.java:479)
at jline.console.ConsoleReader.setPrompt(ConsoleReader.java:398)
at jline.console.ConsoleReader.readLine(ConsoleReader.java:2172)
at jline.console.ConsoleReader.readLine(ConsoleReader.java:2126)
at sbt.JLine.sbt$JLine$$readLineDirectRaw(LineReader.scala:45)
at sbt.JLine$$anonfun$readLineDirect$2.apply(LineReader.scala:37)
at sbt.JLine$$anonfun$readLineDirect$2.apply(LineReader.scala:37)
at sbt.Signals0.withHandler(Signal.scala:87)
at sbt.Signals$.withHandler(Signal.scala:13)
at sbt.JLine.readLineDirect(LineReader.scala:37)
at sbt.JLine.readLineWithHistory(LineReader.scala:32)
at sbt.JLine.sbt$JLine$$unsynchronizedReadLine(LineReader.scala:20)
at sbt.JLine$$anonfun$readLine$1.apply(LineReader.scala:17)
at sbt.JLine$$anonfun$readLine$1.apply(LineReader.scala:17)
at sbt.JLine$$anonfun$withJLine$1.apply(LineReader.scala:118)
at sbt.JLine$$anonfun$withJLine$1.apply(LineReader.scala:116)
at sbt.JLine$.withTerminal(LineReader.scala:92)
at sbt.JLine$.withJLine(LineReader.scala:116)
at sbt.JLine.readLine(LineReader.scala:17)
at activator.ActivatorCliHelper$class.readLine(ActivatorCliHelper.scala:19)
at activator.TemplateHandler$.readLine(TemplateHandler.scala:16)
at activator.TemplateHandler$.getTemplateName(TemplateHandler.scala:81)
at activator.ActivatorCli$$anonfun$apply$1.getTemplateName$1(ActivatorCli.scala:55)
at activator.ActivatorCli$$anonfun$apply$1.apply$mcI$sp(ActivatorCli.scala:89)
at activator.ActivatorCli$$anonfun$apply$1.apply(ActivatorCli.scala:19)
at activator.ActivatorCli$$anonfun$apply$1.apply(ActivatorCli.scala:19)
at activator.ActivatorCli$.withContextClassloader(ActivatorCli.scala:179)
at activator.ActivatorCli$.apply(ActivatorCli.scala:19)
at activator.ActivatorLauncher.run(ActivatorLauncher.scala:28)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:129)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:36)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:19)
at xsbt.boot.Boot$.runImpl(Boot.scala:44)
at xsbt.boot.Boot$.main(Boot.scala:20)
at xsbt.boot.Boot.main(Boot.scala)
Caused by: java.lang.ClassNotFoundException: org.fusesource.jansi.AnsiOutputStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 38 more
Error during sbt execution: java.lang.NoClassDefFoundError: org/fusesource/jansi/AnsiOutputStream
You can try to delete and recreate your local repo (not quite sure which one applies to you):
~/.m2/repository
~/.ivy2/cache
~/.ivy/cache
Also, there was a play clean command before they switched to the Activator; there should be something like activator clean now. After this you can try activator compile.
Edit: as @sentenza pointed out, removing ~/.sbt was the step that solved the problem. I'll leave the other options above, as they might work for somebody else.
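In shell terms, the step that worked here was simply:
$ rm -rf ~/.sbt
Note this is destructive: it also removes any global sbt settings and plugins you keep there, though sbt recreates its boot cache on the next launch.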

Spark ClassNotFoundException running the master

I have downloaded and built Spark 0.8.0 using sbt/sbt assembly. The build was successful. However, when running ./bin/start-master.sh, the following error appears in the log file:
Spark Command: /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp :/shared/spark-0.8.0-incubating-bin-hadoop1/conf:/shared/spark-0.8.0-incubating-bin-hadoop1/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.4.jar
/shared/spark-0.8.0-incubating-bin-hadoop1/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip mellyrn.local --port 7077 --webui-port 8080
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/deploy/master/Master
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.master.Master
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Update: after doing sbt clean (per the suggestion below) it is now running.
There can be a number of things which cause this error which are not specific to Spark:
Bad build, sbt clean compile that puppy again.
You have a cached dependency in your .ivy2 cache which conflicts with a dependency of that version of Spark. Empty your cache and try again.
Your project builds on Spark with a library version that conflicts with one of Spark's own dependencies. That is, Spark may depend on "foo-0.9.7" while your project pulled in "foo-0.8.4" (see the sketch after this list).
Try looking at those first.
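For the third case, sbt can pin a single version of the conflicting library. A sketch for build.sbt, using the hypothetical "foo" artifact from the list above (the group id com.example is equally hypothetical):
// Force the version Spark expects, overriding whatever the project pulled in transitively.
dependencyOverrides += "com.example" % "foo" % "0.9.7"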