Scala - spark-corenlp - java.lang.ClassNotFoundException

I want to run the spark-coreNLP example, but I get a java.lang.ClassNotFoundException error when running spark-submit.
Here is the Scala code from the GitHub example, which I put into an object, with a SparkContext defined.
analyzer.Sentiment.scala:
package analyzer
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql.functions._
import com.databricks.spark.corenlp.functions._
import sqlContext.implicits._
object Sentiment {
def main(args: Array[String]) {
val conf = new SparkConf().setAppName("Sentiment")
val sc = new SparkContext(conf)
val input = Seq(
(1, "<xml>Stanford University is located in California. It is a great university.</xml>")
).toDF("id", "text")
val output = input
.select(cleanxml('text).as('doc))
.select(explode(ssplit('doc)).as('sen))
.select('sen, tokenize('sen).as('words), ner('sen).as('nerTags), sentiment('sen).as('sentiment))
output.show(truncate = false)
}
}
I am using the build.sbt provided by spark-coreNLP - I only modified the scalaVersion and sparkVersion to my own.
version := "1.0"
scalaVersion := "2.11.8"
initialize := {
val _ = initialize.value
val required = VersionNumber("1.8")
val current = VersionNumber(sys.props("java.specification.version"))
assert(VersionNumber.Strict.isCompatible(current, required), s"Java $required required.")
}
sparkVersion := "1.5.2"
// change the value below to change the directory where your zip artifact will be created
spDistDirectory := target.value
sparkComponents += "mllib"
spName := "databricks/spark-corenlp"
licenses := Seq("GPL-3.0" -> url("http://opensource.org/licenses/GPL-3.0"))
resolvers += Resolver.mavenLocal
libraryDependencies ++= Seq(
"edu.stanford.nlp" % "stanford-corenlp" % "3.6.0",
"edu.stanford.nlp" % "stanford-corenlp" % "3.6.0" classifier "models",
"com.google.protobuf" % "protobuf-java" % "2.6.1"
)
Then I created my jar by running the following, without issues:
sbt package
Finally, I submit my job to Spark:
spark-submit --class "analyzer.Sentiment" --master local[4] target/scala-2.11/sentimentanalizer_2.11-0.1-SNAPSHOT.jar
But I get the following error:
java.lang.ClassNotFoundException: analyzer.Sentiment
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:641)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
My file Sentiment.scala is correctly located in a package named "analyzer".
$ find .
./src
./src/analyzer
./src/analyzer/Sentiment.scala
./src/com
./src/com/databricks
./src/com/databricks/spark
./src/com/databricks/spark/corenlp
./src/com/databricks/spark/corenlp/CoreNLP.scala
./src/com/databricks/spark/corenlp/functions.scala
./src/com/databricks/spark/corenlp/StanfordCoreNLPWrapper.scala
When I ran the SimpleApp example from the Spark Quick Start, I noticed that MySimpleProject/bin/ contained a SimpleApp.class. MySentimentProject/bin is empty. So I have tried to clean my project (I am using Eclipse for Scala).
I think it is because I need to generate Sentiment.class, but I don't know how to do it - it was done automatically with SimpleApp.scala, and when I try to run/build with Eclipse Scala, it crashes.

Maybe you should try adding
scalaSource in Compile := baseDirectory.value / "src"
to your build.sbt, because the sbt documentation states that "the directory that contains the main Scala sources is by default src/main/scala".
Or just arrange your source code in this structure:
$ find .
./src
./src/main
./src/main/scala
./src/main/scala/analyzer
./src/main/scala/analyzer/Sentiment.scala
./src/main/scala/com
./src/main/scala/com/databricks
./src/main/scala/com/databricks/spark
./src/main/scala/com/databricks/spark/corenlp
./src/main/scala/com/databricks/spark/corenlp/CoreNLP.scala
./src/main/scala/com/databricks/spark/corenlp/functions.scala
./src/main/scala/com/databricks/spark/corenlp/StanfordCoreNLPWrapper.scala

Related

NoClassDefFoundError: scala/Product$class in spark app

I am building a Spark application with a bash script, and I have only the spark-sql and spark-core dependencies in the build.sbt file. So every time I call some RDD methods or convert the data to a case class for dataset creation, I get this error:
Caused by: java.lang.NoClassDefFoundError: scala/Product$class
I suspect that it is a dependency error. So how should I change my dependencies to fix this?
dependencies list:
import sbt._
object Dependencies {
lazy val scalaCsv = "com.github.tototoshi" %% "scala-csv" % "1.3.5"
lazy val sparkSql = "org.apache.spark" %% "spark-sql" % "2.3.3"
lazy val sparkCore = "org.apache.spark" %% "spark-core" % "2.3.3"
}
build.sbt file:
import Dependencies._
lazy val root = (project in file(".")).
settings(
inThisBuild(List(
scalaVersion := "2.11.12",
version := "test"
)),
name := "project",
libraryDependencies ++= Seq(scalaCsv, sparkSql, sparkCore),
mainClass in (Compile, run) := Some("testproject.spark.Main")
)
I launch the Spark app with Spark 2.3.3 as my Spark home directory, like this:
#!/bin/sh
$SPARK_HOME/bin/spark-submit \
--class "testproject.spark.Main " \
--master local[*] \
target/scala-2.11/test.jar
I am not sure exactly what the problem was; however, I recreated the project and moved the source code there, and the error disappeared.
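For what it's worth, NoClassDefFoundError: scala/Product$class is usually a Scala binary-version mismatch: the Something$class encoding of trait bodies exists in bytecode built for Scala 2.11 but was removed in 2.12, so the error typically appears when an artifact built for one binary version runs against the other. Every published Scala artifact records its binary version in the _2.xx suffix of its file name, so one quick check is that all suffixes agree with your scalaVersion. A small sketch (artifact names invented for illustration):

```shell
# Extract the Scala binary-version suffix (_2.xx) from artifact file names.
for jar in spark-sql_2.11-2.3.3.jar my-app_2.12-test.jar; do
  suffix=$(echo "$jar" | sed -n 's/.*_\(2\.[0-9][0-9]*\)-.*/\1/p')
  echo "$jar -> built for Scala $suffix"
done
```

If one jar says 2.11 and another says 2.12, that mismatch alone can produce the Product$class error.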

SparkSubmit Exception (NoClassDefFoundError), even though SBT compiles and packages successfully

I want to use ShapeLogic Scala combined with Spark. I am using Scala 2.11.8, Spark 2.1.1 and ShapeLogic Scala 0.9.0.
I successfully imported the classes to manage images with Spark. I also successfully compiled and packaged (using SBT) the following application, in order to spark-submit it to a cluster.
The following application simply opens an image and writes it to a folder:
// imageTest.scala
import org.apache.spark.sql.SparkSession
import org.shapelogic.sc.io.LoadImage
import org.shapelogic.sc.image.BufferImage
import org.shapelogic.sc.io.BufferedImageConverter
object imageTestObj {
def main(args: Array[String]) {
// Create a Scala Spark Session
val spark = SparkSession.builder().appName("imageTest").master("local").getOrCreate();
val inPathStr = "/home/vitrion/IdeaProjects/imageTest";
val outPathStr = "/home/vitrion/IdeaProjects/imageTest/output";
val empty = new BufferImage[Byte](0, 0, 0, Array());
var a = Array.fill(3)(empty);
for (i <- 0 until 3) {
val imagePath = inPathStr + "IMG_" + "%01d".format(i + 1);
a(i) = LoadImage.loadBufferImage(imagePath);
}
val sc = spark.sparkContext;
val imgRDD = sc.parallelize(a);
imgRDD.map { outBufferImage =>
val imageOpt = BufferedImageConverter.bufferImage2AwtBufferedImage(outBufferImage)
imageOpt match {
case Some(bufferedImage) => {
LoadImage.saveAWTBufferedImage(bufferedImage, "png", outPathStr)
println("Saved " + outPathStr)
}
case None => {
println("Could not convert image")
}
}
}
}
}
This is my SBT file
name := "imageTest"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.1.1" % "provided",
"org.apache.spark" % "spark-sql_2.11" % "2.1.1" % "provided",
"org.shapelogicscala" %% "shapelogic" % "0.9.0" % "provided"
)
However, the following error appears. When the SBT package command is executed, it seems like the ShapeLogic Scala dependencies are not included in the application JAR:
[vitrion#mstr scala-2.11]$ pwd
/home/vitrion/IdeaProjects/imageTest/target/scala-2.11
[vitrion#mstr scala-2.11]$ ls
classes imagetest_2.11-0.1.jar resolution-cache
[vitrion#mstr scala-2.11]$ spark-submit --class imageTestObj imagetest_2.11-0.1.jar
Exception in thread "main" java.lang.NoClassDefFoundError: org/shapelogic/sc/image/BufferImage
at imageTestObj.main(imageTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.shapelogic.sc.image.BufferImage
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I hope someone can help me solve it. Thank you very much.
This error says everything:
Caused by: java.lang.ClassNotFoundException: org.shapelogic.sc.image.BufferImage
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
Add the missing dependency (the ShapeLogic class org.shapelogic.sc.image.BufferImage); that should resolve the issue. Both Maven and SBT will give the same error if you miss this dependency.
Since you are working in cluster mode, you can add the dependencies directly using --jars on spark-submit.
Dependencies listed in your sbt files are not included by default in the jar you submit to Spark, so for sbt you have to use a plugin to build an uber/fat jar that includes the shapelogicscala classes. See the SO question "How to build an Uber JAR (Fat JAR) using SBT within IntelliJ IDEA?" for how to manage this with sbt.
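As a concrete sketch of the sbt-assembly route (the plugin version below is an assumption; pick one matching your sbt version, and note that dependencies marked "provided" are still excluded from the fat jar by design):

```shell
# Register the sbt-assembly plugin for the build
# (use >> instead of > if project/plugins.sbt already exists).
mkdir -p project
echo 'addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")' > project/plugins.sbt
cat project/plugins.sbt

# Then, instead of `sbt package`, build and submit the fat jar; the
# assembly jar name below follows sbt-assembly's <name>-assembly-<version>
# default for this project's name/version:
#   sbt assembly
#   spark-submit --class imageTestObj target/scala-2.11/imageTest-assembly-0.1.jar
```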

Scala - Spark-corenlp - java.lang.NoClassDefFoundError

I want to run the spark-coreNLP example, but I get a java.lang.NoClassDefFoundError error when running spark-submit.
Here is the Scala code from the GitHub example, which I put into an object, with a SparkContext and SQLContext defined.
main.scala.Sentiment.scala
package main.scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql.functions._
import org.apache.spark.sql.SQLContext
import com.databricks.spark.corenlp.functions._
object SQLContextSingleton {
@transient private var instance: SQLContext = _
def getInstance(sparkContext: SparkContext): SQLContext = {
if (instance == null) {
instance = new SQLContext(sparkContext)
}
instance
}
}
object Sentiment {
def main(args: Array[String]) {
val conf = new SparkConf().setAppName("Sentiment")
val sc = new SparkContext(conf)
val sqlContext = SQLContextSingleton.getInstance(sc)
import sqlContext.implicits._
val input = Seq((1, "<xml>Stanford University is located in California. It is a great university.</xml>")).toDF("id", "text")
val output = input
.select(cleanxml('text).as('doc))
.select(explode(ssplit('doc)).as('sen))
.select('sen, tokenize('sen).as('words), ner('sen).as('nerTags), sentiment('sen).as('sentiment))
output.show(truncate = false)
}
}
And my build.sbt (modified from the spark-corenlp build.sbt):
version := "1.0"
scalaVersion := "2.10.6"
scalaSource in Compile := baseDirectory.value / "src"
initialize := {
val _ = initialize.value
val required = VersionNumber("1.8")
val current = VersionNumber(sys.props("java.specification.version"))
assert(VersionNumber.Strict.isCompatible(current, required), s"Java $required required.")
}
sparkVersion := "1.5.2"
// change the value below to change the directory where your zip artifact will be created
spDistDirectory := target.value
sparkComponents += "mllib"
// add any sparkPackageDependencies using sparkPackageDependencies.
// e.g. sparkPackageDependencies += "databricks/spark-avro:0.1"
spName := "databricks/spark-corenlp"
licenses := Seq("GPL-3.0" -> url("http://opensource.org/licenses/GPL-3.0"))
resolvers += Resolver.mavenLocal
libraryDependencies ++= Seq(
"edu.stanford.nlp" % "stanford-corenlp" % "3.6.0",
"edu.stanford.nlp" % "stanford-corenlp" % "3.6.0" classifier "models",
"com.google.protobuf" % "protobuf-java" % "2.6.1"
)
I run sbt package without issue, then run Spark with
spark-submit --class "main.scala.Sentiment" --master local[4] target/scala-2.10/sentimentanalizer_2.10-1.0.jar
The program fails after throwing an exception:
Exception in thread "main" java.lang.NoClassDefFoundError: edu/stanford/nlp/simple/Sentence
at main.scala.com.databricks.spark.corenlp.functions$$anonfun$cleanxml$1.apply(functions.scala:55)
at main.scala.com.databricks.spark.corenlp.functions$$anonfun$cleanxml$1.apply(functions.scala:54)
at org.apache.spark.sql.catalyst.expressions.ScalaUDF$$anonfun$2.apply(ScalaUDF.scala:75)
at org.apache.spark.sql.catalyst.expressions.ScalaUDF$$anonfun$2.apply(ScalaUDF.scala:74)
Things I tried:
I work with Eclipse for Scala, and I made sure to add all the jars from stanford-corenlp, as suggested:
./stanford-corenlp/ejml-0.23.jar
./stanford-corenlp/javax.json-api-1.0-sources.jar
./stanford-corenlp/javax.json.jar
./stanford-corenlp/joda-time-2.9-sources.jar
./stanford-corenlp/joda-time.jar
./stanford-corenlp/jollyday-0.4.7-sources.jar
./stanford-corenlp/jollyday.jar
./stanford-corenlp/protobuf.jar
./stanford-corenlp/slf4j-api.jar
./stanford-corenlp/slf4j-simple.jar
./stanford-corenlp/stanford-corenlp-3.6.0-javadoc.jar
./stanford-corenlp/stanford-corenlp-3.6.0-models.jar
./stanford-corenlp/stanford-corenlp-3.6.0-sources.jar
./stanford-corenlp/stanford-corenlp-3.6.0.jar
./stanford-corenlp/xom-1.2.10-src.jar
./stanford-corenlp/xom.jar
I suspect that I need to add something to my command line when submitting the job to Spark. Any thoughts?
I was on the right track: my command line was missing something.
spark-submit needs to have all the stanford-corenlp jars added:
spark-submit \
--jars $(echo stanford-corenlp/*.jar | tr ' ' ',') \
--class "main.scala.Sentiment" \
--master local[4] target/scala-2.10/sentimentanalizer_2.10-1.0.jar

ZeroMQ word count app gives error when you compile in spark 1.2.1

I'm trying to set up a ZeroMQ data stream to Spark. Basically I took the ZeroMQWordCount.scala app and tried to recompile it and run it.
I have ZeroMQ 2.1 installed, and Spark 1.2.1.
Here is my Scala code:
package org.apache.spark.examples.streaming
import akka.actor.ActorSystem
import akka.actor.actorRef2Scala
import akka.zeromq._
import akka.zeromq.Subscribe
import akka.util.ByteString
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.streaming.zeromq._
import scala.language.implicitConversions
import org.apache.spark.SparkConf
object ZmqBenchmark {
def main(args: Array[String]) {
if (args.length < 2) {
System.err.println("Usage: ZmqBenchmark <zeroMQurl> <topic>")
System.exit(1)
}
//StreamingExamples.setStreamingLogLevels()
val Seq(url, topic) = args.toSeq
val sparkConf = new SparkConf().setAppName("ZmqBenchmark")
// Create the context and set the batch size
val ssc = new StreamingContext(sparkConf, Seconds(2))
def bytesToStringIterator(x: Seq[ByteString]) = (x.map(_.utf8String)).iterator
// For this stream, a zeroMQ publisher should be running.
val lines = ZeroMQUtils.createStream(ssc, url, Subscribe(topic), bytesToStringIterator _)
val words = lines.flatMap(_.split(" "))
val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
wordCounts.print()
ssc.start()
ssc.awaitTermination()
}
}
and this is my .sbt file for dependencies:
name := "ZmqBenchmark"
version := "1.0"
scalaVersion := "2.10.4"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "Sonatype (releases)" at "https://oss.sonatype.org/content/repositories/releases/"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.1"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.2.1"
libraryDependencies += "org.apache.spark" % "spark-streaming-zeromq_2.10" % "1.2.1"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.2.0"
libraryDependencies += "org.zeromq" %% "zeromq-scala-binding" % "0.0.6"
libraryDependencies += "com.typesafe.akka" % "akka-zeromq_2.10.0-RC5" % "2.1.0-RC6"
libraryDependencies += "org.apache.spark" % "spark-examples_2.10" % "1.1.1"
libraryDependencies += "org.spark-project.zeromq" % "zeromq-scala-binding_2.11" % "0.0.7-spark"
The application compiles without any errors using sbt package. However, when I run the application with spark-submit, I get an error:
zaid#zaid-VirtualBox:~/spark-1.2.1$ ./bin/spark-submit --master local[*] ./zeromqsub/example/target/scala-2.10/zmqbenchmark_2.10-1.0.jar tcp://127.0.0.1:5553 hello
15/03/06 10:21:11 WARN Utils: Your hostname, zaid-VirtualBox resolves to a loopback address: 127.0.1.1; using 192.168.220.175 instead (on interface eth0)
15/03/06 10:21:11 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/zeromq/ZeroMQUtils$
at ZmqBenchmark$.main(ZmqBenchmark.scala:78)
at ZmqBenchmark.main(ZmqBenchmark.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.zeromq.ZeroMQUtils$
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 9 more
Any ideas why this happens? I know the app should work, because when I run the same example using the run-example script and point it at the ZeroMQWordCount app from Spark, it runs without the exception. My guess is that the sbt file is incorrect; what else do I need to have in the sbt file?
Thanks
You are using ZeroMQUtils.createStream but the line
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.zeromq.ZeroMQUtils
shows that the bytecode for ZeroMQUtils was not located. When the Spark examples are run, they are run against a jar file (like spark-1.2.1/examples/target/scala-2.10/spark-examples-1.2.1-hadoop1.0.4.jar) that includes the ZeroMQUtils class. A solution would be to use the --jars flag so the spark-submit command can find the bytecode. In your case, this could be something like
spark-submit --jars /opt/spark/spark-1.2.1/examples/target/scala-2.10/spark-examples-1.2.1-hadoop1.0.4.jar \
--master local[*] ./zeromqsub/example/target/scala-2.10/zmqbenchmark_2.10-1.0.jar tcp://127.0.0.1:5553 hello
assuming that you have installed spark-1.2.1 in /opt/spark.

Spark Scala Error: Exception in thread "main" java.lang.ClassNotFoundException

I tried to run a Spark job written in Scala on a YARN cluster, and ran into this error:
[!##$% spark-1.0.0-bin-hadoop2]$ export HADOOP_CONF_DIR="/etc/hadoop/conf"
[!##$% spark-1.0.0-bin-hadoop2]$ ./bin/spark-submit --class "SimpleAPP" \
> --master yarn-client \
> test_proj/target/scala-2.10/simple-project_2.10-0.1.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" java.lang.ClassNotFoundException: SimpleAPP
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:289)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
And this is my sbt file:
[!##$% test_proj]$ cat simple.sbt
name := "Simple Project"
version := "0.1"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
// We need to be able to write Avro in Parquet
// libraryDependencies += "com.twitter" % "parquet-avro" % "1.3.2"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
This is my SimpleApp.scala program; it is the canonical one:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
object SimpleApp{
def main(args: Array[String]) {
val logFile = "/home/myname/spark-1.0.0-bin-hadoop2/README.md" // Should be some file on your system
val conf = new SparkConf().setAppName("Simple Application")
val sc = new SparkContext(conf)
val logData = sc.textFile(logFile, 2).cache()
val numAs = logData.filter(line => line.contains("a")).count()
val numBs = logData.filter(line => line.contains("b")).count()
println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
}
}
The sbt package run is as follows:
[!##$% test_proj]$ sbt package
[info] Set current project to Simple Project (in build file:/home/myname/spark-1.0.0-bin-hadoop2/test_proj/)
[info] Compiling 1 Scala source to /home/myname/spark-1.0.0-bin-hadoop2/test_proj/target/scala-2.10/classes...
[info] Packaging /home/myname/spark-1.0.0-bin-hadoop2/test_proj/target/scala-2.10/simple-project_2.10-0.1.jar ...
[info] Done packaging.
[success] Total time: 12 s, completed Mar 3, 2015 10:57:12 PM
As suggested, I did the following:
jar tf simple-project_2.10-0.1.jar | grep .class
Something like the following shows up:
SimpleApp$$anonfun$1.class
SimpleApp$.class
SimpleApp$$anonfun$2.class
SimpleApp.class
Verify whether the name really is SimpleAPP in the jar.
Do this:
jar tf simple-project_2.10-0.1.jar | grep .class
and check whether the name of the class is right.
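The likely culprit here is case: the listing shows SimpleApp.class (and its companions), while the job was submitted with --class "SimpleAPP", and class lookup is case-sensitive. A tiny self-contained illustration of the check, using the entries shown above:

```shell
# The entries `jar tf` printed for this project.
printf '%s\n' 'SimpleApp$$anonfun$1.class' 'SimpleApp$.class' 'SimpleApp.class' > entries.txt

# The name passed to --class must match an entry exactly, case included.
grep -x 'SimpleAPP.class' entries.txt || echo 'no SimpleAPP entry -> submit with --class "SimpleApp"'
grep -x 'SimpleApp.class' entries.txt && echo 'SimpleApp found'
```

With the corrected name, the submit command from the question becomes: ./bin/spark-submit --class "SimpleApp" --master yarn-client test_proj/target/scala-2.10/simple-project_2.10-0.1.jar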