ScalaTest : Exception during running test - java.lang.IncompatibleClassChangeError - scala

I am trying to run a test suite written in Scala from the command line and I get the following exception.
Scala version: 2.10.2
ScalaTest version: 2.0
Following is the dependency declared in build.sbt:
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.0" % "test"
Following is the stack trace of the exception:
java.lang.IncompatibleClassChangeError: Found class scala.collection.mutable.ArrayOps, but interface was expected
at org.scalatest.tools.Runner$.main(Runner.scala:828)
at org.scalatest.tools.Runner.main(Runner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:56)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
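This kind of IncompatibleClassChangeError usually points to mismatched Scala binary versions on the runtime classpath, for example a ScalaTest jar built for one Scala major version being loaded by a different one. As a minimal sketch (the version numbers below are illustrative, not prescriptive), keeping the two aligned in build.sbt looks like this:

scalaVersion := "2.10.2"

// %% appends the Scala binary suffix (_2.10) automatically,
// so the ScalaTest artifact stays in sync with scalaVersion
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0" % "test"

When launching the suite directly via scala and org.scalatest.tools.Runner, the scalatest jar on the classpath likewise has to be the one built for the Scala version doing the running.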

Related

Scallop verify() method with Scala version `2.10.5` not working

I am using scalaVersion := "2.10.5" and libraryDependencies += "org.rogach" %% "scallop" % "3.1.2".
Getting the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
at org.rogach.scallop.DefaultConverters$$anon$2.parse(DefaultConverters.scala:27)
at org.rogach.scallop.ValueConverter$class.parseCached(ValueConverter.scala:21)
at org.rogach.scallop.DefaultConverters$$anon$2.parseCached(DefaultConverters.scala:24)
at org.rogach.scallop.Scallop$$anonfun$verify$17.apply(Scallop.scala:632)
at org.rogach.scallop.Scallop$$anonfun$verify$17.apply(Scallop.scala:630)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.rogach.scallop.Scallop.verify(Scallop.scala:630)
at org.rogach.scallop.ScallopConfBase.verifyBuilder(ScallopConfBase.scala:405)
at org.rogach.scallop.ScallopConfBase.verify(ScallopConfBase.scala:744)
at com.unity3d.ads.conf.OperativeEventConverterConf.<init>(OperativeEventConverterConf.scala:50)
at com.unity3d.ads.analytics.TestClass$.main(TestClass.scala:51)
at com.unity3d.ads.analytics.TestClass.main(TestClass.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The exact same code works fine with scalaVersion := "2.11.8".
Unfortunately, I have to use 2.10.5 because I am using Spark version 1.6.
Sample code:
import org.rogach.scallop.{ScallopConf, ScallopOption, Serialization, ValueConverter, singleArgConverter}

class TestClass(args: Seq[String]) extends ScallopConf(args) with Serialization {
  val testInput: ScallopOption[String] =
    opt[String](
      name = "test.input",
      descr = "test",
      required = false,
      default = Option("testPath"))
  verify()
}
Is there any workaround I can use here to make it work with Scala 2.10.5?
Answering my own question in case someone else faces a similar issue.
This turned out to be a classpath issue. The root cause:
I was using Spark 2.1 to run code compiled against Spark 1.6. Spark 1.6 is built against Scala 2.10, while Spark 2.1 uses Scala 2.11, hence the binary incompatibility.
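To make the version alignment concrete, here is a minimal build.sbt sketch for a Spark 1.6 / Scala 2.10 cluster (the exact Spark and Scallop version numbers are assumptions; match them to the cluster that runs spark-submit):

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // "provided": the cluster supplies Spark at runtime, so the Spark version the
  // code is compiled against must match the Spark that spark-submit launches
  "org.apache.spark" %% "spark-core" % "1.6.3" % "provided",
  "org.rogach" %% "scallop" % "3.1.2"
)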

java.lang.ClassNotFoundException: SparkSql at java.net.URLClassLoader.findClass(Unknown Source)

SparkSql.scala
import org.apache.spark.sql.SparkSession

object SparkSql {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("SparkSql1").getOrCreate()
    //val data = spark.sparkContext.textFile("C:\\Users\\blaskar\\Desktop\\Biswajit\\data\\sample.txt")
    val data = spark.read.format("csv").option("inferSchema", "true").option("header", "true")
      .load("C:\\Users\\blaskar\\Desktop\\Biswajit\\data\\SampleCSVFile.csv")
    val columnsRenamed = Seq("S.no", "Eldonbase", "platnium", "Macintyre",
      "count1", "count2", "count3", "Nunavut", "Storage", "count4")
    val original_data = data.toDF(columnsRenamed: _*).persist()
    original_data.show(10)
  }
}
build.sbt
name := "Spark-Sample1"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.1.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
"org.apache.spark" % "spark-sql_2.11" % "2.1.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"))
VERSIONS OF SOFTWARE THE SYSTEM IS USING
spark: 2.1.0
scala: 2.11.8
I am using IntelliJ. The code runs fine, but when I submit the job to my standalone cluster it shows the error below:
SPARK-SUBMIT ERROR:
spark-submit --class "SparkSql" C:\Users\blaskar\Desktop\sbt_projects\Spark-Sample1\out\artifacts\Spark_Sample1_jar\Spark-Sample1.jar
ERROR
java.lang.ClassNotFoundException: SparkSql
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at org.apache.spark.util.Utils$.classForName(Utils.scala:229)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:695)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

How to use phantom in scala main object

I'm trying to use phantom to handle my Cassandra database.
This is the example I am using to learn phantom (thanks to Thiago):
https://github.com/iamthiago/cassandra-phantom
I can run 'SongTest' successfully.
But I have trouble running database.create in my Scala main object.
The same code runs successfully under ScalaTest, but it fails in my Scala main object.
This is my source code:
package com.cassandra.phantom.modeling

import com.cassandra.phantom.modeling.database._
import com.cassandra.phantom.modeling.connector._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

/**
 * Created by karamko on 2017. 3. 28..
 */
class TestCass extends EmbeddedDatabase with Connector.connector.Connector {
  def create() {
    database.create(5.seconds)
  }
}

object Main extends App {
  val test = new TestCass
  test.create()
}
And this is my error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/runtime/package$
at com.outworkers.phantom.column.AbstractColumn$class.com$outworkers$phantom$column$AbstractColumn$$_name(AbstractColumn.scala:55)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name$lzycompute(Column.scala:24)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name(Column.scala:24)
at com.outworkers.phantom.column.AbstractColumn$class.name(AbstractColumn.scala:58)
at com.outworkers.phantom.column.Column.name(Column.scala:24)
at com.outworkers.phantom.column.PrimitiveColumn.qb(PrimitiveColumn.scala:38)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.Set$Set4.foreach(Set.scala:200)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:47)
at scala.collection.SetLike$class.map(SetLike.scala:92)
at scala.collection.AbstractSet.map(Set.scala:47)
at com.outworkers.phantom.builder.query.RootCreateQuery.lightweight(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery.ifNotExists(CreateQuery.scala:71)
at com.outworkers.phantom.CassandraTable.autocreate(CassandraTable.scala:92)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.outworkers.phantom.database.ExecutableCreateStatementsList.future(Database.scala:173)
at com.outworkers.phantom.database.Database.createAsync(Database.scala:85)
at com.outworkers.phantom.database.Database.create(Database.scala:75)
at com.cassandra.phantom.modeling.SongsStreaming$.main(Main.scala:29)
at com.cassandra.phantom.modeling.SongsStreaming.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: scala.reflect.runtime.package$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 30 more
I solved my problem by adding the dependency below:
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
Thanks a lot
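For context, a minimal sketch of where that line can sit in build.sbt next to the phantom dependency (the phantom coordinates and version below are assumptions, not taken from the question):

libraryDependencies ++= Seq(
  // phantom DSL (assumed artifact and version -- use whatever the project already declares)
  "com.outworkers" %% "phantom-dsl" % "2.7.6",
  // the stack trace shows scala.reflect.runtime being loaded from phantom's column-name
  // resolution, so scala-reflect must be on the classpath; scalaVersion.value keeps it
  // in sync with scala-library
  "org.scala-lang" % "scala-reflect" % scalaVersion.value
)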

Issues while submitting jars to spark cluster

I was trying to create a basic job in Scala using IntelliJ. With the following code, I build the project and create a jar using sbt assembly, and then submit that jar, along with the spark-cassandra-connector jar, to the Spark cluster. So, my question is: how do I test my Scala code in IntelliJ without creating the jar?
Also, every time I change something in my build.sbt file, it starts a background task of downloading the dependencies, even though I have marked them as provided in build.sbt. So, how do I make that happen only once?
Code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.cassandra.CassandraSQLContext

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf(true).set("spark.cassandra.connection.host", "Cluster_IP")
    val sc = new SparkContext("spark://naresh-pc:7077", "test", conf)
    val csc = new CassandraSQLContext(sc)
    csc.setKeyspace("KEYSPACE_NAME")
    val rdd = csc.sql("Some_Query")
    rdd.collect().foreach(a => println(a))
  }
}
build.sbt:
name := "SparkCassandraDemo"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0-M1" % "provided"
libraryDependencies += "org.apache.spark".%%("spark-sql") % "1.6.1" % "provided"
Edited question:
I have implemented what Yuval Itzchakov suggested, but I am getting the following error.
FYI, earlier I used to submit the job in the following manner after creating the jar using sbt assembly:
bin/spark-submit --class SimpleApp --master spark://naresh-pc:7077 --jars SOME_PATH/SparkCassandraDemo-assembly-1.0.jar SOME_PATH/spark-cassandra-connector-assembly-1.6.0-M1.jar
This actually uses the spark-cassandra-connector assembly, so I guess the code is not able to find that jar. How do I make it available to the code?
Error:
Exception in thread "main" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange rangepartitioning(cnt#0L ASC,200), None
+- ConvertToSafe
+- TungstenAggregate(key=[useragent#10], functions=[(count(if ((gid#12 = 1)) cookie#13 else null),mode=Final,isDistinct=false)], output=[cnt#0L,useragent#10])
+- TungstenExchange hashpartitioning(useragent#10,200), None
+- TungstenAggregate(key=[useragent#10], functions=[(count(if ((gid#12 = 1)) cookie#13 else null),mode=Partial,isDistinct=false)], output=[useragent#10,count#16L])
+- TungstenAggregate(key=[useragent#10,cookie#13,gid#12], functions=[], output=[useragent#10,cookie#13,gid#12])
+- TungstenExchange hashpartitioning(useragent#10,cookie#13,gid#12,200), None
+- TungstenAggregate(key=[useragent#10,cookie#13,gid#12], functions=[], output=[useragent#10,cookie#13,gid#12])
+- Expand [List(useragent#10, cookie#3, 1)], [useragent#10,cookie#13,gid#12]
+- Scan org.apache.spark.sql.cassandra.CassandraSourceRelation#5d1094[useragent#10,cookie#3]
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
at org.apache.spark.sql.execution.Exchange.doExecute(Exchange.scala:247)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.ConvertToUnsafe.doExecute(rowFormatConverters.scala:38)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.Sort.doExecute(Sort.scala:64)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:166)
at org.apache.spark.sql.execution.SparkPlan.executeCollectPublic(SparkPlan.scala:174)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:56)
at org.apache.spark.sql.DataFrame.withNewExecutionId(DataFrame.scala:2086)
at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$execute$1(DataFrame.scala:1498)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$collect$1.apply(DataFrame.scala:1503)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$collect$1.apply(DataFrame.scala:1503)
at org.apache.spark.sql.DataFrame.withCallback(DataFrame.scala:2099)
at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$collect(DataFrame.scala:1503)
at org.apache.spark.sql.DataFrame.collect(DataFrame.scala:1480)
at SimpleApp$.main(SimpleApp.scala:17)
at SimpleApp.main(SimpleApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 4 times, most recent failure: Lost task 2.3 in stage 0.0 (TID 9, pratik-VirtualBox): java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
at org.apache.spark.RangePartitioner$.sketch(Partitioner.scala:264)
at org.apache.spark.RangePartitioner.<init>(Partitioner.scala:126)
at org.apache.spark.sql.execution.Exchange.prepareShuffleDependency(Exchange.scala:179)
at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:254)
at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:248)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
... 34 more
Caused by: java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
So, my question is how do I test my Scala code without creating the jar in IntelliJ
One way of achieving this is to create another module which does not use the provided sbt setting, but instead compiles against the Spark jars directly, so that you can debug your code.
You start by creating an additional module in build.sbt:
name := "SparkCassandraDemo"
version := "1.0"
scalaVersion := "2.11.8"
val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0-M2",
  "org.apache.spark" %% "spark-sql" % "1.6.1"
)

lazy val sparkDebugger = (project in file("spark-debugger"))
  .settings(
    libraryDependencies ++= sparkDependencies.map(_ % "compile")
  )

libraryDependencies ++= sparkDependencies.map(_ % "provided")
After that, refresh your build.sbt file. You should now see a new module created in the left hand side of IntelliJ called spark-debugger:
Now, create a debug configuration in Intellij:
Go to Edit Configuration:
Create a new application configuration:
Set the newly created spark-debugger module:
Shift + Ctrl + F9, and select the newly created configuration:
Debug your code:
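As a complement, here is a minimal local-mode sketch of what you could run from the spark-debugger module without a cluster (the object name and master are illustrative; the Cassandra host and keyspace placeholders are carried over from the question):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.cassandra.CassandraSQLContext

object DebugApp {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside the IDE process, so no jar or spark-submit is needed
    val conf = new SparkConf(true)
      .setMaster("local[*]")
      .setAppName("debug")
      .set("spark.cassandra.connection.host", "Cluster_IP")
    val sc = new SparkContext(conf)
    val csc = new CassandraSQLContext(sc)
    csc.setKeyspace("KEYSPACE_NAME")
    // exercise the same query path as SimpleApp, but interactively under the debugger
    csc.sql("Some_Query").show()
    sc.stop()
  }
}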

Error while starting scala project that uses SORM framework

I have created a simple sbt project from a tutorial:
build.sbt:
lazy val sorm_test = (project in file(".")).
settings(
name := "SORM_TEST",
scalaVersion := "2.11.7",
libraryDependencies ++= Seq(
"org.sorm-framework" % "sorm" % "0.3.18",
"com.h2database" % "h2" % "1.4.188"
)
)
test.Main.scala:
package test

case class Artist(
  names : Map[Locale, Seq[String]],
  genres : Set[Genre]
)

case class Genre(
  names : Map[Locale, Seq[String]]
)

case class Locale(
  code : String
)

import sorm._

object Db extends Instance(
  entities = Set(
    Entity[Artist](),
    Entity[Genre](),
    Entity[Locale](unique = Set() + Seq("code"))
  ),
  url = "jdbc:h2:mem:test",
  user = "",
  password = "",
  initMode = InitMode.Create
)

object Main extends App {
  // init
  Db.##
}
When I run this project in IntelliJ I get the following exceptions:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction1
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl.parse(ToolBoxFactory.scala:414)
at sorm.persisted.PersistedClass$.createClass(PersistedClass.scala:107)
at sorm.persisted.PersistedClass$$anon$1$$anonfun$resolve$1.apply(PersistedClass.scala:125)
at sorm.persisted.PersistedClass$$anon$1$$anonfun$resolve$1.apply(PersistedClass.scala:125)
at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:194)
at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:80)
at sorm.persisted.PersistedClass$$anon$1.resolve(PersistedClass.scala:125)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at sorm.persisted.PersistedClass$.apply(PersistedClass.scala:129)
at sorm.Instance$Initialization$$anonfun$9$$anonfun$apply$16.apply(Instance.scala:239)
at sorm.Instance$Initialization$$anonfun$9$$anonfun$apply$16.apply(Instance.scala:239)
at embrace.package$EmbraceAny$.$$extension(package.scala:6)
at sorm.Instance$Initialization$$anonfun$9.apply(Instance.scala:239)
at sorm.Instance$Initialization$$anonfun$9.apply(Instance.scala:239)
at scala.collection.immutable.Set$Set3.foreach(Set.scala:145)
at sorm.Instance$Initialization.<init>(Instance.scala:239)
at sorm.Instance.<init>(Instance.scala:38)
at test.Db$.<init>(Main.scala:15)
at test.Db$.<clinit>(Main.scala)
at test.Main$.delayedEndpoint$test$Main$1(Main.scala:29)
at test.Main$delayedInit$body.apply(Main.scala:27)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at test.Main$.main(Main.scala:27)
at test.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction1
... 38 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 38 more
It's OK when I use sbt run. This exception is also thrown when I integrate SORM with the Play framework. How can I solve this problem?
I solved this problem by adding dependencyOverrides += "org.scala-lang" % "scala-compiler" % scalaVersion.value to my sbt configuration.
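For context, this is a sketch of how the override fits into the build.sbt from the question; my reading of the stack trace is that the ToolBox used by SORM loads scala-compiler at runtime, so it needs to match the project's Scala version:

lazy val sorm_test = (project in file(".")).
  settings(
    name := "SORM_TEST",
    scalaVersion := "2.11.7",
    libraryDependencies ++= Seq(
      "org.sorm-framework" % "sorm" % "0.3.18",
      "com.h2database" % "h2" % "1.4.188"
    ),
    // pin scala-compiler (pulled in transitively and used by
    // scala.tools.reflect.ToolBox at runtime) to the project's Scala version
    dependencyOverrides += "org.scala-lang" % "scala-compiler" % scalaVersion.value
  )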
I receive a very similar exception in a Maven project at runtime:
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction0$mcI$sp
I am currently using
<scala.version>2.11.7</scala.version>
<scala.compat.version>2.11</scala.compat.version>
<scalatest.version>2.2.2</scalatest.version>
<scala-maven-plugin.version>3.2.0</scala-maven-plugin.version>
in the pom properties and refer to these properties in all other entries, like
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
Maven packaging/installation works fine. Is this a similar problem? Can someone explain the solution above (or maybe translate it to the Maven case)?