Error while starting Scala project that uses SORM framework

I have created a simple sbt project from a tutorial:
build.sbt:
lazy val sorm_test = (project in file(".")).
  settings(
    name := "SORM_TEST",
    scalaVersion := "2.11.7",
    libraryDependencies ++= Seq(
      "org.sorm-framework" % "sorm" % "0.3.18",
      "com.h2database" % "h2" % "1.4.188"
    )
  )
test.Main.scala:
package test

import sorm._

case class Artist(
  names : Map[Locale, Seq[String]],
  genres : Set[Genre]
)

case class Genre(
  names : Map[Locale, Seq[String]]
)

case class Locale(
  code : String
)

object Db extends Instance(
  entities = Set(
    Entity[Artist](),
    Entity[Genre](),
    Entity[Locale](unique = Set() + Seq("code"))
  ),
  url = "jdbc:h2:mem:test",
  user = "",
  password = "",
  initMode = InitMode.Create
)

object Main extends App {
  // touch Db to force its initialization
  Db.##
}
When I run this project in IntelliJ, I get the following exception:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction1
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl.parse(ToolBoxFactory.scala:414)
at sorm.persisted.PersistedClass$.createClass(PersistedClass.scala:107)
at sorm.persisted.PersistedClass$$anon$1$$anonfun$resolve$1.apply(PersistedClass.scala:125)
at sorm.persisted.PersistedClass$$anon$1$$anonfun$resolve$1.apply(PersistedClass.scala:125)
at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:194)
at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:80)
at sorm.persisted.PersistedClass$$anon$1.resolve(PersistedClass.scala:125)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at sorm.persisted.PersistedClass$.apply(PersistedClass.scala:129)
at sorm.Instance$Initialization$$anonfun$9$$anonfun$apply$16.apply(Instance.scala:239)
at sorm.Instance$Initialization$$anonfun$9$$anonfun$apply$16.apply(Instance.scala:239)
at embrace.package$EmbraceAny$.$$extension(package.scala:6)
at sorm.Instance$Initialization$$anonfun$9.apply(Instance.scala:239)
at sorm.Instance$Initialization$$anonfun$9.apply(Instance.scala:239)
at scala.collection.immutable.Set$Set3.foreach(Set.scala:145)
at sorm.Instance$Initialization.<init>(Instance.scala:239)
at sorm.Instance.<init>(Instance.scala:38)
at test.Db$.<init>(Main.scala:15)
at test.Db$.<clinit>(Main.scala)
at test.Main$.delayedEndpoint$test$Main$1(Main.scala:29)
at test.Main$delayedInit$body.apply(Main.scala:27)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at test.Main$.main(Main.scala:27)
at test.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction1
... 38 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 38 more
It works fine when I use sbt run. The same exception is also thrown when I integrate SORM with the Play framework. How can I solve this problem?

I solved this problem by adding the following to my sbt configuration:

dependencyOverrides += "org.scala-lang" % "scala-compiler" % scalaVersion.value

As to why this helps (my understanding, not verified in the thread): SORM compiles its persisted classes at runtime through the Scala reflection toolbox, and if a newer scala-compiler (2.12+) is pulled in transitively, that toolbox emits bytecode referencing classes such as scala.runtime.java8.JFunction1 which do not exist in the 2.11 scala-library. The override pins scala-compiler to the project's Scala version so the compiler and the runtime library stay in sync.
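For clarity, a minimal sketch of the complete build.sbt with the override in place (same coordinates as in the question):

lazy val sorm_test = (project in file(".")).
  settings(
    name := "SORM_TEST",
    scalaVersion := "2.11.7",
    libraryDependencies ++= Seq(
      "org.sorm-framework" % "sorm" % "0.3.18",
      "com.h2database" % "h2" % "1.4.188"
    ),
    // pin the compiler used by SORM's runtime toolbox to the project's Scala version
    dependencyOverrides += "org.scala-lang" % "scala-compiler" % scalaVersion.value
  )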

I receive a very similar exception in a Maven project at runtime:
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction0$mcI$sp
I am currently using

<scala.version>2.11.7</scala.version>
<scala.compat.version>2.11</scala.compat.version>
<scalatest.version>2.2.2</scalatest.version>
<scala-maven-plugin.version>3.2.0</scala-maven-plugin.version>

in the pom properties and refer to these properties in all other entries, like:

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>${scala.version}</version>
</dependency>

Maven packaging/installation works fine. Is this a similar problem? Can someone explain the solution above (or maybe translate it to the Maven case)?
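I haven't verified this in Maven, but assuming the same root cause (something on the classpath pulling in a scala-compiler that doesn't match scala.version), the equivalent of the sbt override would be pinning the version in dependencyManagement; a sketch:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-compiler</artifactId>
      <version>${scala.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>

Like sbt's dependencyOverrides, entries in dependencyManagement force the version of a dependency wherever it appears in the transitive graph.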

Related

How to use phantom in a Scala main object

I'm trying to use phantom to handle my Cassandra database.
This is the example I'm using to learn phantom (thanks to Thiago):
https://github.com/iamthiago/cassandra-phantom
I can run 'SongTest' successfully, but I have trouble running database.create in my Scala main object.
The same code runs successfully under ScalaTest, but in my Scala main object it fails.
This is my source code:
package com.cassandra.phantom.modeling

import com.cassandra.phantom.modeling.database._
import com.cassandra.phantom.modeling.connector._

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

/**
 * Created by karamko on 2017. 3. 28.
 */
class TestCass extends EmbeddedDatabase with Connector.connector.Connector {
  def create(): Unit = {
    database.create(5.seconds)
  }
}

object Main extends App {
  val test = new TestCass
  test.create()
}
and this is my error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/runtime/package$
at com.outworkers.phantom.column.AbstractColumn$class.com$outworkers$phantom$column$AbstractColumn$$_name(AbstractColumn.scala:55)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name$lzycompute(Column.scala:24)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name(Column.scala:24)
at com.outworkers.phantom.column.AbstractColumn$class.name(AbstractColumn.scala:58)
at com.outworkers.phantom.column.Column.name(Column.scala:24)
at com.outworkers.phantom.column.PrimitiveColumn.qb(PrimitiveColumn.scala:38)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.Set$Set4.foreach(Set.scala:200)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:47)
at scala.collection.SetLike$class.map(SetLike.scala:92)
at scala.collection.AbstractSet.map(Set.scala:47)
at com.outworkers.phantom.builder.query.RootCreateQuery.lightweight(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery.ifNotExists(CreateQuery.scala:71)
at com.outworkers.phantom.CassandraTable.autocreate(CassandraTable.scala:92)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.outworkers.phantom.database.ExecutableCreateStatementsList.future(Database.scala:173)
at com.outworkers.phantom.database.Database.createAsync(Database.scala:85)
at com.outworkers.phantom.database.Database.create(Database.scala:75)
at com.cassandra.phantom.modeling.SongsStreaming$.main(Main.scala:29)
at com.cassandra.phantom.modeling.SongsStreaming.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: scala.reflect.runtime.package$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 30 more
I solved my problem by adding the dependency below:

libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value

Thanks a lot.

My guess as to why the tests passed but the main object failed (an assumption, not confirmed in the thread): phantom resolves column names through scala.reflect at runtime, as the stack trace shows, and scala-reflect is on the ScalaTest classpath transitively, but not on the plain application classpath.

Could not copy native binaries to temp directory Kinesis KPL

I am trying to build a simple application that will produce messages to Kinesis using the KPL. I am writing this in Scala and am receiving an error message that I can't figure out. My code is as follows:
import java.nio.ByteBuffer

import com.amazonaws.services.kinesis.producer.{KinesisProducer, KinesisProducerConfiguration}

object KinesisStream extends App {
  ProduceToKinesis()

  def ProduceToKinesis(): Unit = {
    val config = new KinesisProducerConfiguration()
    val kinesis = new KinesisProducer(config)
    val data = ByteBuffer.wrap("myData".getBytes("UTF-8"))
    kinesis.addUserRecord("TestStream", "myPartitionKey", data)
  }
}
It fails at
val kinesis = new KinesisProducer(config)
with an error message of:
Exception in thread "main" java.lang.RuntimeException: Could not copy native binaries to temp directory C:\Users\************\AppData\Local\Temp\amazon-kinesis-producer-native-binaries
at com.amazonaws.services.kinesis.producer.KinesisProducer.extractBinaries(KinesisProducer.java:844)
at com.amazonaws.services.kinesis.producer.KinesisProducer.<init>(KinesisProducer.java:242)
at KinesisStream$.ProduceToKinesis(KinesisStream.scala:14)
at KinesisStream$.delayedEndpoint$KinesisStream$1(KinesisStream.scala:9)
at KinesisStream$delayedInit$body.apply(KinesisStream.scala:8)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at KinesisStream$.main(KinesisStream.scala:8)
at KinesisStream.main(KinesisStream.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.NullPointerException
at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1792)
at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1769)
at org.apache.commons.io.IOUtils.copy(IOUtils.java:1744)
at org.apache.commons.io.IOUtils.toByteArray(IOUtils.java:462)
at com.amazonaws.services.kinesis.producer.KinesisProducer.extractBinaries(KinesisProducer.java:803)
... 18 more
My build.sbt looks like this:
name := "Kinesis"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "com.amazonaws" % "amazon-kinesis-producer" % "0.12.1"
I know it's been a long time since this issue was posted, but recently I ran into the same issue and couldn't find much on the web. Fortunately I found the solution, and it could be helpful to someone else looking for the fix.
It's a version issue, as mentioned here: https://github.com/awslabs/amazon-kinesis-producer/issues/113#issuecomment-345028662. It worked for me when I changed the KPL version to 0.13.1.
Here's the pom dependency that worked for me:
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>amazon-kinesis-producer</artifactId>
  <version>0.13.1</version>
</dependency>
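The original question's build uses sbt rather than Maven; assuming the same fix applies there, the equivalent sbt line would be:

libraryDependencies += "com.amazonaws" % "amazon-kinesis-producer" % "0.13.1"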

Issues while submitting jars to Spark cluster

I was trying to create a basic job in Scala using IntelliJ. With the following code I build the project and create a jar using sbt assembly, then submit that jar, along with the spark-cassandra-connector jar, to the Spark cluster. So, my question is: how do I test my Scala code in IntelliJ without creating the jar?
Also, every time I change something in my build.sbt file, it starts a background task that downloads the dependencies, even though I have marked them as provided in build.sbt. How do I make that happen only once?
Code :
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.cassandra.CassandraSQLContext

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(true).set("spark.cassandra.connection.host", "Cluster_IP")
    val sc = new SparkContext("spark://naresh-pc:7077", "test", conf)
    val csc = new CassandraSQLContext(sc)
    csc.setKeyspace("KEYSPACE_NAME")
    val rdd = csc.sql("Some_Query")
    rdd.collect().foreach(a => println(a))
  }
}
build.sbt:
name := "SparkCassandraDemo"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0-M1" % "provided"
libraryDependencies += "org.apache.spark".%%("spark-sql") % "1.6.1" % "provided"
Edited question:
I have implemented what Yuval Itzchakov suggested, but I am getting the following error.
FYI, I previously submitted the job as follows, after creating the jar using sbt assembly:

bin/spark-submit --class SimpleApp --master spark://naresh-pc:7077 --jars SOME_PATH/SparkCassandraDemo-assembly-1.0.jar SOME_PATH/spark-cassandra-connector-assembly-1.6.0-M1.jar

This actually uses the spark-cassandra-connector assembly, so I guess the code is now not able to find that jar. How do I make it available to the code?
Error:
Exception in thread "main" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange rangepartitioning(cnt#0L ASC,200), None
+- ConvertToSafe
+- TungstenAggregate(key=[useragent#10], functions=[(count(if ((gid#12 = 1)) cookie#13 else null),mode=Final,isDistinct=false)], output=[cnt#0L,useragent#10])
+- TungstenExchange hashpartitioning(useragent#10,200), None
+- TungstenAggregate(key=[useragent#10], functions=[(count(if ((gid#12 = 1)) cookie#13 else null),mode=Partial,isDistinct=false)], output=[useragent#10,count#16L])
+- TungstenAggregate(key=[useragent#10,cookie#13,gid#12], functions=[], output=[useragent#10,cookie#13,gid#12])
+- TungstenExchange hashpartitioning(useragent#10,cookie#13,gid#12,200), None
+- TungstenAggregate(key=[useragent#10,cookie#13,gid#12], functions=[], output=[useragent#10,cookie#13,gid#12])
+- Expand [List(useragent#10, cookie#3, 1)], [useragent#10,cookie#13,gid#12]
+- Scan org.apache.spark.sql.cassandra.CassandraSourceRelation#5d1094[useragent#10,cookie#3]
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
at org.apache.spark.sql.execution.Exchange.doExecute(Exchange.scala:247)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.ConvertToUnsafe.doExecute(rowFormatConverters.scala:38)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.Sort.doExecute(Sort.scala:64)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:166)
at org.apache.spark.sql.execution.SparkPlan.executeCollectPublic(SparkPlan.scala:174)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:56)
at org.apache.spark.sql.DataFrame.withNewExecutionId(DataFrame.scala:2086)
at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$execute$1(DataFrame.scala:1498)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$collect$1.apply(DataFrame.scala:1503)
at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$collect$1.apply(DataFrame.scala:1503)
at org.apache.spark.sql.DataFrame.withCallback(DataFrame.scala:2099)
at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$collect(DataFrame.scala:1503)
at org.apache.spark.sql.DataFrame.collect(DataFrame.scala:1480)
at SimpleApp$.main(SimpleApp.scala:17)
at SimpleApp.main(SimpleApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 4 times, most recent failure: Lost task 2.3 in stage 0.0 (TID 9, pratik-VirtualBox): java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
at org.apache.spark.RangePartitioner$.sketch(Partitioner.scala:264)
at org.apache.spark.RangePartitioner.<init>(Partitioner.scala:126)
at org.apache.spark.sql.execution.Exchange.prepareShuffleDependency(Exchange.scala:179)
at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:254)
at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:248)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
... 34 more
Caused by: java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
So, my question is how do I test my Scala code without creating the jar in IntelliJ

One way of achieving this is to create another module which doesn't use the provided sbt setting, but instead compiles against the Spark jars so that you can debug your code.
You start by creating an additional module in build.sbt:
name := "SparkCassandraDemo"
version := "1.0"
scalaVersion := "2.11.8"
val sparkDependencies = Seq(
"org.apache.spark" %% "spark-core" % "1.6.1",
"com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0-M2",
"org.apache.spark".%%("spark-sql") % "1.6.1"
)
lazy val sparkDebugger = (project in file("spark-debugger"))
.settings(
libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
libraryDependencies ++= sparkDependencies.map(_ % "provided")
After that, refresh your build.sbt file. You should now see a new module called spark-debugger on the left-hand side of IntelliJ. Now, create a debug configuration in IntelliJ:

1. Go to Edit Configurations.
2. Create a new Application configuration.
3. Set the newly created spark-debugger module.
4. Press Shift + Ctrl + F9 and select the newly created configuration.
5. Debug your code.
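When debugging inside the IDE it is often easier to run against a local master instead of the remote cluster; a minimal sketch of such a variant of the question's main class (the local[*] master and the object name are my additions, not part of the original answer):

import org.apache.spark.{SparkConf, SparkContext}

object LocalDebugApp {
  def main(args: Array[String]): Unit = {
    // Run Spark inside the IDE process instead of against spark://naresh-pc:7077
    val conf = new SparkConf(true)
      .setMaster("local[*]")
      .setAppName("test")
      .set("spark.cassandra.connection.host", "Cluster_IP")
    val sc = new SparkContext(conf)
    // ... same CassandraSQLContext queries as in the question ...
    sc.stop()
  }
}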

PlayFramework NoClassDefFoundError: jdk/nashorn/api/scripting/NashornScriptEngineFactory

Calling the NashornScriptEngineFactory yields a RuntimeException on a simple PlayFramework application.
build.sbt
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
scalacOptions += "-target:jvm-1.8"
Application.scala
import jdk.nashorn.api.scripting.NashornScriptEngineFactory
import play.api.mvc._

object Application extends Controller
{
  def index = Action
  {
    val nashorn = new NashornScriptEngineFactory().getScriptEngine( "-scripting" )
    Ok( nashorn.eval( "3;" ).toString )
  }
}
On a similar sbt project, however, it works:
Main.scala
import jdk.nashorn.api.scripting.NashornScriptEngineFactory

object Main extends App
{
  val nashorn = new NashornScriptEngineFactory().getScriptEngine( "-scripting" )
  println( nashorn.eval( "4;" ) )
}
Why is Play failing to load the required factory?
Stacktrace
[error] application -
! #6locb3604 - Internal server error, for (GET) [/] ->
play.api.Application$$anon$1: Execution exception[[RuntimeException: java.lang.NoClassDefFoundError: jdk/nashorn/api/scripting/NashornScriptEngineFactory]]
at play.api.Application$class.handleError(Application.scala:296) ~[play_2.11-2.3.8.jar:2.3.8]
at play.api.DefaultApplication.handleError(Application.scala:402) [play_2.11-2.3.8.jar:2.3.8]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$14$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:205) [play_2.11-2.3.8.jar:2.3.8]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$14$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:202) [play_2.11-2.3.8.jar:2.3.8]
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) [scala-library-2.11.6.jar:na]
Caused by: java.lang.RuntimeException: java.lang.NoClassDefFoundError: jdk/nashorn/api/scripting/NashornScriptEngineFactory
at play.api.mvc.ActionBuilder$$anon$1.apply(Action.scala:523) ~[play_2.11-2.3.8.jar:2.3.8]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:130) ~[play_2.11-2.3.8.jar:2.3.8]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:130) ~[play_2.11-2.3.8.jar:2.3.8]
at play.utils.Threads$.withContextClassLoader(Threads.scala:21) ~[play_2.11-2.3.8.jar:2.3.8]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:129) ~[play_2.11-2.3.8.jar:2.3.8]
Caused by: java.lang.NoClassDefFoundError: jdk/nashorn/api/scripting/NashornScriptEngineFactory
at controllers.Application$$anonfun$index$1.apply(Application.scala:10) ~[classes/:na]
at controllers.Application$$anonfun$index$1.apply(Application.scala:9) ~[classes/:na]
at play.api.mvc.ActionBuilder$$anonfun$apply$17.apply(Action.scala:464) ~[play_2.11-2.3.8.jar:2.3.8]
at play.api.mvc.ActionBuilder$$anonfun$apply$17.apply(Action.scala:464) ~[play_2.11-2.3.8.jar:2.3.8]
at play.api.mvc.ActionBuilder$$anonfun$apply$16.apply(Action.scala:433) ~[play_2.11-2.3.8.jar:2.3.8]
Caused by: java.lang.ClassNotFoundException: jdk.nashorn.api.scripting.NashornScriptEngineFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_40]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_40]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_40]
at controllers.Application$$anonfun$index$1.apply(Application.scala:10) ~[classes/:na]
at controllers.Application$$anonfun$index$1.apply(Application.scala:9) ~[classes/:na]
This is related to https://github.com/playframework/playframework/pull/3420 and works in Play 2.4.x:

addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0-M3")
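For reference, that line goes in project/plugins.sbt; a minimal sketch, assuming an otherwise standard Play project layout:

// project/plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0-M3")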

ScalaTest: Exception while running test - java.lang.IncompatibleClassChangeError

I am trying to run a test suite written in Scala from the command line, and I get the following exception.
Scala version: 2.10.2
ScalaTest version: 2.10
Following is the dependency declared in build.sbt:
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.0" % "test"
Following is the stack trace of the exception:
java.lang.IncompatibleClassChangeError: Found class scala.collection.mutable.ArrayOps, but interface was expected
at org.scalatest.tools.Runner$.main(Runner.scala:828)
at org.scalatest.tools.Runner.main(Runner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:56)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)