Cannot Create SparkSession for Scala Without an Error in IntelliJ - scala

I am trying to create a SparkSession so I can import spark.implicits._, but I get errors when running a simple app.
My build.sbt file looks like this:
name := "Reddit-Data-Analyser"
version := "0.1"
scalaVersion := "2.11.12"
fork := true
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "2.4.0"
resolvers += "MavenRepository" at "http://central.maven.org/maven2"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0"
)
I get unresolved dependency errors on spark-sql, yet the SparkSession class still appears to load.
My Main.scala looks like this:
import org.apache.spark.sql.SparkSession
object main extends App {
  val spark = SparkSession
    .builder()
    .config("spark.master", "local")
    //.config("spark.network.timeout", "10000s") //Not Relevant
    //.config("spark.executor.heartbeatInterval", "5000s") //Not Relevant
    .getOrCreate()
  println("Hello World")
  spark.stop()
}
*Edit: I was actually able to get the SparkSession to run by invalidating caches and restarting (though I had already done this many times, so I am not sure what changed). Now, when I do ~run in the sbt console, I get [error] messages, and I have posted a separate question about that: SparkSession logging to console with [error] logs.
Below are my old error messages:
The println does not execute; instead, I first get the following ERROR output:
[error] (run-main-7) java.lang.AbstractMethodError
java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
at org.apache.spark.sql.internal.SharedState.initializeLogIfNecessary(SharedState.scala:42)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.sql.internal.SharedState.log(SharedState.scala:42)
at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
at org.apache.spark.sql.internal.SharedState.logInfo(SharedState.scala:42)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:71)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:112)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:112)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:112)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:111)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:284)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938)
at controller.main$.delayedEndpoint$controller$main$1(Main.scala:20)
at controller.main$delayedInit$body.apply(Main.scala:11)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at controller.main$.main(Main.scala:11)
at controller.main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 9 s, completed Mar 14, 2019 9:43:29 PM
8. Waiting for source changes... (press enter to interrupt)
19/03/14 21:43:29 INFO AsyncEventQueue: Stopping listener queue executorManagement.
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:94)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:83)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:78)
19/03/14 21:43:29 INFO AsyncEventQueue: Stopping listener queue appStatus.
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:94)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:83)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:78)
19/03/14 21:43:29 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:181)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:178)
at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:73)

Have you tried something like:
import org.apache.spark.sql.SparkSession
object main extends App {
  val spark = SparkSession
    .builder()
    .appName("myApp")
    .master("local[*]")
    .getOrCreate()
  println("Hello World")
  println(spark.version)
  spark.stop()
}
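Note: .master("local[*]") is the SparkSession.Builder shortcut for setting the spark.master configuration key, and spark.version is a parameterless member, so it is referenced without parentheses.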

So I am not sure exactly what fixed the problem, because after running a number of sbt commands and making changes I was eventually able to run my app.
Here is a list of things I did, but I think the sbt command in step #4 may have been the missing piece:
Changed scalaVersion := "2.12.5" to scalaVersion := "2.11.12" in build.sbt. I believe Apache Spark has support for Scala 2.12, but IntelliJ or sbt apparently has difficulty retrieving the packages.
Created the file project/build.properties and added the line sbt.version = 0.13.17, since sbt 1.x apparently isn't great at working with the spark-core repository (see the sketch after this list).
Ran the following sbt commands in this order: reload plugins, update, reload.
One of the last things I tried was running the sbt command package, which creates a jar file containing the files in src/main/resources and the classes compiled from src/main/scala and src/main/java. After doing this (and maybe a full Rebuild/Cache Invalidation) I noticed the missing Scala packages appeared in my External Libraries.
Ran Rebuild and Invalidate Caches/Restart several times.
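For reference, here is a minimal sketch of the build files corresponding to the fixes above (Spark 2.3.0 only publishes Scala 2.11 artifacts, which is why scalaVersion has to stay on 2.11.x):
// build.sbt
scalaVersion := "2.11.12" // %% appends _2.11 here, matching Spark 2.3.0's published artifacts
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0"
)

// project/build.properties
sbt.version=0.13.17
With a 2.12.x scalaVersion, sbt would instead look for spark-sql_2.12, which does not exist for Spark 2.3.0; that is exactly what produces the unresolved dependency errors mentioned at the top.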

Related

ScalaTest: QuickStart code fails with java.lang.NoClassDefFoundError: scala/xml/NamespaceBinding

I am just starting out with ScalaTest, following the http://www.scalatest.org/quick_start page.
I followed the steps as outlined (downloading the jar file linked above the code), and the code fails:
➜ unittest vi ExampleSpec.scala
➜ unittest scalac -cp scalatest-app_2.13-3.0.8.jar ExampleSpec.scala
➜ unittest scala -cp scalatest-app_2.13-3.0.8.jar org.scalatest.run ExampleSpec
An exception or error caused a run to abort. This may have been caused by a problematic custom reporter.
java.lang.NoClassDefFoundError: scala/xml/NamespaceBinding
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1368)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:1033)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:1011)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1509)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1011)
at org.scalatest.tools.Runner$.main(Runner.scala:827)
at org.scalatest.run$.main(run.scala:120)
at org.scalatest.run.main(run.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at scala.reflect.internal.util.ScalaClassLoader.$anonfun$run$2(ScalaClassLoader.scala:105)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:40)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:130)
at scala.reflect.internal.util.ScalaClassLoader.run(ScalaClassLoader.scala:105)
at scala.reflect.internal.util.ScalaClassLoader.run$(ScalaClassLoader.scala:97)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:130)
at scala.tools.nsc.CommonRunner.run(ObjectRunner.scala:29)
at scala.tools.nsc.CommonRunner.run$(ObjectRunner.scala:28)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:43)
at scala.tools.nsc.CommonRunner.runAndCatch(ObjectRunner.scala:35)
at scala.tools.nsc.CommonRunner.runAndCatch$(ObjectRunner.scala:34)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:70)
at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:91)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:108)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Caused by: java.lang.ClassNotFoundException: scala.xml.NamespaceBinding
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:436)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 29 more
My Scala version is:
➜ unittest scala -version
Scala code runner version 2.13.1 -- Copyright 2002-2019, LAMP/EPFL and Lightbend, Inc.
I thought the error was because scala-xml is a separate project from Scala, so I downloaded the jar file from https://mvnrepository.com/artifact/org.scala-lang/scala-xml/2.11.0-M4
Next, I tried the entire process again, and it failed with a different error:
➜ unittest scala -classpath "scalatest-app_2.13-3.0.8.jar:scala-xml-2.11.0-M4.jar" org.scalatest.run ExampleSpec
An exception or error caused a run to abort. This may have been caused by a problematic custom reporter.
java.lang.NoClassDefFoundError: scala/Serializable
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1016)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:151)
at java.base/java.net.URLClassLoader.defineClass(URLClassLoader.java:515)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:423)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:417)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:691)
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:416)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1368)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:1033)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:1011)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1509)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1011)
at org.scalatest.tools.Runner$.main(Runner.scala:827)
at org.scalatest.run$.main(run.scala:120)
at org.scalatest.run.main(run.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at scala.reflect.internal.util.ScalaClassLoader.$anonfun$run$2(ScalaClassLoader.scala:105)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:40)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:130)
at scala.reflect.internal.util.ScalaClassLoader.run(ScalaClassLoader.scala:105)
at scala.reflect.internal.util.ScalaClassLoader.run$(ScalaClassLoader.scala:97)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:130)
at scala.tools.nsc.CommonRunner.run(ObjectRunner.scala:29)
at scala.tools.nsc.CommonRunner.run$(ObjectRunner.scala:28)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:43)
at scala.tools.nsc.CommonRunner.runAndCatch(ObjectRunner.scala:35)
at scala.tools.nsc.CommonRunner.runAndCatch$(ObjectRunner.scala:34)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:70)
at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:91)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:108)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Caused by: java.lang.ClassNotFoundException: scala.Serializable
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:436)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 39 more
Could someone please help me understand what the issue is?
Thank you
UPDATE (SOLVED)
As per the comment from @J0HN and the answer from @Mario, I was able to solve the issue:
➜ unittest ll
total 17608
-rw-r--r-- 1 harit staff 504B Nov 21 17:15 ExampleSpec.scala
-rw-r--r--# 1 harit staff 555K Nov 22 10:35 scala-xml_2.13-1.2.0.jar
-rw-r--r--# 1 harit staff 7.9M Nov 21 16:59 scalatest-app_2.13-3.0.8.jar
➜ unittest scalac -classpath "scalatest-app_2.13-3.0.8.jar:scala-xml_2.13-1.2.0.jar" ExampleSpec.scala
➜ unittest scala -classpath "scalatest-app_2.13-3.0.8.jar:scala-xml_2.13-1.2.0.jar" org.scalatest.run ExampleSpec
Run starting. Expected test count is: 2
ExampleSpec:
A Stack
- should pop values in last-in-first-out order
- should throw NoSuchElementException if an empty stack is popped
Run completed in 506 milliseconds.
Total number of tests run: 2
Suites: completed 1, aborted 0
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
➜ unittest
I have downloaded scala-xml_2.13-1.2.0.jar from this page
Scala 2.13 decoupled scala-xml from the standard library:
The following modules are no longer included in the distribution:
scala-xml, scala-parser-combinators, scala-swing. They are
community-maintained and published to Maven Central.
As suggested by @J0HN, download scala-xml_2.13 and try the following command:
scala -cp scalatest-app_2.13-3.0.8.jar:scala-xml_2.13-1.2.0.jar org.scalatest.run ExampleSpec
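If the project is managed with sbt rather than hand-built classpaths, a minimal sketch of the equivalent build (assuming the same versions used above: ScalaTest 3.0.8 and scala-xml 1.2.0 for Scala 2.13) would be:
// build.sbt
scalaVersion := "2.13.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % Test
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.2.0" % Test
sbt then resolves the _2.13 artifacts and puts scala-xml on the test classpath automatically, so sbt test runs the suite without any manual -cp juggling.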

Can't poke MixedVec

I declared a MixedVec in my Module interface:
class WbInterconOneMaster(val awbm: WbMaster,
                          val awbs: Seq[WbSlave]) extends Module {
  val io = IO(new Bundle {
    val wbm = Flipped(new WbMaster(awbm.dwidth, awbm.awidth))
    val wbs = MixedVec(awbs.map { i => Flipped(new WbSlave(i.dwidth, i.awidth, i.iname)) })
  })
That compiles correctly and the Verilog is generated correctly, but I can't poke values on its signals like this:
for (wbs <- dut.io.wbs) {
  poke(wbs.ack_o, 0)
}
I get this error at execution time (Verilator backend):
[info] java.util.NoSuchElementException: head of empty list
[info] at scala.collection.immutable.Nil$.head(List.scala:420)
[info] at scala.collection.immutable.Nil$.head(List.scala:417)
[info] at scala.collection.mutable.Stack.top(Stack.scala:132)
[info] at chisel3.internal.naming.NamingStack.pop_return_context(Namer.scala:133)
[info] at chisel3.util.MixedVec.length(MixedVec.scala:81)
[info] at scala.collection.IndexedSeqLike$class.iterator(IndexedSeqLike.scala:90)
[info] at chisel3.util.MixedVec.iterator(MixedVec.scala:81)
[info] at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
[info] at chisel3.util.MixedVec.foreach(MixedVec.scala:81)
[info] at wbplumbing.TestWbInterconDualSlave.<init>(testwbplumbing.scala:61)
Note that this question was already asked on the GitHub project for Chisel version 3.1.6, but it is marked as closed. I am using version 3.1.8 and it still seems to be broken.
[edit]
I upgraded my project to Chisel 3.2.0 and iotesters 1.3.0:
libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.2.0"
libraryDependencies += "edu.berkeley.cs" %% "chisel-iotesters" % "1.3.0"
And I still get an error when I uncomment these lines:
for (wbs <- dut.io.wbs) {
  poke(wbs.ack_o, 0)
}
(if I leave these lines commented, it works)
But the stack trace is different:
[info] - should read and write wishbone value on two slaves *** FAILED ***
[info] chisel3.internal.ChiselException: Error: Not in a RawModule. Likely cause: Missed Module() wrap, bare chisel API call, or attempting to construct hardware inside a BlackBox.
[info] at chisel3.internal.throwException$.apply(Error.scala:42)
[info] at chisel3.internal.Builder$.referenceUserModule(Builder.scala:287)
[info] at chisel3.Data.connect(Data.scala:384)
[info] at chisel3.Data.$colon$eq(Data.scala:475)
[info] at wbplumbing.TestWbInterconDualSlave$$anonfun$15.apply(testwbplumbing.scala:63)
[info] at wbplumbing.TestWbInterconDualSlave$$anonfun$15.apply(testwbplumbing.scala:62)
[info] at scala.collection.Iterator$class.foreach(Iterator.scala:742)
[info] at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
[info] at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
[info] at chisel3.util.MixedVec.foreach(MixedVec.scala:88)
[info] ...
I am not sure what is going on, but I was able to reproduce your error in the current chisel3 release; the same code seems to run properly under the Chisel 3.2 release-candidate snapshot. Is it possible for you to try your code there?
Hopefully it will work better. The problem does not appear to be directly in MixedVec; it must be in underlying code.
I must say that you need to be especially careful when using MixedVec: it is not indexable by a hardware index, so all of its elements must be referenced with constant Scala Ints at elaboration time.
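For example, a minimal sketch (assuming the Chisel 3.2 snapshot where the iteration problem is fixed; ack_o comes from the WbSlave bundle in the question):
// In a tester: iterate at elaboration time with plain Scala values.
// Each element is reached through Scala iteration or a constant Int index.
dut.io.wbs.zipWithIndex.foreach { case (slave, i) =>
  poke(slave.ack_o, 0)
}
poke(dut.io.wbs(0).ack_o, 0) // constant Scala index: also fine

// By contrast, indexing a MixedVec with a hardware UInt is not available,
// because its elements may all have different types and widths:
// val sel = Wire(UInt(2.W))
// io.wbs(sel) // does not compile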

Scalatest runner results in VerifyError: Cannot inherit from final class

I am trying to run all the tests (all of them extend FlatSpec) in a jar using the ScalaTest Runner, but I get a VerifyError. However, I am able to run the tests individually.
The jar was compiled using sbt test:assembly. I am trying to run the tests in another environment where sbt is not available. The tests are available in the jar under the com/tfs/test path, as shown in the output below:
samik#samik-lap:~/git/proj$ jar tf test-2018.2.jar | grep MyTest
com/tfs/test/MyTest$$anonfun$1$$anonfun$apply$mcV$sp$1.class
com/tfs/test/MyTest$$anonfun$4.class
com/tfs/test/MyTest$$anonfun$5.class
com/tfs/test/MyTest$$typecreator4$1.class
com/tfs/test/MyTest$$typecreator5$1.class
com/tfs/test/MyTest$$typecreator9$1.class
com/tfs/test/MyTest.class
The following command runs the specific test just fine:
samik#samik-lap:~/git/proj$ scala -J-Xmx2g -cp "scalatest_2.11-3.0.5.jar:scalactic_2.11-3.0.5.jar:test-2018.2.jar" org.scalatest.run com.tfs.test.MyTest
0 [ScalaTest-main] INFO com.tfs.test.MyTest - Starting MyTest test application
Start Time: 0 sec
...
However, when I use Runner to run all the tests (there are ~7 similar tests in the jar available at the same path), it doesn't work.
samik#samik-lap:~/git/proj$ scala -J-Xmx2g -cp "scalatest_2.11-3.0.5.jar:scalactic_2.11-3.0.5.jar" org.scalatest.tools.Runner -o -R test-2018.2.jar
Discovery starting.
*** RUN ABORTED ***
java.lang.VerifyError: Cannot inherit from final class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
How do I get all the tests to run through Runner? How can I find out what is happening inside and where the issue is? Thanks for any pointers.

sbt file doesn't recognize the spark input

I am trying to execute Scala code with Spark. The example code and build.sbt file can be found here.
There is one difference from this example: I am already using Spark 2.0.0 (I have downloaded this version locally and defined its path in my .bashrc file), and I have modified my build.sbt file accordingly, setting the version to 2.0.0.
After that, I get error messages.
Case 1:
I executed the code of SparkMeApp exactly as given in the link. I got an error message saying that I have to set the master URL via setMaster:
16/09/05 19:37:01 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
Case 2:
I called setMaster with different arguments. I got the following error messages:
Input: setMaster("spark://<username>:7077") or setMaster("local[2]")
Error:
[error] (run-main-0) java.lang.ArrayIndexOutOfBoundsException: 0
java.lang.ArrayIndexOutOfBoundsException: 0
(this error means that the args array is empty, so args(0) fails)
In other cases, just this error: 16/09/05 19:44:29 WARN
StandaloneAppClient$ClientEndpoint: Failed to connect to master <...>
org.apache.spark.SparkException: Exception thrown in awaitResult
Additionally, I have only a little experience with Scala and sbt, so my sbt is probably configured incorrectly... Can somebody please tell me the right way?
This is how your minimal build.sbt will look:
name := "SparkMe Project"
version := "1.0"
scalaVersion := "2.11.7"
organization := "pl.japila"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
And here is your SparkMeApp object:
import org.apache.spark.{SparkConf, SparkContext}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("SparkMe Application")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}
Execute it like:
$ sbt "run [your file path]"
@Abhi, thank you very much for your answer. In general it works, but I still get some error messages after the code executes correctly.
I have created a test txt file with 4 lines:
test file
test file
test file
test file
In SparkMeApp I have changed the code line to:
val fileName = "/home/usr/test.txt"
After I execute run SparkMeApp.scala I get the following output:
16/09/06 09:15:34 INFO DAGScheduler: Job 0 finished: count at SparkMeApp.scala:11, took 0.348171 s
There are 4 lines in /home/usr/test.txt
16/09/06 09:15:34 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:175)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1229)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:172)
at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
16/09/06 09:15:34 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
at java.util.concurrent.Semaphore.acquire(Semaphore.java:312)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:67)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:66)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:66)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:65)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1229)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:64)
16/09/06 09:15:34 INFO SparkUI: Stopped Spark web UI at http://<myip>:4040
[success] Total time: 7 s, completed Sep 6, 2016 9:15:34 AM
I can see the correct output of my code (the second line), but afterwards I get the interrupt errors. How can I fix this? In any case, I hope the code itself is working correctly.
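Those InterruptedException traces are a known artifact of running Spark inside sbt's own JVM: when run finishes, sbt interrupts Spark's still-running daemon threads (the ContextCleaner and the listener bus). A minimal sketch of the usual mitigation, based on the SparkMeApp above, is to stop the SparkContext explicitly before main returns (setting fork := true in build.sbt, so the app gets its own JVM, also tends to help):
val sc = new SparkContext(conf)
try {
  val fileName = args(0)
  val lines = sc.textFile(fileName).cache
  println(s"There are ${lines.count} lines in $fileName")
} finally {
  sc.stop() // shut down the listener bus and ContextCleaner threads cleanly
}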

java.lang.ClassNotFoundException: Class scala.runtime.Nothing when running the scoobi WordCount example

I'm trying to run the word count example from the Quick start page:
import com.nicta.scoobi.Scoobi._
import Reduction._
object WordCount extends ScoobiApp {
  def run() {
    val lines = fromTextFile(args(0))
    val counts = lines.mapFlatten(_.split(" "))
      .map(word => (word, 1))
      .groupByKey
      .combine(Sum.int)
    counts.toTextFile(args(1)).persist
  }
}
It works fine when I use in-memory mode, but when I try local mode (or cluster mode) it fails with these errors:
[WARN] LocalJobRunner - job_local_0001 <java.lang.RuntimeException: java.lang.ClassNotFoundException: Class scala.runtime.Nothing$ not found>java.lang.RuntimeException: java.lang.ClassNotFoundException: Class scala.runtime.Nothing$ not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1439)
at com.nicta.scoobi.impl.mapreducer.ChannelOutputFormat.com$nicta$scoobi$impl$mapreducer$ChannelOutputFormat$$mkTaskContext$1(ChannelOutputFormat.scala:63)
at com.nicta.scoobi.impl.mapreducer.ChannelOutputFormat$$anonfun$getContext$1.apply(ChannelOutputFormat.scala:75)
at com.nicta.scoobi.impl.mapreducer.ChannelOutputFormat$$anonfun$getContext$1.apply(ChannelOutputFormat.scala:75)
at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189)
at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91)
at com.nicta.scoobi.impl.mapreducer.ChannelOutputFormat.getContext(ChannelOutputFormat.scala:75)
at com.nicta.scoobi.impl.mapreducer.ChannelOutputFormat.write(ChannelOutputFormat.scala:43)
at com.nicta.scoobi.impl.plan.mscr.MscrOutputChannel$$anon$5$$anonfun$write$1.apply(OutputChannel.scala:137)
at com.nicta.scoobi.impl.plan.mscr.MscrOutputChannel$$anon$5$$anonfun$write$1.apply(OutputChannel.scala:135)
at scala.collection.immutable.List.foreach(List.scala:318)
at com.nicta.scoobi.impl.plan.mscr.MscrOutputChannel$$anon$5.write(OutputChannel.scala:135)
at com.nicta.scoobi.impl.plan.mscr.GbkOutputChannel$$anonfun$reduce$1.apply$mcV$sp(OutputChannel.scala:201)
at com.nicta.scoobi.impl.plan.mscr.GbkOutputChannel$$anonfun$reduce$1.apply(OutputChannel.scala:201)
at com.nicta.scoobi.impl.plan.mscr.GbkOutputChannel$$anonfun$reduce$1.apply(OutputChannel.scala:201)
at scala.Option.getOrElse(Option.scala:120)
at com.nicta.scoobi.impl.plan.mscr.GbkOutputChannel.reduce(OutputChannel.scala:200)
at com.nicta.scoobi.impl.mapreducer.MscrReducer$$anonfun$reduce$1.apply(MscrReducer.scala:55)
at com.nicta.scoobi.impl.mapreducer.MscrReducer$$anonfun$reduce$1.apply(MscrReducer.scala:52)
at scala.Option.foreach(Option.scala:236)
at com.nicta.scoobi.impl.mapreducer.MscrReducer.reduce(MscrReducer.scala:52)
at com.nicta.scoobi.impl.mapreducer.MscrReducer.reduce(MscrReducer.scala:33)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:164)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:572)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:414)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:309)
Caused by: java.lang.ClassNotFoundException: Class scala.runtime.Nothing$ not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1350)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1437)
... 25 more
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/3757970833182018747_-1642337927_156373685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/dist-objs/scoobi.combiners-step1 of 1
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/1307074498433974065_910223079_156373685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/dist-objs/scoobi.mappers-step1 of 1
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/-624792843022440048_-470268278_156372685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/dist-objs/scoobi.metadata.TG23
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/-7527273518266336656_-470264434_156372685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/dist-objs/scoobi.metadata.TK23
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/-7162952586058180219_-470259629_156372685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/dist-objs/scoobi.metadata.TP23
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/-1228551315878554095_-470253863_156372685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/dist-objs/scoobi.metadata.TV23
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/6598684265640022340_1943382592_156373685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/dist-objs/scoobi.reducers-step1 of 1
[INFO] TrackerDistributedCacheManager - Deleted path /tmp/scoobi-root/WordCount$-1124-035523--1298047809/step1 of 1/archive/1699308645513763631_1905624154_156371685/file/tmp/scoobi-root/WordCount$-1124-035523--1298047809/env/a88809af-334b-499e-bafc-1a2ebeffdfbd
[INFO] MapReduceJob - Map 100% Reduce 0%
[error] (run-main) com.nicta.scoobi.impl.exec.JobExecException: MapReduce job 'job_local_0001' failed! Please see http://localhost:8080/ for more info.
com.nicta.scoobi.impl.exec.JobExecException: MapReduce job 'job_local_0001' failed! Please see http://localhost:8080/ for more info.
at com.nicta.scoobi.impl.exec.MapReduceJob.report(MapReduceJob.scala:80)
at com.nicta.scoobi.impl.exec.HadoopMode$Execution$$anonfun$reportMscr$1.apply(HadoopMode.scala:157)
at com.nicta.scoobi.impl.exec.HadoopMode$Execution$$anonfun$reportMscr$1.apply(HadoopMode.scala:154)
at scala.Function2$$anonfun$tupled$1.apply(Function2.scala:54)
at scala.Function2$$anonfun$tupled$1.apply(Function2.scala:53)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at com.nicta.scoobi.impl.exec.HadoopMode$Execution.runMscrs(HadoopMode.scala:133)
at com.nicta.scoobi.impl.exec.HadoopMode$Execution.execute(HadoopMode.scala:115)
at com.nicta.scoobi.impl.exec.HadoopMode$$anonfun$executeLayer$1.apply(HadoopMode.scala:105)
at com.nicta.scoobi.impl.exec.HadoopMode$$anonfun$executeLayer$1.apply(HadoopMode.scala:104)
at org.kiama.attribution.AttributionCore$CachedAttribute.apply(AttributionCore.scala:61)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at com.nicta.scoobi.impl.exec.HadoopMode.com$nicta$scoobi$impl$exec$HadoopMode$$executeLayers$1(HadoopMode.scala:68)
at com.nicta.scoobi.impl.exec.HadoopMode$$anonfun$executeNode$1.apply(HadoopMode.scala:91)
at com.nicta.scoobi.impl.exec.HadoopMode$$anonfun$executeNode$1.apply(HadoopMode.scala:84)
at org.kiama.attribution.AttributionCore$CachedAttribute.apply(AttributionCore.scala:61)
at scalaz.syntax.IdOps$class.$bar$greater(IdOps.scala:15)
at scalaz.syntax.ToIdOps$$anon$1.$bar$greater(IdOps.scala:78)
at com.nicta.scoobi.impl.exec.HadoopMode.execute(HadoopMode.scala:52)
at com.nicta.scoobi.impl.exec.HadoopMode.execute(HadoopMode.scala:48)
at com.nicta.scoobi.impl.Persister.persist(Persister.scala:44)
at com.nicta.scoobi.impl.ScoobiConfigurationImpl.persist(ScoobiConfigurationImpl.scala:355)
at com.nicta.scoobi.application.Persist$class.persist(Persist.scala:33)
at p.WordCount$.persist(scoobi-test.scala:6)
at com.nicta.scoobi.application.Persist$PersistableList.persist(Persist.scala:151)
at p.WordCount$.run(scoobi-test.scala:14)
at com.nicta.scoobi.application.ScoobiApp$$anonfun$main$1.apply$mcV$sp(ScoobiApp.scala:80)
at com.nicta.scoobi.application.ScoobiApp$$anonfun$main$1.apply(ScoobiApp.scala:75)
at com.nicta.scoobi.application.ScoobiApp$$anonfun$main$1.apply(ScoobiApp.scala:75)
at com.nicta.scoobi.application.LocalHadoop$class.runOnLocal(LocalHadoop.scala:41)
at p.WordCount$.runOnLocal(scoobi-test.scala:6)
at com.nicta.scoobi.application.LocalHadoop$class.executeOnLocal(LocalHadoop.scala:35)
at p.WordCount$.executeOnLocal(scoobi-test.scala:6)
at com.nicta.scoobi.application.LocalHadoop$$anonfun$onLocal$1.apply(LocalHadoop.scala:29)
at com.nicta.scoobi.application.InMemoryHadoop$class.withTimer(InMemory.scala:71)
at p.WordCount$.withTimer(scoobi-test.scala:6)
at com.nicta.scoobi.application.InMemoryHadoop$class.showTime(InMemory.scala:79)
at p.WordCount$.showTime(scoobi-test.scala:6)
at com.nicta.scoobi.application.LocalHadoop$class.onLocal(LocalHadoop.scala:29)
at p.WordCount$.onLocal(scoobi-test.scala:6)
at com.nicta.scoobi.application.Hadoop$class.onHadoop(Hadoop.scala:60)
at p.WordCount$.onHadoop(scoobi-test.scala:6)
at com.nicta.scoobi.application.ScoobiApp$class.main(ScoobiApp.scala:75)
at p.WordCount$.main(scoobi-test.scala:6)
at p.WordCount.main(scoobi-test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
[trace] Stack trace suppressed: run last compile:runMain for the full output.
[INFO] Task - Communication exception: java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at org.apache.hadoop.mapred.Task$TaskReporter.run(Task.java:648)
at java.lang.Thread.run(Thread.java:679)
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:runMain for the full output.
[error] (compile:runMain) Nonzero exit code: 1
[error] Total time: 9 s, completed Nov 24, 2013 3:55:30 AM
Running the very similar example from GitHub (https://github.com/NICTA/scoobi/tree/SCOOBI-0.7.3/examples/wordCount) does work.
Any ideas?
EDIT
I ran the sample according to the explanations in the Scoobi quick start.
Running the sample is done using these sbt commands:
sbt compile
sbt "run-main mypackage.myapp.WordCount input-files output"
There is no mention of how or where to supply parameters such as the location of external jars.
The installed Scala version and the one specified in build.sbt must be the same. Once they match, scala-library is automatically included on the classpath.
We've faced the same issue recently and solved it by adding fork := true to the sbt settings (see the sketch below).
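For reference, a minimal sketch of the relevant sbt settings (the Scala version shown is hypothetical and must match whatever scala -version reports on your machine):
// build.sbt
scalaVersion := "2.10.3" // hypothetical: must match the installed Scala version
fork := true // run the app in a separate JVM instead of inside sbt's own
With fork := true, sbt run launches a fresh JVM with an ordinary classpath, which avoids sbt's in-process classloader arrangement that can hide scala-library classes such as scala.runtime.Nothing$ from Hadoop's Configuration.getClass.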