Error in Liferay while running portlet - liferay-6

I am trying to run my Liferay portlets. All of them run successfully except for two, which print the following stack trace in the console when I click them:
at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.doFilter(InvokerFilterChain.java:108)
at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilter.doFilter(InvokerFilter.java:73)
at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.processDoFilter(InvokerFilterChain.java:206)
... (the three frames above repeat several more times) ...
What could possibly be the problem?

Related

Unit Tests using Spark Session : SparkContext was shut down

We have a big project with multiple test suites, and every test suite has on average three tests.
For our unit tests we use Spark in standalone mode, so there is no YARN resource manager.
Every test suite:
initializes a Spark session:
implicit val spark = SparkSession
  .builder()
  .config(sparkConf)
  .getOrCreate()
extends BeforeAndAfterAll:
class MyTestsSpec extends WordSpec
  with Matchers
  with BeforeAndAfterAll {
  ...
}
and overrides afterAll:
override def afterAll(): Unit = {
  try {
    spark.stop()
  } finally {
    super.afterAll()
  }
}
Our project has a CI job in Jenkins, and the job frequently becomes unstable because tests fail with the following error:
Error message:
Job 9 cancelled because SparkContext was shut down
Stack trace:
org.apache.spark.SparkException: Job 9 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:820)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:818)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1732)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:83)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1651)
at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1921)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1317)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1920)
at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:581)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1954)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2062)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:278)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:2853)
at org.apache.spark.sql.Dataset$$anonfun$collect$1.apply(Dataset.scala:2390)
at org.apache.spark.sql.Dataset$$anonfun$collect$1.apply(Dataset.scala:2390)
at org.apache.spark.sql.Dataset$$anonfun$55.apply(Dataset.scala:2837)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:65)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:2836)
at org.apache.spark.sql.Dataset.collect(Dataset.scala:2390)
// some business classes
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.WordSpecLike$$anon$1.apply(WordSpecLike.scala:1078)
at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
at org.scalatest.WordSpec.withFixture(WordSpec.scala:1881)
at org.scalatest.WordSpecLike$class.invokeWithFixture$1(WordSpecLike.scala:1075)
at org.scalatest.WordSpecLike$$anonfun$runTest$1.apply(WordSpecLike.scala:1088)
at org.scalatest.WordSpecLike$$anonfun$runTest$1.apply(WordSpecLike.scala:1088)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
at org.scalatest.WordSpecLike$class.runTest(WordSpecLike.scala:1088)
at org.scalatest.WordSpec.runTest(WordSpec.scala:1881)
at org.scalatest.WordSpecLike$$anonfun$runTests$1.apply(WordSpecLike.scala:1147)
at org.scalatest.WordSpecLike$$anonfun$runTests$1.apply(WordSpecLike.scala:1147)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:373)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:410)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
at org.scalatest.WordSpecLike$class.runTests(WordSpecLike.scala:1147)
at org.scalatest.WordSpec.runTests(WordSpec.scala:1881)
at org.scalatest.Suite$class.run(Suite.scala:1147)
at org.scalatest.WordSpec.org$scalatest$WordSpecLike$$super$run(WordSpec.scala:1881)
at org.scalatest.WordSpecLike$$anonfun$run$1.apply(WordSpecLike.scala:1192)
at org.scalatest.WordSpecLike$$anonfun$run$1.apply(WordSpecLike.scala:1192)
at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
at org.scalatest.WordSpecLike$class.run(WordSpecLike.scala:1192)
// some business classes
at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
// some business classes
at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
at sbt.TestRunner.runTest$1(TestFramework.scala:106)
at sbt.TestRunner.run(TestFramework.scala:117)
at sbt.TestFramework$$anon$2$$anonfun$$lessinit$greater$1.$anonfun$apply$1(TestFramework.scala:262)
at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:233)
at sbt.TestFramework$$anon$2$$anonfun$$lessinit$greater$1.apply(TestFramework.scala:262)
at sbt.TestFramework$$anon$2$$anonfun$$lessinit$greater$1.apply(TestFramework.scala:262)
at sbt.TestFunction.apply(TestFramework.scala:271)
at sbt.Tests$.processRunnable$1(Tests.scala:307)
at sbt.Tests$.$anonfun$makeSerial$1(Tests.scala:313)
at sbt.std.Transform$$anon$3.$anonfun$apply$2(System.scala:46)
at sbt.std.Transform$$anon$4.work(System.scala:66)
at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
at sbt.Execute.work(Execute.scala:271)
at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
When we run a failing test suite on its own, it succeeds with no problem.
I had a similar issue.
I had multiple test suites using Spark, and only the first one worked. I was calling spark.close() in all of them; after removing this call from all suites, it worked.
After investigating the SparkSession code, my conclusion was that since only one SparkContext is possible per JVM and the tests run in the same JVM, once you stop it the first time it becomes unusable for the rest of that "JVM session".
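One common way around this (a minimal sketch, assuming ScalaTest and that all suites can share one configuration) is to hold a single SparkSession in an object that every suite reuses, and to stop it nowhere, letting it live until the test JVM exits:
import org.apache.spark.sql.SparkSession
import org.scalatest.{Matchers, WordSpec}
// Hypothetical holder: one lazily created SparkSession for the whole
// test JVM, so no suite ever shuts down the shared SparkContext.
object SharedSparkSession {
  lazy val spark: SparkSession = SparkSession
    .builder()
    .master("local[2]")
    .appName("unit-tests")
    .getOrCreate()
}
class MyTestsSpec extends WordSpec with Matchers {
  implicit val spark: SparkSession = SharedSparkSession.spark
  // No BeforeAndAfterAll / spark.stop() needed here.
}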

Spark Random Forest with SerializationDebugger: Exception in serialization debugger

I am starting to build some models using Spark's MLlib, more specifically the random forest method.
I modified the simple random forest example Scala script, which can be found here. Here is my modified source code:
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.functions.col
import org.apache.spark.ml.feature.{VectorAssembler, StringIndexer}
import org.apache.spark.mllib.regression.LabeledPoint
// define sqlContext using default settings
val sqlContext = new SQLContext(sc)
// load and parse data
val df = sqlContext.read.format("com.databricks.spark.csv").option("header", "true").option("inferSchema", "true").option("delimiter", " ").load("file:///home/user/sample.dat")
// get the variable id string
val source = scala.io.Source.fromFile("//home/user/sample.variableid.txt")
val variableID = try source.getLines mkString "\n" finally source.close()
// assemble into LabeledPoint RDD format
val assembler = new VectorAssembler().setInputCols(variableID.split(' ')).setOutputCol("features")
val transformed = assembler.transform(df)
val featureDF = transformed.select(col("PHENOTYPE").alias("label"), col("features"))
val data = featureDF.map { r =>
  val label = r(0).toString.toDouble
  val features = r.getAs[org.apache.spark.mllib.linalg.Vector]("features")
  LabeledPoint(label, features)
}
// Split the data into training and test sets (30% held out for testing)
val splits = data.randomSplit(Array(0.7, 0.3))
val (trainingData, testData) = (splits(0), splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val numClasses = 2
val categoricalFeaturesInfo = Map[Int, Int]()
val numTrees = 3 // Use more in practice.
val featureSubsetStrategy = "auto" // Let the algorithm choose.
val impurity = "gini"
val maxDepth = 4
val maxBins = 32
val model = RandomForest.trainClassifier(trainingData, numClasses, categoricalFeaturesInfo,
  numTrees, featureSubsetStrategy, impurity, maxDepth, maxBins)
// Evaluate model on test instances and compute test error
val labelAndPreds = testData.map { point =>
  val prediction = model.predict(point.features)
  (point.label, prediction)
}
val testErr = labelAndPreds.filter(r => r._1 != r._2).count.toDouble / testData.count()
println("Test Error = " + testErr)
println("Learned classification forest model:\n" + model.toDebugString)
I invoked the Spark shell using
./bin/spark-shell --master local[2] --packages com.databricks:spark-csv_2.11:1.3.0
and ran the above script, which produced the trained random forest:
res3: org.apache.spark.mllib.tree.model.RandomForestModel =
TreeEnsembleModel classifier with 3 trees
using the trainingData:
res4: org.apache.spark.rdd.RDD[org.apache.spark.mllib.regression.LabeledPoint] = MapPartitionsRDD[13] at randomSplit at <console>:43
In more detail, the trainingData looks like this:
res5: Array[org.apache.spark.mllib.regression.LabeledPoint] = Array((1.0,[0.0,1.0,2.0,2.0]), (1.0,[0.0,2.0,2.0,2.0]), (1.0,[0.0,2.0,1.0,0.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,1.0,1.0]), (1.0,[0.0,2.0,2.0,0.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,2.0,2.0]), (1.0,[0.0,0.0,2.0,2.0]), (1.0,[0.0,2.0,2.0,2.0]), (1.0,[1.0,2.0,2.0,1.0]), (1.0,[1.0,2.0,0.0,0.0]), (1.0,[0.0,2.0,2.0,2.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,2.0,2.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,(4,[1],[2.0])), (1.0,[0.0,2.0,1.0,1.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,2.0,2.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,1.0,1.0]), (1.0,[0.0,2.0,1.0,0.0]), (1.0,[1.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,2.0,2.0]), (1.0,[0.0,2.0,2.0,1.0]), (1.0,[0.0,2.0,1.0,0.0]), (1.0,[1.0,2.0,2.0,2.0]), (1.0,[0....
However, I ran into a problem when evaluating the trained model:
// Evaluate model on test instances and compute test error
val labelAndPreds = testData.map { point =>
  val prediction = model.predict(point.features)
  (point.label, prediction)
}
The errors read as follows:
scala> val labelAndPreds = testData.map{point =>
| val prediction = model.predict(point.features)
| (point.label, prediction)
| }
16/01/25 23:22:39 WARN SerializationDebugger: Exception in serialization debugger
java.io.IOException: Stream Closed
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:272)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:283)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:325)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:177)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:154)
at java.io.BufferedReader.read(BufferedReader.java:175)
at scala.io.BufferedSource$$anonfun$iter$1$$anonfun$apply$mcI$sp$1.apply$mcI$sp(BufferedSource.scala:38)
at scala.io.Codec.wrap(Codec.scala:68)
at scala.io.BufferedSource$$anonfun$iter$1.apply(BufferedSource.scala:38)
at scala.io.BufferedSource$$anonfun$iter$1.apply(BufferedSource.scala:38)
at scala.collection.Iterator$$anon$9.next(Iterator.scala:162)
at scala.collection.Iterator$$anon$17.hasNext(Iterator.scala:511)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
at scala.io.Source.hasNext(Source.scala:226)
at scala.collection.Iterator$class.toString(Iterator.scala:1153)
at scala.io.Source.toString(Source.scala:178)
at java.lang.String.valueOf(String.java:2847)
at java.lang.StringBuilder.append(StringBuilder.java:128)
at scala.StringContext.standardInterpolator(StringContext.scala:122)
at scala.StringContext.s(StringContext.scala:90)
at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visit(SerializationDebugger.scala:111)
at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visitSerializable(SerializationDebugger.scala:205)
at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visit(SerializationDebugger.scala:107)
... (the two frames above repeat many more times) ...
at org.apache.spark.serializer.SerializationDebugger$.find(SerializationDebugger.scala:66)
at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:41)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:84)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2032)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:318)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:317)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
at org.apache.spark.rdd.RDD.map(RDD.scala:317)
at $line42.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
at $line42.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
at $line42.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:55)
at $line42.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
at $line42.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:59)
at $line42.$read$$iwC$$iwC$$iwC.<init>(<console>:61)
at $line42.$read$$iwC$$iwC.<init>(<console>:63)
at $line42.$read$$iwC.<init>(<console>:65)
at $line42.$read.<init>(<console>:67)
at $line42.$read$.<init>(<console>:71)
at $line42.$read$.<clinit>(<console>)
at $line42.$eval$.<init>(<console>:7)
at $line42.$eval$.<clinit>(<console>)
at $line42.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2032)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:318)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:317)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
at org.apache.spark.rdd.RDD.map(RDD.scala:317)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:55)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:59)
at $iwC$$iwC$$iwC.<init>(<console>:61)
at $iwC$$iwC.<init>(<console>:63)
at $iwC.<init>(<console>:65)
at <init>(<console>:67)
at .<init>(<console>:71)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.NotSerializableException: scala.io.BufferedSource
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1183)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
... (the four frames above repeat many more times) ...
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:44)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:84)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
... 61 more
I compared my script with the example script and found that the difference could originate from the trainingData/testData format. In the example script the "libsvm" data format is used, while I created the LabeledPoint trainingData/testData using VectorAssembler.
Could someone point out whether the error I encountered is linked to the data format, or whether there is another source of the bug?
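For what it's worth, the final "Caused by" line points at scala.io.BufferedSource rather than at the data format: in the shell, the top-level source val is captured together with the map closure when Spark tries to serialize it. A minimal sketch of a workaround (assuming the same file as in the question) is to scope the source inside a block, so it never becomes a field of the enclosing REPL object:
// Sketch: the BufferedSource stays local to the block, so the closure
// cleaner never sees it when serializing the later map over testData.
val variableID: String = {
  val source = scala.io.Source.fromFile("/home/user/sample.variableid.txt")
  try source.getLines().mkString("\n") finally source.close()
}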

Scala/Spark Compilation on IntelliJ IDEA 14

I have a simple Scala app in IntelliJ that just prints a line, but whenever I try to compile it I get:
Error:scalac: Error: org.jetbrains.jps.incremental.scala.remote.ServerException
Error compiling sbt component 'compiler-interface-2.10.0-52.0'
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:145)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:142)
at sbt.IO$.withTemporaryDirectory(IO.scala:285)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:142)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:139)
at sbt.IO$.withTemporaryDirectory(IO.scala:285)
at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:139)
at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:33)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$.org$jetbrains$jps$incremental$scala$local$CompilerFactoryImpl$$getOrCompileInterfaceJar(CompilerFactoryImpl.scala:87)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$$anonfun$getScalac$1.apply(CompilerFactoryImpl.scala:44)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$$anonfun$getScalac$1.apply(CompilerFactoryImpl.scala:43)
at scala.Option.map(Option.scala:145)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.getScalac(CompilerFactoryImpl.scala:43)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.createCompiler(CompilerFactoryImpl.scala:22)
at org.jetbrains.jps.incremental.scala.local.CachingFactory$$anonfun$createCompiler$1.apply(CachingFactory.scala:24)
at org.jetbrains.jps.incremental.scala.local.CachingFactory$$anonfun$createCompiler$1.apply(CachingFactory.scala:24)
at org.jetbrains.jps.incremental.scala.local.Cache$$anonfun$getOrUpdate$2.apply(Cache.scala:20)
at scala.Option.getOrElse(Option.scala:120)
at org.jetbrains.jps.incremental.scala.local.Cache.getOrUpdate(Cache.scala:19)
at org.jetbrains.jps.incremental.scala.local.CachingFactory.createCompiler(CachingFactory.scala:23)
at org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:22)
at org.jetbrains.jps.incremental.scala.remote.Main$.make(Main.scala:62)
at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:20)
at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319)
I'm using Scala 2.10, and Spark 1.2.0 is listed as a dependency.
Does anybody have any idea how I can get this to compile? Thanks!
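For reference, a minimal build.sbt matching the versions mentioned might look like the sketch below. The Scala patch version is an assumption; this kind of "Error compiling sbt component 'compiler-interface-...'" failure is often reported with very early 2.10.x releases, so pinning a later patch release is a common first thing to try:
// Minimal build.sbt sketch (the 2.10.4 patch version is an assumption).
scalaVersion := "2.10.4"
// Spark 1.2.0 was published for Scala 2.10, matching the version above.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"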

SORM class cast exception

I'm trying to migrate my Play Framework app from Slick to SORM. I created the basic classes, but cannot get it working.
case class UserAction(url: String, request: Map[String,String], ts: DateTime)
val ts = DateTime.now()
val act = Db.save(UserAction(request.uri, request.headers.toSimpleMap, ts))
But I am getting the following exception:
play.api.Application$$anon$1: Execution exception[[ClassCastException: __wrapper$1$cf01f4410488416a9990d12622889a84.__wrapper$1$cf01f4410488416a9990d12622889a84$PersistedAnonymous31$1 cannot be cast to models.sorm.UserAction]]
at play.api.Application$class.handleError(Application.scala:293) ~[play_2.10.jar:2.2.1]
at play.api.DefaultApplication.handleError(Application.scala:399) [play_2.10.jar:2.2.1]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$2$$anonfun$applyOrElse$3.apply(PlayDefaultUpstreamHandler.scala:261) [play_2.10.jar:2.2.1]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$2$$anonfun$applyOrElse$3.apply(PlayDefaultUpstreamHandler.scala:261) [play_2.10.jar:2.2.1]
at scala.Option.map(Option.scala:145) [scala-library.jar:na]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$2.applyOrElse(PlayDefaultUpstreamHandler.scala:261) [play_2.10.jar:2.2.1]
Caused by: java.lang.ClassCastException: __wrapper$1$cf01f4410488416a9990d12622889a84.__wrapper$1$cf01f4410488416a9990d12622889a84$PersistedAnonymous31$1 cannot be cast to models.sorm.UserAction
at Global$$anonfun$doFilter$1.apply(Global.scala:21) ~[classes/:na]
at Global$$anonfun$doFilter$1.apply(Global.scala:15) ~[classes/:na]
at play.api.mvc.EssentialAction$$anon$2.apply(Action.scala:61) ~[play_2.10.jar:2.2.1]
at play.api.mvc.EssentialAction$$anon$2.apply(Action.scala:60) ~[play_2.10.jar:2.2.1]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$12.apply(PlayDefaultUpstreamHandler.scala:162) ~[play_2.10.jar:2.2.1]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$12.apply(PlayDefaultUpstreamHandler.scala:160) ~[play_2.10.jar:2.2.1]
What am I doing wrong?
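For context, a SORM Db object is typically declared along these lines (a sketch based on SORM's documented API; the in-memory H2 URL is an assumption, not taken from the question):
import sorm._
// Sketch: register the entity with a SORM Instance. Note that Db.save
// returns a dynamically generated "persisted" subclass of the case class,
// which is the PersistedAnonymous... type visible in the exception.
object Db extends Instance(
  entities = Set(Entity[UserAction]()),
  url = "jdbc:h2:mem:test"
)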

Testing Akka actors that mixin Stash with TestActorRef

I'm running into a problem with an actor that extends Stash and works perfectly fine when instantiated with actorOf in a simple ActorSystem. Now I would like to write some tests for my stashing actors before using them in my program, but I cannot figure out a way to use a TestActorRef with this actor in my test suite.
The code that works looks like this:
import akka.actor.{Stash, Actor, ActorSystem, Props}
import com.typesafe.config.ConfigFactory
object StashTest {
  val config = ConfigFactory.parseString(
    """
      |akka.actor.default-mailbox {
      |  mailbox-type = "akka.dispatch.UnboundedDequeBasedMailbox"
      |}
    """.stripMargin)
}
class StashTestActor extends Stash {
  def receive: Actor.Receive = {
    case "unstash" =>
      unstashAll()
      context become print
    case msg => stash()
  }

  def print: Actor.Receive = {
    case msg => println(s"Unstashed message: $msg")
  }
}
val system = ActorSystem("stashSystem", StashTest.config)
val ref = system.actorOf(Props[StashTestActor])
ref ! "stash me"
ref ! "blah"
ref ! "unstash"
Which prints
Unstashed message: stash me
Unstashed message: blah
But if I try to write a WordSpec test for this actor, it leaves me with some nasty exceptions, and I can't figure out what they want me to change in my code.
The test class looks like this:
import akka.testkit.{TestActorRef, TestKit}
import akka.actor.{Stash, Actor, ActorSystem}
import org.scalatest.{WordSpecLike, MustMatchers}
import com.typesafe.config.ConfigFactory
class StashTestActor extends Stash {
  def receive: Actor.Receive = {
    case "unstash" =>
      unstashAll()
      context become print
    case msg => stash()
  }

  def print: Actor.Receive = {
    case msg => println(s"Unstashed message: $msg")
  }
}
class StashTest extends TestKit(ActorSystem("testSystem", StashTest.config))
  with WordSpecLike
  with MustMatchers {

  "A simple stashing actor" must {
    val actorRef = TestActorRef[StashTestActor]

    "stash messages" in {
      actorRef ! "stash me!"
    }

    "unstash all messages" in {
      actorRef ! "unstash"
    }
  }
}

object StashTest {
  val config = ConfigFactory.parseString(
    """
      |akka.actor.default-mailbox {
      |  mailbox-type = "akka.dispatch.UnboundedDequeBasedMailbox"
      |}
    """.stripMargin)
}
When running the test, I get the following exceptions, thrown during the instantiation of the TestActorRef:
[ERROR] [08/20/2013 14:19:40.765] [testSystem-akka.actor.default-dispatcher-3] [akka://testSystem/user/$$a] Could not instantiate Actor
Make sure Actor is NOT defined inside a class/trait,
if so put it outside the class/trait, f.e. in a companion object,
OR try to change: 'actorOf(Props[MyActor]' to 'actorOf(Props(new MyActor)'.
akka.actor.ActorInitializationException: exception during creation
at akka.actor.ActorInitializationException$.apply(Actor.scala:218)
at akka.actor.ActorCell.create(ActorCell.scala:578)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:425)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
at akka.testkit.CallingThreadDispatcher.process$1(CallingThreadDispatcher.scala:244)
at akka.testkit.CallingThreadDispatcher.runQueue(CallingThreadDispatcher.scala:284)
at akka.testkit.CallingThreadDispatcher.register(CallingThreadDispatcher.scala:153)
at akka.dispatch.MessageDispatcher.attach(AbstractDispatcher.scala:133)
at akka.actor.dungeon.Dispatch$class.start(Dispatch.scala:84)
at akka.actor.ActorCell.start(ActorCell.scala:338)
at akka.testkit.TestActorRef.<init>(TestActorRef.scala:50)
at akka.testkit.TestActorRef$.apply(TestActorRef.scala:141)
at akka.testkit.TestActorRef$.apply(TestActorRef.scala:137)
at akka.testkit.TestActorRef$.apply(TestActorRef.scala:146)
at akka.testkit.TestActorRef$.apply(TestActorRef.scala:144)
at stashActorTest.StashTest$$anonfun$1.apply$mcV$sp(StashTestActor.scala:29)
at stashActorTest.StashTest$$anonfun$1.apply(StashTestActor.scala:28)
at stashActorTest.StashTest$$anonfun$1.apply(StashTestActor.scala:28)
at org.scalatest.SuperEngine.registerNestedBranch(Engine.scala:613)
at org.scalatest.WordSpecLike$class.org$scalatest$WordSpecLike$$registerBranch(WordSpecLike.scala:120)
at org.scalatest.WordSpecLike$$anon$2.apply(WordSpecLike.scala:851)
at org.scalatest.words.MustVerb$StringMustWrapperForVerb$class.must(MustVerb.scala:189)
at org.scalatest.matchers.MustMatchers$StringMustWrapper.must(MustMatchers.scala:6167)
at stashActorTest.StashTest.<init>(StashTestActor.scala:28)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:374)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:444)
at sbt.TestRunner.runTest$1(TestFramework.scala:84)
at sbt.TestRunner.run(TestFramework.scala:94)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:224)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:224)
at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:212)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:224)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:224)
at sbt.TestFunction.apply(TestFramework.scala:229)
at sbt.Tests$$anonfun$7.apply(Tests.scala:196)
at sbt.Tests$$anonfun$7.apply(Tests.scala:196)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$4.work(System.scala:64)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:244)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: akka.actor.ActorInitializationException: Could not instantiate Actor
Make sure Actor is NOT defined inside a class/trait,
if so put it outside the class/trait, f.e. in a companion object,
OR try to change: 'actorOf(Props[MyActor]' to 'actorOf(Props(new MyActor)'.
at akka.actor.ActorInitializationException$.apply(Actor.scala:218)
at akka.testkit.TestActorRef$$anonfun$apply$2$$anonfun$apply$1.applyOrElse(TestActorRef.scala:148)
at akka.testkit.TestActorRef$$anonfun$apply$2$$anonfun$apply$1.applyOrElse(TestActorRef.scala:147)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
at scala.util.Try$.apply(Try.scala:161)
at scala.util.Failure.recover(Try.scala:185)
at akka.testkit.TestActorRef$$anonfun$apply$2.apply(TestActorRef.scala:147)
at akka.testkit.TestActorRef$$anonfun$apply$2.apply(TestActorRef.scala:153)
at akka.actor.CreatorFunctionConsumer.produce(Props.scala:369)
at akka.actor.Props.newActor(Props.scala:323)
at akka.actor.ActorCell.newActor(ActorCell.scala:534)
at akka.actor.ActorCell.create(ActorCell.scala:560)
... 58 more
Caused by: java.lang.NullPointerException
at akka.actor.UnrestrictedStash$class.$init$(Stash.scala:82)
at stashActorTest.StashTestActor.<init>(StashTestActor.scala:9)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:161)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
... 64 more
I don't have any problems using TestActorRef with actors that don't extend Stash, so I don't know whether this is a configuration error or something else I'm missing.
TestActorRef can't be used together with Stash: TestActorRef requires a CallingThreadMailbox, while Stash requires DequeBasedMessageQueueSemantics. The documentation should mention this limitation, and the error messages should be improved.
Using the default Akka dispatcher allowed me to use Stash and TestActorRef together:
val myActor = TestActorRef[MyActor](Props(classOf[MyActor]).withDispatcher("akka.actor.default-dispatcher"))
Note that this means your tests will no longer use the CallingThreadDispatcher and will lose the benefits highlighted in the Akka docs.
I was able to test actors with Stash like this:
val actor = TestActorRef(Props(new MyActorWithStash()).withDispatcher("deque"))
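For the "deque" dispatcher name used above to resolve, it has to be defined in the test ActorSystem's configuration. A sketch of what such a block might look like, mirroring the mailbox type used earlier in the question (the exact settings are an assumption):
// Sketch: define a "deque" dispatcher whose mailbox satisfies Stash's
// DequeBasedMessageQueueSemantics requirement, then pass the config to
// the test ActorSystem, e.g. TestKit(ActorSystem("testSystem", dequeConfig)).
val dequeConfig = ConfigFactory.parseString(
  """
    |deque {
    |  mailbox-type = "akka.dispatch.UnboundedDequeBasedMailbox"
    |}
  """.stripMargin)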