Scala Play 2 Framework: PrivilegedActionException: null - scala

I'm studying the Play 2 framework with Slick. The code is a simple query to the database via Slick, and I get an exception I don't understand.
my controller:
class IndexController @Inject()(taskRepo: TaskRepo) extends Controller {
  def index = Action.async { implicit rs =>
    taskRepo.all().map(tasks => Ok(views.html.index(tasks)))
  }
}
and exception:
[info] ! @6pp163f7m - Internal server error, for (GET) [/] ->
[info]
[info] play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[PrivilegedActionException: null]]
[info] at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:269)
[info] at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:195)
[info] at play.core.server.Server$class.logExceptionAndGetResult$1(Server.scala:45)
[info] at play.core.server.Server$class.getHandlerFor(Server.scala:65)
[info] at play.core.server.NettyServer.getHandlerFor(NettyServer.scala:45)
[info] at play.core.server.netty.PlayRequestHandler.handle(PlayRequestHandler.scala:81)
[info] at play.core.server.netty.PlayRequestHandler.channelRead(PlayRequestHandler.scala:162)
[info] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:307)
[info] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:293)
[info] at com.typesafe.netty.http.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:129)
[info] Caused by: java.security.PrivilegedActionException: null
[info] at java.security.AccessController.doPrivileged(Native Method)
[info] at play.runsupport.Reloader$.play$runsupport$Reloader$$withReloaderContextClassLoader(Reloader.scala:39)
[info] at play.runsupport.Reloader.reload(Reloader.scala:336)
[info] at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1.apply(DevServerStart.scala:118)
[info] at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1.apply(DevServerStart.scala:116)
[info] at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
[info] at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
[info] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1402)
[info] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
[info] at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
[info] Caused by: java.util.concurrent.TimeoutException: Futures timed out after [300000 milliseconds]
[info] at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
[info] at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
[info] at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:190)
[info] at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
[info] at scala.concurrent.Await$.result(package.scala:190)
[info] at play.forkrun.ForkRun$$anonfun$askForReload$1.apply(ForkRun.scala:128)
[info] at play.forkrun.ForkRun$$anonfun$askForReload$1.apply(ForkRun.scala:126)
[info] at play.runsupport.Reloader$$anonfun$reload$1.apply(Reloader.scala:338)
[info] at play.runsupport.Reloader$$anon$3.run(Reloader.scala:43)
[info] at java.security.AccessController.doPrivileged(Native Method)
What am I doing wrong?

The problem was the "Futures timed out after [300000 milliseconds]" error.
In build.sbt, change fork in run := true to fork in run := false
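For reference, a sketch of the relevant build.sbt setting (sbt 0.13-era syntax, matching the play.forkrun.ForkRun frames in the stack trace above):

```scala
// build.sbt
// Run the dev server in sbt's own JVM instead of a forked one; the
// forked-run reload round-trip (ForkRun.askForReload in the trace)
// is what was timing out after 300000 ms.
fork in run := false
```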

Related

Spark 3 KryoSerializer issue - Unable to find class: org.apache.spark.util.collection.OpenHashMap

I am upgrading a Spark 2.4 project to Spark 3.x. We are hitting a snag with some existing Spark ML code:
var stringIndexers = Array[StringIndexer]()
for (featureColumn <- FEATURE_COLS) {
  stringIndexers = stringIndexers :+ new StringIndexer().setInputCol(featureColumn).setOutputCol(featureColumn + "_index")
}
val pipeline = new Pipeline().setStages(stringIndexers)
val dfWithNumericalFeatures = pipeline.fit(decoratedDf).transform(decoratedDf)
Specifically, this line: val dfWithNumericalFeatures = pipeline.fit(decoratedDf).transform(decoratedDf) now results in this cryptic exception in Spark 3:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 238.0 failed 1 times, most recent failure: Lost task 0.0 in stage 238.0 (TID 5589) (executor driver): com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.spark.util.collection.OpenHashMap$mcJ$sp$$Lambda$13346/2134122295
[info] Serialization trace:
[info] org$apache$spark$util$collection$OpenHashMap$$grow (org.apache.spark.util.collection.OpenHashMap$mcJ$sp)
[info] at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
[info] at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
[info] at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
[info] at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
[info] at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
[info] at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
[info] at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:396)
[info] at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:307)
[info] at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
[info] at org.apache.spark.serializer.KryoSerializerInstance.deserialize(KryoSerializer.scala:397)
[info] at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection.apply(Unknown Source)
[info] at org.apache.spark.sql.execution.aggregate.ComplexTypedAggregateExpression.deserialize(TypedAggregateExpression.scala:271)
[info] at org.apache.spark.sql.catalyst.expressions.aggregate.TypedImperativeAggregate.merge(interfaces.scala:568)
[info] at org.apache.spark.sql.execution.aggregate.AggregationIterator$$anonfun$1.$anonfun$applyOrElse$3(AggregationIterator.scala:199)
[info] at org.apache.spark.sql.execution.aggregate.AggregationIterator$$anonfun$1.$anonfun$applyOrElse$3$adapted(AggregationIterator.scala:199)
[info] at org.apache.spark.sql.execution.aggregate.AggregationIterator.$anonfun$generateProcessRow$7(AggregationIterator.scala:213)
[info] at org.apache.spark.sql.execution.aggregate.AggregationIterator.$anonfun$generateProcessRow$7$adapted(AggregationIterator.scala:207)
[info] at org.apache.spark.sql.execution.aggregate.ObjectAggregationIterator.processInputs(ObjectAggregationIterator.scala:151)
[info] at org.apache.spark.sql.execution.aggregate.ObjectAggregationIterator.<init>(ObjectAggregationIterator.scala:77)
[info] at org.apache.spark.sql.execution.aggregate.ObjectHashAggregateExec.$anonfun$doExecute$2(ObjectHashAggregateExec.scala:107)
[info] at org.apache.spark.sql.execution.aggregate.ObjectHashAggregateExec.$anonfun$doExecute$2$adapted(ObjectHashAggregateExec.scala:85)
[info] at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2(RDD.scala:885)
[info] at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2$adapted(RDD.scala:885)
[info] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[info] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
[info] at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
[info] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[info] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
[info] at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
[info] at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info] at org.apache.spark.scheduler.Task.run(Task.scala:131)
[info] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
[info] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
[info] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info] at java.lang.Thread.run(Thread.java:750)
[info] Caused by: java.lang.ClassNotFoundException: org.apache.spark.util.collection.OpenHashMap$mcJ$sp$$Lambda$13346/2134122295
[info] at java.lang.Class.forName0(Native Method)
[info] at java.lang.Class.forName(Class.java:348)
[info] at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
[info] ... 36 more
I have searched around, and the only relevant issue I've found is this unanswered SO question with the same problem: Spark Kryo Serialization issue.
OpenHashMap is not used in my code, so it seems likely that there is a bug in the KryoSerializer during this Pipeline.fit() call. Any ideas how to get around this? Thanks!
EDIT: I also just attempted removing usage of the KryoSerializer during my unit tests:
spark = SparkSession
  .builder
  .master("local[*]")
  .appName("UnitTest")
  .config("spark.serializer", "org.apache.spark.serializer.JavaSerializer")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()
Confirmed that I am using the JavaSerializer:
println(spark.conf.get("spark.serializer")) outputs org.apache.spark.serializer.JavaSerializer. However, the same issue persists even when not using the KryoSerializer.

Writing a simple benchmark to test Scala Twirl templates is giving a runtime error

I'm trying the Scala microbenchmark plugin sbt-jmh, and I am getting an error.
package play.twirl.benchmarks

import play.twirl.parser._
import play.twirl.parser.TreeNodes._
import org.openjdk.jmh.annotations.Benchmark

class TwirlBenchmark {
  @Benchmark
  def simpleParse(): Template = {
    val parser = new TwirlParser(false)
    val template = "<h1>hello</h1>@someVar"
    parser.parse(template) match {
      case parser.Success(tmpl, input) =>
        if (!input.atEnd) sys.error("Template parsed but not at source end")
        tmpl
      case parser.Error(_, _, errors) =>
        sys.error("Template failed to parse: " + errors.head.str)
    }
  }
}
It compiles fine, but when running the benchmark:
jmh:run
I get these errors:
[info] # Warmup Iteration 1: <failure>
[info] java.lang.NoClassDefFoundError: scala/util/parsing/input/Position
[info] at play.twirl.benchmarks.TwirlBenchmark.simpleParse(TwirlBenchmarks.scala:23)
[info] at play.twirl.benchmarks.generated.TwirlBenchmark_simpleParse_jmhTest.simpleParse_thrpt_jmhStub(TwirlBenchmark_simpleParse_jmhTest.java:119)
[info] at play.twirl.benchmarks.generated.TwirlBenchmark_simpleParse_jmhTest.simpleParse_Throughput(TwirlBenchmark_simpleParse_jmhTest.java:83)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info] at java.lang.reflect.Method.invoke(Method.java:498)
[info] at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:453)
[info] at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:437)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:745)
[info] Caused by: java.lang.ClassNotFoundException: scala.util.parsing.input.Position
[info] at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
[info] at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
[info] ... 15 more
[info] # Run complete. Total time: 00:00:02
Not sure how to proceed. Help?
You may be missing org.scala-lang.modules:scala-parser-combinators: https://mvnrepository.com/artifact/org.scala-lang.modules/scala-parser-combinators
Be sure to include the appropriate version for your Scala and Play versions.
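A minimal sketch of how the dependency might be added in build.sbt (the version shown here is an assumption; check the mvnrepository link above for the artifact matching your Scala and Play versions):

```scala
// build.sbt -- illustrative version only
libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.2"
```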

Mockito's mock throw ClassNotFoundException in Spark application

I found that a Mockito mock object throws ClassNotFoundException when used in Spark. Here is a minimal example:
import org.apache.spark.{SparkConf, SparkContext}
import org.mockito.{Matchers, Mockito}
import org.scalatest.FlatSpec
import org.scalatest.mockito.MockitoSugar

trait MyTrait {
  def myMethod(a: Int): Int
}

class MyTraitTest extends FlatSpec with MockitoSugar {
  "Mock" should "work in Spark" in {
    val m = mock[MyTrait](Mockito.withSettings().serializable())
    Mockito.when(m.myMethod(Matchers.any())).thenReturn(1)
    val conf = new SparkConf().setAppName("testApp").setMaster("local")
    val sc = new SparkContext(conf)
    assert(sc.makeRDD(Seq(1, 2, 3)).map(m.myMethod).first() == 1)
  }
}
which would throw the following exception:
[info] MyTraitTest:
[info] Mock
[info] - should work in Spark *** FAILED ***
[info] org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.lang.ClassNotFoundException: MyTrait$$EnhancerByMockitoWithCGLIB$$6d9e95a8
[info] at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
[info] at java.lang.Class.forName0(Native Method)
[info] at java.lang.Class.forName(Class.java:348)
[info] at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
[info] at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1819)
[info] at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
[info] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1986)
[info] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
[info] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
[info] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
[info] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
[info] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
[info] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
[info] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
[info] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
[info] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
[info] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
[info] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
[info] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
[info] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
[info] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
[info] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
[info] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
[info] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
[info] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
[info] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
[info] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
[info] at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
[info] at org.apache.spark.scheduler.Task.run(Task.scala:99)
[info] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:745)
The stack trace hints that this is related to dynamic class loading, but I don't know how to fix it.
Update:
Apparently, changing
val m = mock[MyTrait](Mockito.withSettings().serializable())
to
val m = mock[MyTrait](Mockito.withSettings().serializable(SerializableMode.ACROSS_CLASSLOADERS))
(with SerializableMode imported from org.mockito.mock.SerializableMode) makes the exception disappear. However, I don't follow why this fix is necessary. I thought that in Spark local mode a single JVM hosts both the driver and the executor. So is a different ClassLoader used to load the deserialized class on the executor?

Scala object throwing build/training error

I need some help understanding errors that are being generated by the Scala class RandomForestAlgorithm.scala (https://github.com/PredictionIO/PredictionIO/blob/develop/examples/scala-parallel-classification/custom-attributes/src/main/scala/RandomForestAlgorithm.scala).
I am building the project as is (custom-attributes for classification template) in PredictionIO and am getting a pio build error:
hduser#hduser-VirtualBox:~/PredictionIO/classTest$ pio build --verbose
[INFO] [Console$] Using existing engine manifest JSON at /home/hduser/PredictionIO/classTest/manifest.json
[INFO] [Console$] Using command '/home/hduser/PredictionIO/sbt/sbt' at the current working directory to build.
[INFO] [Console$] If the path above is incorrect, this process will fail.
[INFO] [Console$] Uber JAR disabled. Making sure lib/pio-assembly-0.9.5.jar is absent.
[INFO] [Console$] Going to run: /home/hduser/PredictionIO/sbt/sbt package assemblyPackageDependency
[INFO] [Console$] [info] Loading project definition from /home/hduser/PredictionIO/classTest/project
[INFO] [Console$] [info] Set current project to template-scala-parallel-classification (in build file:/home/hduser/PredictionIO/classTest/)
[INFO] [Console$] [info] Compiling 1 Scala source to /home/hduser/PredictionIO/classTest/target/scala-2.10/classes...
[INFO] [Console$] [error] /home/hduser/PredictionIO/classTest/src/main/scala/RandomForestAlgorithm.scala:28: class RandomForestAlgorithm needs to be abstract, since method train in class P2LAlgorithm of type (sc: org.apache.spark.SparkContext, pd: com.test1.PreparedData)com.test1.PIORandomForestModel is not defined
[INFO] [Console$] [error] class RandomForestAlgorithm(val ap: RandomForestAlgorithmParams) // CHANGED
[INFO] [Console$] [error] ^
[INFO] [Console$] [error] one error found
[INFO] [Console$] [error] (compile:compile) Compilation failed
[INFO] [Console$] [error] Total time: 6 s, completed Jun 8, 2016 4:37:36 PM
[ERROR] [Console$] Return code of previous step is 1. Aborting.
So when I address the line causing the error by making the class abstract:
// extends P2LAlgorithm because MLlib's RandomForestModel doesn't
// contain an RDD.
abstract class RandomForestAlgorithm(val ap: RandomForestAlgorithmParams) // CHANGED
  extends P2LAlgorithm[PreparedData, PIORandomForestModel, // CHANGED
    Query, PredictedResult] {

  def train(data: PreparedData): PIORandomForestModel = { // CHANGED
    // CHANGED
    // Empty categoricalFeaturesInfo indicates all features are continuous.
    val categoricalFeaturesInfo = Map[Int, Int]()
    val m = RandomForest.trainClassifier(
      data.labeledPoints,
      ap.numClasses,
      categoricalFeaturesInfo,
      ap.numTrees,
      ap.featureSubsetStrategy,
      ap.impurity,
      ap.maxDepth,
      ap.maxBins)
    new PIORandomForestModel(
      gendersMap = data.gendersMap,
      educationMap = data.educationMap,
      randomForestModel = m
    )
  }
}
pio build then succeeds, but training fails because the now-abstract class cannot be instantiated:
[INFO] [Engine] Extracting datasource params...
[INFO] [WorkflowUtils$] No 'name' is found. Default empty String will be used.
[INFO] [Engine] Datasource params: (,DataSourceParams(6))
[INFO] [Engine] Extracting preparator params...
[INFO] [Engine] Preparator params: (,Empty)
[INFO] [Engine] Extracting serving params...
[INFO] [Engine] Serving params: (,Empty)
[WARN] [Utils] Your hostname, hduser-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
[WARN] [Utils] Set SPARK_LOCAL_IP if you need to bind to another address
[INFO] [Remoting] Starting remoting
[INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriver#10.0.2.15:59444]
[WARN] [MetricsSystem] Using default name DAGScheduler for source because spark.app.id is not set.
Exception in thread "main" java.lang.InstantiationException
at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at io.prediction.core.Doer$.apply(AbstractDoer.scala:52)
at io.prediction.controller.Engine$$anonfun$1.apply(Engine.scala:171)
at io.prediction.controller.Engine$$anonfun$1.apply(Engine.scala:170)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at io.prediction.controller.Engine.train(Engine.scala:170)
at io.prediction.workflow.CoreWorkflow$.runTrain(CoreWorkflow.scala:65)
at io.prediction.workflow.CreateWorkflow$.main(CreateWorkflow.scala:247)
at io.prediction.workflow.CreateWorkflow.main(CreateWorkflow.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
So two questions:
1. Why is the following model not considered defined during the build:
class PIORandomForestModel(
  val gendersMap: Map[String, Double],
  val educationMap: Map[String, Double],
  val randomForestModel: RandomForestModel
) extends Serializable
2. How can I define PIORandomForestModel in a way that does not throw a pio build error and lets training assign attributes to the object?
I have posted this question in the PredictionIO Google group but have not gotten a response.
Thanks in advance for your help.
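A hedged sketch of one possible fix, taken directly from the compiler message above: P2LAlgorithm declares train with the signature (sc: SparkContext, pd: PreparedData), so the one-argument train in the template code does not implement it, which is why the compiler demands the class be abstract (and why the abstract class then fails with InstantiationException at training time). Matching the expected two-argument signature, and dropping abstract, should satisfy both pio build and pio train. This is untested; all names are those from the question.

```scala
class RandomForestAlgorithm(val ap: RandomForestAlgorithmParams)
  extends P2LAlgorithm[PreparedData, PIORandomForestModel, Query, PredictedResult] {

  // Accept the SparkContext parameter that P2LAlgorithm.train expects,
  // per the compiler error: (sc: SparkContext, pd: PreparedData).
  def train(sc: SparkContext, data: PreparedData): PIORandomForestModel = {
    val categoricalFeaturesInfo = Map[Int, Int]()
    val m = RandomForest.trainClassifier(
      data.labeledPoints,
      ap.numClasses,
      categoricalFeaturesInfo,
      ap.numTrees,
      ap.featureSubsetStrategy,
      ap.impurity,
      ap.maxDepth,
      ap.maxBins)
    new PIORandomForestModel(
      gendersMap = data.gendersMap,
      educationMap = data.educationMap,
      randomForestModel = m)
  }
}
```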

Testing of click event fails

I am following the official Scala.js tutorial, and in the testing part there is some code like this:
package tutorial.webapp

import utest._
import org.scalajs.jquery.{JQuery, jQuery}

object TutorialTest extends TestSuite {
  // Initialize App
  TutorialApp.setupUI

  def tests = TestSuite {
    'HelloWorld {
      val ps = jQuery("p:contains('Hello World')")
      log("Number of hello-world paragraphs", ps.length)
      assert(ps.length == 1)
    }
    'ButtonClick {
      def messageCount =
        jQuery("p:contains('Bang!')").length
      val button = jQuery("button:contains('Click me!')")
      log("Number of bang paragraphs", messageCount)
      log("Number of button", button.length)
      assert(button.length == 1)
      assert(messageCount == 0)
      for (c <- 1 to 5) {
        log("Number of bang paragraphs", messageCount)
        log("Number of button", button.length)
        button.click()
        assert(messageCount == c)
      }
    }

    def log[A](header: String, msg: A): Unit = {
      println(
        """
          |
          |
          |=========================================
          |%s : %s
          |=========================================
          |
          |
        """.format(header, msg.toString).stripMargin)
    }
  }
}
The event-based tests in the for loop fail in sbt test:
> test
[info] Compiling 1 Scala source to /Users/kaiyin/personal_config_bin_files/workspace/scalajsLearn/target/scala-2.11/test-classes...
=========================================
Number of hello-world paragraphs : 1
=========================================
=========================================
Number of bang paragraphs : 0
=========================================
=========================================
Number of button : 1
=========================================
=========================================
Number of bang paragraphs : 0
=========================================
=========================================
Number of button : 1
=========================================
[info] 1/3 tutorial.webapp.TutorialTest.HelloWorld Success
[info] 2/3 tutorial.webapp.TutorialTest.ButtonClick
[info] utest.AssertionError: assert(messageCount == c)
[info] c: Int = 1
[info] 3/3 tutorial.webapp.TutorialTest Success
[info] -----------------------------------Results-----------------------------------
[info] tutorial.webapp.TutorialTest Success
[info] HelloWorld Success
[info] ButtonClick Failure('utest.AssertionError: assert(messageCount == c)
[info] c: Int = 1')
[info] Failures:
[info] 2/3 tutorial.webapp.TutorialTest.ButtonClick
[info] utest.AssertionError: assert(messageCount == c)
[info] c: Int = 1
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/StackTrace.scala:39:42)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/StackTrace.scala:33:17)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/javalanglib/src/main/scala/java/lang/Throwables.scala:24:50)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/javalanglib/src/main/scala/java/lang/Throwables.scala:12:19)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Errors.scala:20:45)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/asserts/package.scala:19:5)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/asserts/package.scala:38:56)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:15:38)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/asserts/Asserts.scala:74:22)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:15:38)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/collection/IndexedSeqOptimized.scala:33:24)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/js/WrappedArray.scala:20:13)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/asserts/Asserts.scala:72:16)
[info] (file:/Users/kaiyin/personal_config_bin_files/workspace/scalajsLearn/src/test/scala/tutorial/webapp/TutorialTest.scala:32:15)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/scalalib/overrides-2.11/scala/collection/immutable/Range.scala:160:8)
[info] (file:/Users/kaiyin/personal_config_bin_files/workspace/scalajsLearn/src/test/scala/tutorial/webapp/TutorialTest.scala:28:14)
[info] (file:/Users/kaiyin/personal_config_bin_files/workspace/scalajsLearn/src/test/scala/tutorial/webapp/TutorialTest.scala:12:25)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:166:31)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:164:27)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:61:78)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:10:30)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/util/Try.scala:192:16)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:61:48)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:61:48)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/Option.scala:158:5)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:61:93)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:61:93)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/util/Try.scala:237:43)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/util/Try.scala:237:43)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:10:30)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/util/Try.scala:192:16)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/util/Try.scala:237:41)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:235:25)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:235:20)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:15:38)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:32:19)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/ExecutionContext.scala:16:21)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:40:25)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:280:61)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:270:28)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:235:16)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:153:9)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:61:37)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:59:11)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Future.scala:24:20)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Future.scala:23:15)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/ExecutionContext.scala:16:21)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Future.scala:31:29)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:492:114)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:107:14)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:82:100)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:81:54)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/Option.scala:158:5)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:81:47)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:83:24)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:15:38)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:251:31)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:249:16)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:15:38)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:32:19)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/ExecutionContext.scala:16:21)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:40:25)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:280:61)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:270:28)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:249:16)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Promise.scala:153:9)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:82:100)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:81:54)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/Option.scala:158:5)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:81:47)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:88:38)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:59:11)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Future.scala:24:20)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Future.scala:23:15)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/ExecutionContext.scala:16:21)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/impl/Future.scala:31:29)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/concurrent/Future.scala:492:114)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:107:14)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/framework/Model.scala:133:22)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/package.scala:135:6)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/runner/BaseRunner.scala:70:21)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/runner/BaseRunner.scala:95:35)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:30:68)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/runner/Task.scala:48:19)
[info] (file:/Users/haoyi/Dropbox%20(Personal)/Workspace/utest-new/utest/js/../shared/src/main/scala/utest/runner/Task.scala:31:17)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/Slave.scala:104:36)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/Slave.scala:104:36)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:10:30)
[info] (https://raw.githubusercontent.com/scala/scala/v2.11.7/src/library/scala/util/Try.scala:192:16)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/Slave.scala:104:23)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/Slave.scala:36:18)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/Slave.scala:31:7)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.5/library/src/main/scala/scala/scalajs/runtime/AnonFunctions.scala:10:30)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/Slave.scala:69:7)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/Slave.scala:30:28)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/BridgeBase.scala:34:20)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/BridgeBase.scala:19:14)
[info] (https://raw.githubusercontent.com/scala-js/scala-js/v0.6.0/test-interface/src/main/scala/org/scalajs/testinterface/internal/BridgeBase.scala:19:14)
[info] Tests: 3
[info] Passed: 2
[info] Failed: 1
[error] Failed tests:
[error] tutorial.webapp.TutorialTest
[error] (test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 2 s, completed 27 oct. 2015 21:00:35
The entire project is here: https://github.com/kindlychung/scalajstutorial
The click behavior is defined here:
package tutorial.webapp

/**
 * Created by IDEA on 27/10/15.
 */
import scala.scalajs.js.JSApp
import org.scalajs.dom
import dom.document
import scala.scalajs.js.annotation.JSExport
import org.scalajs.jquery.jQuery

object TutorialApp extends JSApp {
  def main(): Unit = {
    setupUI
    setupBehavior
  }

  def appendPar(targetNode: dom.Node, text: String): Unit = {
    val parNode = document.createElement("p")
    val textNode = document.createTextNode(text)
    parNode.appendChild(textNode)
    targetNode.appendChild(parNode)
  }

  def addClickedMessage(msg: String): Unit = {
    jQuery("body").append("<p>%s</p>".format(msg))
  }

  def setupUI: Unit = {
    jQuery("body").append("<p>Hello World!</p>")
    jQuery("body").append("""<button id="click-me-button" type="button">Click me!</button>""")
  }

  def setupBehavior: Unit = {
    jQuery("#click-me-button").click(() => addClickedMessage("Bang!"))
  }
}
Your button indeed does nothing: your test never calls setupBehavior, which is what registers the click handler on the button. With no handler registered, simulating click() fires nothing, no message is appended, and the test correctly reports that the message is missing.
Make sure your test calls setupBehavior (or main(), which calls it) before simulating the click.
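The ordering is the whole issue: a click only does something if a handler was registered first. Here is a minimal plain-JVM sketch of that principle (the Button class is hypothetical, standing in for the DOM button; it is not part of the tutorial code):

```scala
// A toy button that, like a DOM element, only reacts to click()
// if a handler has been attached beforehand.
object ClickOrderDemo {
  final class Button {
    private var handlers = List.empty[() => Unit]
    def onClick(handler: () => Unit): Unit = handlers ::= handler
    def click(): Unit = handlers.foreach(_.apply())
  }

  def main(args: Array[String]): Unit = {
    var messages = List.empty[String]
    val button = new Button

    button.click() // no handler registered yet: nothing happens
    assert(messages.isEmpty)

    // analogous to calling setupBehavior before simulating the click
    button.onClick(() => messages = "Bang!" :: messages)
    button.click()
    assert(messages == List("Bang!"))
    println("handler fired only after registration")
  }
}
```

Your real test should do the same thing: run TutorialApp.setupBehavior (directly, or via main()) in its setup step, and only then call jQuery("#click-me-button").click().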