Why are there compilation errors when loading Scala build files? - scala

sbt 0.12.4 on Windows.
First, I change into the project directory, example, which is a Scala project managed by sbt. When I run sbt, I get the following errors:
C:\programs\example>sbt
[info] Loading project definition from C:\programs\example\project\project
[info] Updating {file:/C:/programs/example/project/project/}default-2ad7de...
[info] Resolving org.scala-sbt#sbt;0.12.4 ...
[info] Resolving org.scala-sbt#main;0.12.4 ...
[info] Resolving org.scala-sbt#actions;0.12.4 ...
[info] Resolving org.scala-sbt#classpath;0.12.4 ...
[info] Resolving org.scala-sbt#launcher-interface;0.12.4 ...
[info] Resolving org.scala-lang#scala-library;2.9.2 ...
[info] Resolving org.scala-sbt#interface;0.12.4 ...
[info] Resolving org.scala-sbt#io;0.12.4 ...
[info] Resolving org.scala-sbt#control;0.12.4 ...
[info] Resolving org.scala-lang#scala-compiler;2.9.2 ...
[info] Resolving org.scala-sbt#completion;0.12.4 ...
[info] Resolving org.scala-sbt#collections;0.12.4 ...
[info] Resolving jline#jline;1.0 ...
[info] Resolving org.scala-sbt#api;0.12.4 ...
[info] Resolving org.scala-sbt#compiler-integration;0.12.4 ...
[info] Resolving org.scala-sbt#incremental-compiler;0.12.4 ...
[info] Resolving org.scala-sbt#logging;0.12.4 ...
[info] Resolving org.scala-sbt#process;0.12.4 ...
[info] Resolving org.scala-sbt#compile;0.12.4 ...
[info] Resolving org.scala-sbt#persist;0.12.4 ...
[info] Resolving org.scala-tools.sbinary#sbinary_2.9.0;0.4.0 ...
[info] Resolving org.scala-sbt#classfile;0.12.4 ...
[info] Resolving org.scala-sbt#compiler-ivy-integration;0.12.4 ...
[info] Resolving org.scala-sbt#ivy;0.12.4 ...
[info] Resolving org.apache.ivy#ivy;2.3.0-rc1 ...
[info] Resolving com.jcraft#jsch;0.1.46 ...
[info] Resolving commons-httpclient#commons-httpclient;3.1 ...
[info] Resolving commons-logging#commons-logging;1.0.4 ...
[info] Resolving commons-codec#commons-codec;1.2 ...
[info] Resolving org.scala-sbt#run;0.12.4 ...
[info] Resolving org.scala-sbt#task-system;0.12.4 ...
[info] Resolving org.scala-sbt#tasks;0.12.4 ...
[info] Resolving org.scala-sbt#tracking;0.12.4 ...
[info] Resolving org.scala-sbt#cache;0.12.4 ...
[info] Resolving org.scala-sbt#testing;0.12.4 ...
[info] Resolving org.scala-sbt#test-agent;0.12.4 ...
[info] Resolving org.scala-tools.testing#test-interface;0.5 ...
[info] Resolving org.scala-sbt#command;0.12.4 ...
[info] Resolving org.scala-sbt#compiler-interface;0.12.4 ...
[info] Resolving org.scala-sbt#precompiled-2_8_2;0.12.4 ...
[info] Resolving org.scala-sbt#precompiled-2_9_3;0.12.4 ...
[info] Resolving org.scala-sbt#precompiled-2_10_1;0.12.4 ...
[info] Done updating.
[info] Loading project definition from C:\programs\example\project
[info] Updating {file:/C:/programs/example/project/}default-116b7c...
[info] Resolving net.databinder#dispatch-http_2.9.2;0.8.8 ...
[info] Resolving net.databinder#dispatch-core_2.9.2;0.8.8 ...
[info] Resolving org.scala-lang#scala-library;2.9.2 ...
[info] Resolving org.apache.httpcomponents#httpclient;4.1.3 ...
[info] Resolving org.apache.httpcomponents#httpcore;4.1.4 ...
[info] Resolving commons-logging#commons-logging;1.1.1 ...
[info] Resolving commons-codec#commons-codec;1.4 ...
[info] Resolving net.databinder#dispatch-futures_2.9.2;0.8.8 ...
// ... more entries like the above
[info] Done updating.
[info] Compiling 8 Scala sources to C:\programs\example\project\target\scala-2.9.2\sbt-0.12\classes...
[error] C:\programs\example\project\ProgFunBuild.scala:190: object Test is not a value
[error]     (argTask, currentProject, baseDirectory, handoutFiles, submitProjectName, target, projectDetailsMap, compile in Test) map { (args, currentProject, basedir, filesFinder, submitProject, targetDir, detailsMap, _) =>
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:56: object Test is not a value
[error]     (unmanagedSourceDirectories in Test) <<= (scalaSource in Test)(Seq(_)),
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:265: object Test is not a value
[error]     (unmanagedSources in Test) <<= (unmanagedSources in Test, scalaSource in Test, projectDetailsMap, currentProject, gradingTestPackages) map { (sources, srcTestScalaDir, detailsMap, projectName, gradingSrcs) =>
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:265: reassignment to val
[error]     (unmanagedSources in Test) <<= (unmanagedSources in Test, scalaSource in Test, projectDetailsMap, currentProject, gradingTestPackages) map { (sources, srcTestScalaDir, detailsMap, projectName, gradingSrcs) =>
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:288: object Test is not a value
[error]   val setTestPropertiesHook = (test in Test) <<= (test in Test).dependsOn(setTestProperties)
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:288: reassignment to val
[error]   val setTestPropertiesHook = (test in Test) <<= (test in Test).dependsOn(setTestProperties)
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:304: object Test is not a value
[error]         compile in Test,
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:512: object Test is not a value
[error]   val readTestCompileLog = (compile in Test) <<= (compile in Test) mapR handleFailure(compileTestFailed)
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:512: reassignment to val
[error]   val readTestCompileLog = (compile in Test) <<= (compile in Test) mapR handleFailure(compileTestFailed)
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:553: object Test is not a value
[error]     (sourceDirectory in Test) <<= (sourceDirectory in (assignmentProject, Test))
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:553: reassignment to val
[error]     (sourceDirectory in Test) <<= (sourceDirectory in (assignmentProject, Test))
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:561: object Test is not a value
[error]     (unmanagedSources in Test) <<= (unmanagedSources in Test, scalaSource in (assignmentProject, Test), gradingTestPackages in assignmentProject, gradeProjectDetails) map { (sources, testSrcScalaDir, gradingSrcs, project) =>
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:561: reassignment to val
[error]     (unmanagedSources in Test) <<= (unmanagedSources in Test, scalaSource in (assignmentProject, Test), gradingTestPackages in assignmentProject, gradeProjectDetails) map { (sources, testSrcScalaDir, gradingSrcs, project) =>
[error] ^
[error] C:\programs\example\project\ProgFunBuild.scala:570: object Test is not a value
[error]         compile in Test,
[error] ^
[error] 14 errors found
[error] (compile:compile) Compilation failed
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
UPDATE
When I press l at the command prompt, the error log is as follows:
[info] Loading project definition from C:\A\example\project\project
[debug] Running task... Cancelable: false, check cycles: false
[debug]
[debug] Initial source changes:
[debug] removed:Set()
[debug] added: Set()
[debug] modified: Set()
[debug] Removed products: Set()
[debug] Modified external sources: Set()
[debug] Modified binary dependencies: Set()
[debug] Initial directly invalidated sources: Set()
[debug]
[debug] Sources indirectly invalidated by:
[debug] product: Set()
[debug] binary dep: Set()
[debug] external source: Set()
[debug] All initially invalidated sources: Set()
[debug] Copy resource mappings:
[debug]
[info] Loading project definition from C:\A\example\project
[debug] Running task... Cancelable: false, check cycles: false
[debug]
[debug] Initial source changes:
[debug] removed:Set()
[debug] added: Set(C:\A\example\project\Settings.scala, C:\A\example\project\StyleChecker.scala, C:\A\example\project\GradingFeedback.scala, C:\A\example\project\CourseraHttp.scala, C:\A\example\project\ProgFunBuild.scala, C:\A\example\project\ScalaTestRunner.scala, C:\A\example\project\RichJsValue.scala, C:\A\example\project\RecordingLogger.scala)
[debug] modified: Set()
[debug] Removed products: Set()
[debug] Modified external sources: Set()
[debug] Modified binary dependencies: Set()
[debug] Initial directly invalidated sources: Set(C:\A\example\project\Settings.scala, C:\A\example\project\StyleChecker.scala, C:\A\example\project\GradingFeedback.scala, C:\A\example\project\CourseraHttp.scala, C:\A\example\project\ProgFunBuild.scala, C:\A\example\project\ScalaTestRunner.scala, C:\A\example\project\RichJsValue.scala, C:\A\example\project\RecordingLogger.scala)
[debug]
[debug] Sources indirectly invalidated by:
[debug] product: Set()
[debug] binary dep: Set()
[debug] external source: Set()
[debug] All initially invalidated sources: Set(C:\A\example\project\Settings.scala, C:\A\example\project\StyleChecker.scala, C:\A\example\project\GradingFeedback.scala, C:\A\example\project\CourseraHttp.scala, C:\A\example\project\ProgFunBuild.scala, C:\A\example\project\ScalaTestRunner.scala, C:\A\example\project\RichJsValue.scala, C:\A\example\project\RecordingLogger.scala)
[debug] Recompiling all 8 sources: invalidated sources (8) exceeded 50.0% of all sources
[info] Compiling 8 Scala sources to C:\A\example\project\target\scala-2.9.2\sbt-0.12\classes...
[debug] Getting compiler-interface from component compiler for Scala 2.9.2
[debug] Getting compiler-interface from component compiler for Scala 2.9.2
[debug] Running cached compiler b206e9, interfacing (CompilerInterface) with Scala compiler version 2.9.2
[debug] Calling Scala compiler with arguments (CompilerInterface):
[debug] -deprecation
[debug] -d
[debug] C:\A\example\project\target\scala-2.9.2\sbt-0.12\classes
[debug] -bootclasspath
[debug] C:\Program Files\Java\jre6\lib\resources.jar;C:\Program Files\Ja
va\jre6\lib\rt.jar;C:\Program Files\Java\jre6\lib\sunrsasign.jar;C:\Program File
s\Java\jre6\lib\jsse.jar;C:\Program Files\Java\jre6\lib\jce.jar;C:\Program Files
\Java\jre6\lib\charsets.jar;C:\Program Files\Java\jre6\lib\modules\jdk.boot.jar;
C:\Program Files\Java\jre6\classes;C:\Documents and Settings\User\.sbt\boot\scal
a-2.9.2\lib\scala-library.jar
[debug] -classpath
[debug] C:\A\example\project\target\scala-2.9.2\sbt-0.12\classes;C:\Docu
ments and Settings\User\.ivy2\cache\net.databinder\dispatch-http_2.9.2\jars\disp
atch-http_2.9.2-0.8.8.jar;C:\Documents and Settings\User\.ivy2\cache\net.databin
der\dispatch-core_2.9.2\jars\dispatch-core_2.9.2-0.8.8.jar;C:\Documents and Sett
ings\User\.ivy2\cache\org.apache.httpcomponents\httpclient\jars\httpclient-4.1.3
.jar;C:\Documents and Settings\User\.ivy2\cache\org.apache.httpcomponents\httpco
re\jars\httpcore-4.1.4.jar;C:\Documents and Settings\User\.ivy2\cache\commons-lo
gging\commons-logging\jars\commons-logging-1.1.1.jar;C:\Documents and Settings\U
ser\.ivy2\cache\commons-codec\commons-codec\jars\commons-codec-1.4.jar;C:\Docume
nts and Settings\User\.ivy2\cache\net.databinder\dispatch-futures_2.9.2\jars\dis
patch-futures_2.9.2-0.8.8.jar;C:\Documents and Settings\User\.ivy2\cache\org.sca
lastyle\scalastyle_2.9.1\jars\scalastyle_2.9.1-0.1.3-SNAPSHOT.jar;C:\Documents a
nd Settings\User\.ivy2\cache\org.scalariform\scalariform_2.9.1\jars\scalariform_
2.9.1-0.1.1.jar;C:\Documents and Settings\User\.ivy2\cache\com.github.scopt\scop
t_2.9.1\jars\scopt_2.9.1-2.0.0.jar;C:\Documents and Settings\User\.ivy2\cache\cc
.spray\spray-json_2.9.2\jars\spray-json_2.9.2-1.1.1.jar;C:\Documents and Setting
s\User\.ivy2\cache\org.parboiled\parboiled-scala\jars\parboiled-scala-1.0.2.jar;
C:\Documents and Settings\User\.ivy2\cache\org.parboiled\parboiled-core\jars\par
boiled-core-1.0.2.jar;C:\Documents and Settings\User\.ivy2\cache\org.scalatest\s
calatest_2.9.2\jars\scalatest_2.9.2-1.9.1.jar;C:\Documents and Settings\User\.iv
y2\cache\org.apache.commons\commons-lang3\jars\commons-lang3-3.1.jar;C:\Document
s and Settings\User\.ivy2\cache\scala_2.9.2\sbt_0.12\com.typesafe.sbteclipse\sbt
eclipse-plugin\jars\sbteclipse-plugin-2.1.0.jar;C:\Documents and Settings\User\.
ivy2\cache\scala_2.9.2\sbt_0.12\com.typesafe.sbteclipse\sbteclipse-core\jars\sbt
eclipse-core-2.1.0.jar;C:\Documents and Settings\User\.ivy2\cache\org.scalaz\sca
laz-core_2.9.2\jars\scalaz-core_2.9.2-6.0.4.jar;C:\Documents and Settings\User\.
ivy2\cache\org.scala-sbt\sbt\jars\sbt-0.12.4.jar;C:\Documents and Settings\User\
.ivy2\cache\org.scala-sbt\main\jars\main-0.12.4.jar;C:\Documents and Settings\Us
er\.ivy2\cache\org.scala-sbt\actions\jars\actions-0.12.4.jar;C:\Documents and Se
ttings\User\.ivy2\cache\org.scala-sbt\classpath\jars\classpath-0.12.4.jar;C:\Doc
uments and Settings\User\.ivy2\cache\org.scala-sbt\launcher-interface\jars\launc
her-interface-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-sb
t\interface\jars\interface-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache
\org.scala-sbt\io\jars\io-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\
org.scala-sbt\control\jars\control-0.12.4.jar;C:\Documents and Settings\User\.sb
t\boot\scala-2.9.2\lib\scala-compiler.jar;C:\Documents and Settings\User\.ivy2\c
ache\org.scala-sbt\completion\jars\completion-0.12.4.jar;C:\Documents and Settin
gs\User\.ivy2\cache\org.scala-sbt\collections\jars\collections-0.12.4.jar;C:\Doc
uments and Settings\User\.ivy2\cache\jline\jline\jars\jline-1.0.jar;C:\Documents
and Settings\User\.ivy2\cache\org.scala-sbt\api\jars\api-0.12.4.jar;C:\Document
s and Settings\User\.ivy2\cache\org.scala-sbt\compiler-integration\jars\compiler
-integration-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-sbt
\incremental-compiler\jars\incremental-compiler-0.12.4.jar;C:\Documents and Sett
ings\User\.ivy2\cache\org.scala-sbt\logging\jars\logging-0.12.4.jar;C:\Documents
and Settings\User\.ivy2\cache\org.scala-sbt\process\jars\process-0.12.4.jar;C:\
Documents and Settings\User\.ivy2\cache\org.scala-sbt\compile\jars\compile-0.12.
4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-sbt\persist\jars\pers
ist-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-tools.sbinar
y\sbinary_2.9.0\jars\sbinary_2.9.0-0.4.0.jar;C:\Documents and Settings\User\.ivy
2\cache\org.scala-sbt\classfile\jars\classfile-0.12.4.jar;C:\Documents and Setti
ngs\User\.ivy2\cache\org.scala-sbt\compiler-ivy-integration\jars\compiler-ivy-in
tegration-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-sbt\iv
y\jars\ivy-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.apache.ivy\
ivy\jars\ivy-2.3.0-rc1.jar;C:\Documents and Settings\User\.ivy2\cache\com.jcraft
\jsch\jars\jsch-0.1.46.jar;C:\Documents and Settings\User\.ivy2\cache\commons-ht
tpclient\commons-httpclient\jars\commons-httpclient-3.1.jar;C:\Documents and Set
tings\User\.ivy2\cache\org.scala-sbt\run\jars\run-0.12.4.jar;C:\Documents and Se
ttings\User\.ivy2\cache\org.scala-sbt\task-system\jars\task-system-0.12.4.jar;C:
\Documents and Settings\User\.ivy2\cache\org.scala-sbt\tasks\jars\tasks-0.12.4.j
ar;C:\Documents and Settings\User\.ivy2\cache\org.scala-sbt\tracking\jars\tracki
ng-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-sbt\cache\jar
s\cache-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-sbt\test
ing\jars\testing-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala
-sbt\test-agent\jars\test-agent-0.12.4.jar;C:\Documents and Settings\User\.ivy2\
cache\org.scala-tools.testing\test-interface\jars\test-interface-0.5.jar;C:\Docu
ments and Settings\User\.ivy2\cache\org.scala-sbt\command\jars\command-0.12.4.ja
r;C:\Documents and Settings\User\.ivy2\cache\org.scala-sbt\compiler-interface\ja
rs\compiler-interface-bin-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\
org.scala-sbt\compiler-interface\jars\compiler-interface-src-0.12.4.jar;C:\Docum
ents and Settings\User\.ivy2\cache\org.scala-sbt\precompiled-2_8_2\jars\compiler
-interface-bin-0.12.4.jar;C:\Documents and Settings\User\.ivy2\cache\org.scala-s
bt\precompiled-2_9_3\jars\compiler-interface-bin-0.12.4.jar;C:\Documents and Set
tings\User\.ivy2\cache\org.scala-sbt\precompiled-2_10_1\jars\compiler-interface-
bin-0.12.4.jar

A pragmatic "solution" that may get you running for now: replace all occurrences of in Test in ProgFunBuild.scala with in sbt.Test, then try again.
Still, something is broken in your environment.

When you run sbt, it is not compiling your sources here but the build definition itself (the Scala files under project/). The problem seems to be that sbt's Test configuration is not found. I can only speculate:
Did you touch that file ProgFunBuild in any way?
Did you accidentally place your normal source code in the project sub directory? Perhaps you did that and have an object Test defined in there which shadows sbt's Test scope?
From what directory do you run sbt? You should be in the root folder of the example, not inside the project directory.
You should have the following layout:
project/ProgFunBuild.scala
project/build.properties (optional)
build.sbt (optional)
src/main/scala/... (your sources in there)
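If the stray-object theory is right, the shadowing can be reproduced in plain Scala. A minimal sketch with stand-in names (SbtLike is not the real sbt package; the stray Test object plays the role of a misplaced source file in project/):

```scala
object ShadowDemo {
  object SbtLike {                     // stand-in for the sbt package
    object Test { val name = "sbt's Test configuration" }
  }

  object ProjectBuild {
    object Test { val name = "stray local Test" }  // misplaced user code
    import SbtLike._
    // Definitions in the enclosing scope take precedence over wildcard
    // imports, so `Test` here resolves to the stray object, not
    // SbtLike.Test, which is why `compile in Test` stops type-checking.
    def resolved: String = Test.name
  }
}
```

Deleting (or renaming) the stray object restores the import's meaning.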

Related

Scala object throwing build/training error

I need some help understanding errors that are being generated through Scala class for the RandomForestAlgorithm.scala (https://github.com/PredictionIO/PredictionIO/blob/develop/examples/scala-parallel-classification/custom-attributes/src/main/scala/RandomForestAlgorithm.scala).
I am building the project as is (custom-attributes for classification template) in PredictionIO and am getting a pio build error:
hduser@hduser-VirtualBox:~/PredictionIO/classTest$ pio build --verbose
[INFO] [Console$] Using existing engine manifest JSON at /home/hduser/PredictionIO/classTest/manifest.json
[INFO] [Console$] Using command '/home/hduser/PredictionIO/sbt/sbt' at the current working directory to build.
[INFO] [Console$] If the path above is incorrect, this process will fail.
[INFO] [Console$] Uber JAR disabled. Making sure lib/pio-assembly-0.9.5.jar is absent.
[INFO] [Console$] Going to run: /home/hduser/PredictionIO/sbt/sbt package assemblyPackageDependency
[INFO] [Console$] [info] Loading project definition from /home/hduser/PredictionIO/classTest/project
[INFO] [Console$] [info] Set current project to template-scala-parallel-classification (in build file:/home/hduser/PredictionIO/classTest/)
[INFO] [Console$] [info] Compiling 1 Scala source to /home/hduser/PredictionIO/classTest/target/scala-2.10/classes...
[INFO] [Console$] [error] /home/hduser/PredictionIO/classTest/src/main/scala/RandomForestAlgorithm.scala:28: class RandomForestAlgorithm needs to be abstract, since method train in class P2LAlgorithm of type (sc: org.apache.spark.SparkContext, pd: com.test1.PreparedData)com.test1.PIORandomForestModel is not defined
[INFO] [Console$] [error] class RandomForestAlgorithm(val ap: RandomForestAlgorithmParams) // CHANGED
[INFO] [Console$] [error] ^
[INFO] [Console$] [error] one error found
[INFO] [Console$] [error] (compile:compile) Compilation failed
[INFO] [Console$] [error] Total time: 6 s, completed Jun 8, 2016 4:37:36 PM
[ERROR] [Console$] Return code of previous step is 1. Aborting.
So when I address the line causing the error and make the class abstract:
// extends P2LAlgorithm because the MLlib's RandomForestModel doesn't
// contain RDD.
abstract class RandomForestAlgorithm(val ap: RandomForestAlgorithmParams) // CHANGED
  extends P2LAlgorithm[PreparedData, PIORandomForestModel, // CHANGED
    Query, PredictedResult] {

  def train(data: PreparedData): PIORandomForestModel = { // CHANGED
    // CHANGED
    // Empty categoricalFeaturesInfo indicates all features are continuous.
    val categoricalFeaturesInfo = Map[Int, Int]()
    val m = RandomForest.trainClassifier(
      data.labeledPoints,
      ap.numClasses,
      categoricalFeaturesInfo,
      ap.numTrees,
      ap.featureSubsetStrategy,
      ap.impurity,
      ap.maxDepth,
      ap.maxBins)
    new PIORandomForestModel(
      gendersMap = data.gendersMap,
      educationMap = data.educationMap,
      randomForestModel = m
    )
  }
}
pio build is then successful, but training fails because the (now abstract) class cannot be instantiated:
[INFO] [Engine] Extracting datasource params...
[INFO] [WorkflowUtils$] No 'name' is found. Default empty String will be used.
[INFO] [Engine] Datasource params: (,DataSourceParams(6))
[INFO] [Engine] Extracting preparator params...
[INFO] [Engine] Preparator params: (,Empty)
[INFO] [Engine] Extracting serving params...
[INFO] [Engine] Serving params: (,Empty)
[WARN] [Utils] Your hostname, hduser-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
[WARN] [Utils] Set SPARK_LOCAL_IP if you need to bind to another address
[INFO] [Remoting] Starting remoting
[INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.2.15:59444]
[WARN] [MetricsSystem] Using default name DAGScheduler for source because spark.app.id is not set.
Exception in thread "main" java.lang.InstantiationException
at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at io.prediction.core.Doer$.apply(AbstractDoer.scala:52)
at io.prediction.controller.Engine$$anonfun$1.apply(Engine.scala:171)
at io.prediction.controller.Engine$$anonfun$1.apply(Engine.scala:170)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at io.prediction.controller.Engine.train(Engine.scala:170)
at io.prediction.workflow.CoreWorkflow$.runTrain(CoreWorkflow.scala:65)
at io.prediction.workflow.CreateWorkflow$.main(CreateWorkflow.scala:247)
at io.prediction.workflow.CreateWorkflow.main(CreateWorkflow.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
So two questions:
1. Why is the following model not considered defined during building:
class PIORandomForestModel(
  val gendersMap: Map[String, Double],
  val educationMap: Map[String, Double],
  val randomForestModel: RandomForestModel
) extends Serializable
2. How can I define PIORandomForestModel in a way that does not throw a pio build error and lets training re-assign attributes to the object?
I have posted this question in the PredictionIO Google group but have not gotten a response.
Thanks in advance for your help.
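For what it's worth, the shape of the pio build error above can be reproduced with stand-in types. A sketch only (P2LAlgorithmLike below mimics, but is not, PredictionIO's actual P2LAlgorithm): the abstract method named in the error takes a SparkContext plus the prepared data, so overriding only a one-argument train leaves the class abstract.

```scala
// Stand-in types; none of these are the real Spark or PredictionIO classes.
class SparkContextStandIn
class PreparedDataStandIn
class ModelStandIn

abstract class P2LAlgorithmLike[PD, M] {
  // The two-parameter signature the error message says is "not defined".
  def train(sc: SparkContextStandIn, pd: PD): M
}

// Overriding only `train(data: PD)` would reproduce "needs to be abstract";
// matching the two-parameter signature compiles without `abstract`:
class Algo extends P2LAlgorithmLike[PreparedDataStandIn, ModelStandIn] {
  def train(sc: SparkContextStandIn, pd: PreparedDataStandIn): ModelStandIn =
    new ModelStandIn
}
```

This suggests the model class itself was never the problem, only the train signature.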

Scala Play 2 Framework: PrivilegedActionException: null

I am studying the Play 2 framework with Slick. The code is a simple query to the database through Slick, and I get an exception I don't understand.
My controller:
class IndexController @Inject()(taskRepo: TaskRepo) extends Controller {
  def index = Action.async { implicit rs =>
    taskRepo.all().map(tasks => Ok(views.html.index(tasks)))
  }
}
And the exception:
[info] ! @6pp163f7m - Internal server error, for (GET) [/] ->
[info]
[info] play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[PrivilegedActionException: null]]
[info] at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:269)
[info] at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:195)
[info] at play.core.server.Server$class.logExceptionAndGetResult$1(Server.scala:45)
[info] at play.core.server.Server$class.getHandlerFor(Server.scala:65)
[info] at play.core.server.NettyServer.getHandlerFor(NettyServer.scala:45)
[info] at play.core.server.netty.PlayRequestHandler.handle(PlayRequestHandler.scala:81)
[info] at play.core.server.netty.PlayRequestHandler.channelRead(PlayRequestHandler.scala:162)
[info] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:307)
[info] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:293)
[info] at com.typesafe.netty.http.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:129)
[info] Caused by: java.security.PrivilegedActionException: null
[info] at java.security.AccessController.doPrivileged(Native Method)
[info] at play.runsupport.Reloader$.play$runsupport$Reloader$$withReloaderContextClassLoader(Reloader.scala:39)
[info] at play.runsupport.Reloader.reload(Reloader.scala:336)
[info] at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1.apply(DevServerStart.scala:118)
[info] at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1.apply(DevServerStart.scala:116)
[info] at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
[info] at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
[info] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1402)
[info] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
[info] at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
[info] Caused by: java.util.concurrent.TimeoutException: Futures timed out after [300000 milliseconds]
[info] at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
[info] at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
[info] at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:190)
[info] at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
[info] at scala.concurrent.Await$.result(package.scala:190)
[info] at play.forkrun.ForkRun$$anonfun$askForReload$1.apply(ForkRun.scala:128)
[info] at play.forkrun.ForkRun$$anonfun$askForReload$1.apply(ForkRun.scala:126)
[info] at play.runsupport.Reloader$$anonfun$reload$1.apply(Reloader.scala:338)
[info] at play.runsupport.Reloader$$anon$3.run(Reloader.scala:43)
[info] at java.security.AccessController.doPrivileged(Native Method)
What am I doing wrong?
UPDATE
The problem was the "Futures timed out after [300000 milliseconds]". Changing fork in run := true to fork in run := false in build.sbt fixed it.
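For reference, the change as a build.sbt fragment (sbt 0.13-era syntax; a sketch of the workaround, assuming the dev-mode fork-run handshake was what timed out):

```scala
// Dev-mode reload was blocking until the 300 s future timed out in the
// forked JVM; running the app inside sbt's own JVM avoids that handshake.
fork in run := false
```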

Unresolved dependency: com.typesafe.akka#akka-actor_2.11;2.3.14

I am trying to build a multi-project sbt build, and this is what my code looks like:
import sbt._
import Keys._

object ProjectBuild extends Build {
  lazy val commonSettings = Seq(
    organization := "com.learner",
    version := "0.1.0",
    scalaVersion := "2.11.7",
    resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/",
    libraryDependencies ++= Seq(
      "com.typesafe.akka" %% "akka-actor" % "2.3.14",
      "com.typesafe.akka" %% "akka-cluster" % "2.3.14"
    )
  )

  lazy val cluster_simple = project
    .settings(commonSettings: _*)
}
When I run sbt, I get this error:
> reload
[info] Loading project definition from /Users/harit/IdeaProjects/libs/akka-cluster-investigation/project
compile
[info] Set current project to akka-cluster-investigation (in build file:/Users/harit/IdeaProjects/libs/akka-cluster-investigation/)
> compile
[info] Updating {file:/Users/harit/IdeaProjects/libs/akka-cluster-investigation/}cluster_simple...
[info] Updating {file:/Users/harit/IdeaProjects/libs/akka-cluster-investigation/}akka-cluster-investigation...
[info] Resolving jline#jline;2.12.1 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.typesafe.akka#akka-actor_2.11;2.3.14: configuration not found in com.typesafe.akka#akka-actor_2.11;2.3.14: 'master(compile)'. Missing configuration: 'compile'. It was required from com.typesafe.akka#akka-remote_2.11;2.3.14 compile
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] com.typesafe.akka:akka-actor_2.11:2.3.14 (/Users/harit/IdeaProjects/libs/akka-cluster-investigation/project/ProjectBuild.scala#L10)
[warn] +- com.typesafe.akka:akka-remote_2.11:2.3.14
[warn] +- com.typesafe.akka:akka-cluster_2.11:2.3.14 (/Users/harit/IdeaProjects/libs/akka-cluster-investigation/project/ProjectBuild.scala#L10)
[warn] +- com.learner:cluster_simple_2.11:0.1.0
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[trace] Stack trace suppressed: run last cluster_simple/*:update for the full output.
[error] (cluster_simple/*:update) sbt.ResolveException: unresolved dependency: com.typesafe.akka#akka-actor_2.11;2.3.14: configuration not found in com.typesafe.akka#akka-actor_2.11;2.3.14: 'master(compile)'. Missing configuration: 'compile'. It was required from com.typesafe.akka#akka-remote_2.11;2.3.14 compile
[error] Total time: 0 s, completed Oct 17, 2015 12:41:00 PM
>
What am I missing?
UPDATE
I removed the ~/.ivy2 folder and ran again, and everything resolved:
$ sbt clean compile
[info] Loading project definition from /Users/harit/IdeaProjects/libs/akka-cluster-investigation/project
[info] Set current project to akka-cluster-investigation (in build file:/Users/harit/IdeaProjects/libs/akka-cluster-investigation/)
[success] Total time: 0 s, completed Oct 17, 2015 6:45:10 PM
[info] Updating {file:/Users/harit/IdeaProjects/libs/akka-cluster-investigation/}akka-cluster-investigation...
[info] Updating {file:/Users/harit/IdeaProjects/libs/akka-cluster-investigation/}cluster_simple...
[info] Resolving org.sonatype.oss#oss-parent;7 ...
[info] downloading https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.11.7/scala-library-2.11.7.jar ...
[info] [SUCCESSFUL ] org.scala-lang#scala-library;2.11.7!scala-library.jar (1958ms)
[info] downloading https://repo1.maven.org/maven2/org/scala-lang/scala-compiler/2.11.7/scala-compiler-2.11.7.jar ...
[info] [SUCCESSFUL ] org.scala-lang#scala-compiler;2.11.7!scala-compiler.jar (11051ms)
[info] downloading https://repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.11.7/scala-reflect-2.11.7.jar ...
[info] [SUCCESSFUL ] org.scala-lang#scala-reflect;2.11.7!scala-reflect.jar (3520ms)
[info] downloading https://repo1.maven.org/maven2/org/scala-lang/modules/scala-xml_2.11/1.0.4/scala-xml_2.11-1.0.4.jar ...
[info] [SUCCESSFUL ] org.scala-lang.modules#scala-xml_2.11;1.0.4!scala-xml_2.11.jar(bundle) (634ms)
[info] downloading https://repo1.maven.org/maven2/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar ...
[info] [SUCCESSFUL ] org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4!scala-parser-combinators_2.11.jar(bundle) (455ms)
[info] downloading https://repo1.maven.org/maven2/jline/jline/2.12.1/jline-2.12.1.jar ...
[info] [SUCCESSFUL ] jline#jline;2.12.1!jline.jar (333ms)
[info] Done updating.
[info] Resolving jline#jline;2.12.1 ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-actor_2.11/2.3.14/akka-actor_2.11-2.3.14.jar ...
[info] [SUCCESSFUL ] com.typesafe.akka#akka-actor_2.11;2.3.14!akka-actor_2.11.jar (3241ms)
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-cluster_2.11/2.3.14/akka-cluster_2.11-2.3.14.jar ...
[info] [SUCCESSFUL ] com.typesafe.akka#akka-cluster_2.11;2.3.14!akka-cluster_2.11.jar (930ms)
[info] downloading https://repo1.maven.org/maven2/com/typesafe/config/1.2.1/config-1.2.1.jar ...
[info] [SUCCESSFUL ] com.typesafe#config;1.2.1!config.jar(bundle) (267ms)
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-remote_2.11/2.3.14/akka-remote_2.11-2.3.14.jar ...
[info] [SUCCESSFUL ] com.typesafe.akka#akka-remote_2.11;2.3.14!akka-remote_2.11.jar (1571ms)
[info] downloading https://repo1.maven.org/maven2/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar ...
[info] [SUCCESSFUL ] io.netty#netty;3.8.0.Final!netty.jar(bundle) (1249ms)
[info] downloading https://repo1.maven.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar ...
[info] [SUCCESSFUL ] com.google.protobuf#protobuf-java;2.5.0!protobuf-java.jar(bundle) (542ms)
[info] downloading https://repo1.maven.org/maven2/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar ...
[info] [SUCCESSFUL ] org.uncommons.maths#uncommons-maths;1.2.2a!uncommons-maths.jar (140ms)
[info] Done updating.
[success] Total time: 30 s, completed Oct 17, 2015 6:45:40 PM

In sbt, how do you override scalacOptions for console in all configurations?

I like defining scalacOptions at the top level like so (as an example, ignoring the project axis for now):
scalacOptions += "-Ywarn-unused-import"
But then I realised that's too strict for console. So I tried setting:
scalacOptions in console ~= (_ filterNot (_ == "-Ywarn-unused-import"))
But that didn't work: I still got the (fatal) warnings in the REPL.
I used inspect to try and understand why:
> inspect console
[info] Task: Unit
[info] Description:
[info] Starts the Scala interpreter with the project classes on the classpath.
[info] Provided by:
[info] {file:/a/}b/compile:console
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:261
[info] Dependencies:
[info] compile:console::compilers
[info] compile:console::initialCommands
[info] compile:console::fullClasspath
[info] compile:console::taskTemporaryDirectory
[info] compile:console::scalaInstance
[info] compile:console::streams
[info] compile:console::cleanupCommands
[info] compile:console::scalacOptions
[info] Delegates:
[info] compile:console
[info] *:console
[info] {.}/compile:console
[info] {.}/*:console
[info] */compile:console
[info] */*:console
[info] Related:
[info] test:console
Note that console is provided by compile:console and depends on compile:console::scalacOptions. Then:
> inspect compile:console::scalacOptions
[info] Task: scala.collection.Seq[java.lang.String]
[info] Description:
[info] Options for the Scala compiler.
[info] Provided by:
[info] {file:/a/}b/compile:scalacOptions
[info] Defined at:
[info] (sbt.Classpaths) Defaults.scala:1593
[info] Reverse dependencies:
[info] compile:console
[info] Delegates:
[info] compile:console::scalacOptions
[info] compile:scalacOptions
[info] *:console::scalacOptions
[info] *:scalacOptions
[info] {.}/compile:console::scalacOptions
[info] {.}/compile:scalacOptions
[info] {.}/*:console::scalacOptions
[info] {.}/*:scalacOptions
[info] */compile:console::scalacOptions
[info] */compile:scalacOptions
[info] */*:console::scalacOptions
[info] */*:scalacOptions
[info] Related:
[info] *:console::scalacOptions
[info] compile:scalacOptions
[info] *:scalacOptions
[info] */*:scalacOptions
[info] test:scalacOptions
Note that compile:console::scalacOptions is provided by compile:scalacOptions, and its delegation chain never reaches *:console::scalacOptions (which is what I defined).
My question is how do I override scalacOptions for all configurations for console? Is it possible to change the delegation chain?
I'd like to avoid having to set scalacOptions in (Compile, console) (as it would have to be duplicated for (Test, console)) or to define a val holding the scalac options.
My question is how do I override scalacOptions for all configurations for console?
I don't think we can, given the presence of compile:scalacOptions provided by sbt's Defaults. The only scope with higher precedence is compile:console::scalacOptions.
In most cases one would not want Compile and Test settings to cross-wire, so I don't think giving configuration scoping higher precedence is a bad default.
lazy val commonSettings = Seq(
scalaVersion := "2.11.4",
scalacOptions += "-Ywarn-unused-import",
scalacOptions in (Compile, console) ~= (_ filterNot (_ == "-Ywarn-unused-import")),
scalacOptions in (Test, console) := (scalacOptions in (Compile, console)).value
)
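For completeness, a minimal sketch of wiring these shared settings into a project (the project name and directory here are hypothetical):

```scala
lazy val core = (project in file("core"))
  .settings(commonSettings: _*)
```

With this in place, console in both the Compile and Test configurations drops -Ywarn-unused-import, while regular compilation keeps it.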
Is it possible to change the delegation chain?
No, this is not possible.
There's a single instance of the delegates function in BuildStructure; it's initialized at load time and used for all tasks.
The ordering is done in Scope.delegates.
I fix the bad scalac options in an autoplugin:
package console
import sbt._
/** [[FixScalacOptionsInConsole]] is an [[AutoPlugin]] that removes
* noisy or unnecessary scalac options when running an sbt console.
*/
object FixScalacOptionsInConsole extends AutoPlugin {
  import Keys._

  override def requires = plugins.JvmPlugin
  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    Compile / console / scalacOptions ~= filter,
    Test / console / scalacOptions ~= filter
  )

  def filter: Seq[String] => Seq[String] =
    _.filterNot(_ == "-feature")
      .filterNot(_.startsWith("-opt:"))
      .filterNot(_ == "-unchecked")
      .filterNot(_.startsWith("-Xlint:"))
      .filterNot(_ == "-Xfatal-warnings")
      .filterNot(_.startsWith("-Ywarn"))
}

sbt - defining custom configuration for sequential tests

Some of my tests require that they be executed sequentially. I read about custom configurations at http://www.scala-sbt.org/0.13.0/docs/Detailed-Topics/Testing.html, but I must be missing something, because my configuration is not working properly.
Here it is:
import sbt._
import Keys._

object SchedulingBackendBuild extends Build {
  lazy val SequentialTest = config("sequentialTest") extend (Test)

  def sequentialTestFilter(name: String): Boolean = {
    println("===seq test filter")
    name endsWith "SeqSpec"
  }

  def unitTestFilter(name: String): Boolean = {
    println("===unit test filter")
    !sequentialTestFilter(name)
  }

  lazy val root = Project(
    id = "scheduling-backend",
    base = file("."),
    settings = Project.defaultSettings
  ).configs(SequentialTest)
    .settings(inConfig(SequentialTest)(Defaults.testTasks): _*)
    .settings(
      testOptions in Test ++= Seq(Tests.Filter(unitTestFilter)),
      testOptions in SequentialTest ++= Seq(Tests.Filter(sequentialTestFilter))
    )
}
I want test to execute only the tests whose names do not end with SeqSpec, and this works; but when I execute sequentialTest:test, no tests are executed. I added println calls to my filters, and I can see that even when I execute sequentialTest:test I get
===unit test filter
===seq test filter
===seq test filter
so both filters are executed.
When I run inspect sequentialTest:testOptions I get
[info] Task: scala.collection.Seq[sbt.TestOption]
[info] Description:
[info] Options for running tests.
[info] Provided by:
[info] {file:/path/to/project/scheduling-backend/}scheduling-backend/sequentialTest:testOptions
[info] Defined at:
[info] /path/to/project/scheduling-backend/project/Build.scala:22
[info] Reverse dependencies:
[info] sequentialTest:testOnly::testOptions
[info] sequentialTest:testQuick::testOptions
[info] sequentialTest:test::testOptions
[info] Delegates:
[info] sequentialTest:testOptions
[info] test:testOptions
[info] runtime:testOptions
[info] compile:testOptions
[info] *:testOptions
[info] {.}/sequentialTest:testOptions
[info] {.}/test:testOptions
[info] {.}/runtime:testOptions
[info] {.}/compile:testOptions
[info] {.}/*:testOptions
[info] */sequentialTest:testOptions
[info] */test:testOptions
[info] */runtime:testOptions
[info] */compile:testOptions
[info] */*:testOptions
[info] Related:
[info] sequentialTest:testOnly::testOptions
[info] sequentialTest:testQuick::testOptions
[info] test:testOptions
[info] test:testQuick::testOptions
[info] */*:testOptions
[info] test:testOnly::testOptions
[info] test:test::testOptions
[info] sequentialTest:test::testOptions
so to me it looks OK; line 22 is testOptions in SequentialTest ++= Seq(Tests.Filter(sequentialTestFilter)).
Partial answer: if I change the ++= operator (which appends to the sequence) to := (which replaces it), everything works fine (and := is what the documentation uses). Even so, I would like to know why ++= is a bad choice in this case.
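A likely explanation, sketched here as plain Scala under the assumption that sbt applies every Tests.Filter in testOptions (a test runs only if all filters accept its name): since SequentialTest extends Test, testOptions in SequentialTest delegates to test:testOptions, so ++= appends the sequential filter on top of the inherited unit-test filter. The two predicates are mutually exclusive, so no test name can satisfy both — which matches the observed output (both filters print, zero tests run). The object and names below are illustrative, not sbt internals:

```scala
// Sketch: why `++=` leaves sequentialTest:test empty.
// Assumption: a test name must pass ALL accumulated Tests.Filter predicates.
object FilterDemo {
  def sequentialTestFilter(name: String): Boolean = name.endsWith("SeqSpec")
  def unitTestFilter(name: String): Boolean = !sequentialTestFilter(name)

  // With ++=, SequentialTest inherits the Test filter and appends its own:
  val filters: Seq[String => Boolean] = Seq(unitTestFilter, sequentialTestFilter)

  // Accepted only if every filter accepts -- but the predicates are
  // mutually exclusive, so nothing is ever accepted.
  def accepted(name: String): Boolean = filters.forall(_(name))

  def main(args: Array[String]): Unit = {
    println(accepted("FooSeqSpec")) // false: rejected by unitTestFilter
    println(accepted("FooSpec"))    // false: rejected by sequentialTestFilter
  }
}
```

With :=, the inherited unit-test filter is discarded and only sequentialTestFilter remains, so the SeqSpec tests run as intended.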