SBT testing tutorial leads to reflective access error - scala

I am trying to complete the sbt testing tutorial.
I completed all the steps and ran sbt test. However, I get a NullPointerException that I do not know how to resolve.
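For reference, the project follows the tutorial's setup; a sketch of the files involved (the ScalaTest version is the one the tutorial suggested and may differ from yours):

// build.sbt
name := "scalatesttutorial"
scalaVersion := "2.12.4"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test

// src/main/scala/CubeCalculator.scala
object CubeCalculator {
  def cube(x: Int): Int = x * x * x
}

// src/test/scala/CubeCalculatorTest.scala
class CubeCalculatorTest extends org.scalatest.FunSuite {
  test("CubeCalculator.cube") {
    assert(CubeCalculator.cube(3) === 27)
  }
}

Here is the full output of sbt test: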
C:\Users\Sayth\OneDrive\Projects\scalatest\scalatesttutorial>sbt test
Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
[info] Loading project definition from C:\Users\Sayth\OneDrive\Projects\scalatest\scalatesttutorial\project
[info] Updating {file:/C:/Users/Sayth/OneDrive/Projects/scalatest/scalatesttutorial/project/}scalatesttutorial-build...
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by sbt.ivyint.ErrorMessageAuthenticator$ (file:/C:/Users/Sayth/.sbt/boot/scala-2.10.7/org.scala-sbt/sbt/0.13.17/ivy-0.13.17.jar) to field java.net.Authenticator.theAuthenticator
WARNING: Please consider reporting this to the maintainers of sbt.ivyint.ErrorMessageAuthenticator$
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
java.lang.NullPointerException
at scala.reflect.io.JavaToolsPlatformArchive.iterator(ZipArchive.scala:242)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.reflect.io.AbstractFile.foreach(AbstractFile.scala:92)
at scala.tools.nsc.util.DirectoryClassPath.traverse(ClassPath.scala:308)
at scala.tools.nsc.util.DirectoryClassPath.x$16$lzycompute(ClassPath.scala:317)
at scala.tools.nsc.util.DirectoryClassPath.x$16(ClassPath.scala:317)
at scala.tools.nsc.util.DirectoryClassPath.packages$lzycompute(ClassPath.scala:317)
at scala.tools.nsc.util.DirectoryClassPath.packages(ClassPath.scala:317)
at scala.tools.nsc.util.DirectoryClassPath.packages(ClassPath.scala:297)
at scala.tools.nsc.util.MergedClassPath$$anonfun$packages$1.apply(ClassPath.scala:375)
at scala.tools.nsc.util.MergedClassPath$$anonfun$packages$1.apply(ClassPath.scala:375)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.tools.nsc.util.MergedClassPath.packages$lzycompute(ClassPath.scala:375)
at scala.tools.nsc.util.MergedClassPath.packages(ClassPath.scala:370)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader.doComplete(SymbolLoaders.scala:243)
at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:194)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:240)
at scala.tools.nsc.Global.rootMirror$lzycompute(Global.scala:59)
at scala.tools.nsc.Global.rootMirror(Global.scala:57)
at scala.tools.nsc.Global.rootMirror(Global.scala:37)
at scala.reflect.internal.Definitions$DefinitionsClass.<init>(Definitions.scala:166)
at scala.reflect.internal.Definitions$definitions$.<init>(Definitions.scala:20)
at scala.reflect.internal.SymbolTable.definitions$lzycompute(SymbolTable.scala:13)
at scala.reflect.internal.SymbolTable.definitions(SymbolTable.scala:13)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
at sbt.compiler.Eval$$anon$1.<init>(Eval.scala:141)
at sbt.compiler.Eval.run$lzycompute$1(Eval.scala:141)
at sbt.compiler.Eval.run$1(Eval.scala:141)
at sbt.compiler.Eval.unlinkAll$1(Eval.scala:144)
at sbt.compiler.Eval.evalCommon(Eval.scala:153)
at sbt.compiler.Eval.evalDefinitions(Eval.scala:122)
at sbt.EvaluateConfigurations$.evaluateDefinitions(EvaluateConfigurations.scala:271)
at sbt.EvaluateConfigurations$.evaluateSbtFile(EvaluateConfigurations.scala:109)
at sbt.Load$.sbt$Load$$loadSettingsFile$1(Load.scala:775)
at sbt.Load$$anonfun$sbt$Load$$memoLoadSettingsFile$1$1.apply(Load.scala:781)
at sbt.Load$$anonfun$sbt$Load$$memoLoadSettingsFile$1$1.apply(Load.scala:780)
at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
at scala.collection.AbstractMap.getOrElse(Map.scala:58)
at sbt.Load$.sbt$Load$$memoLoadSettingsFile$1(Load.scala:780)
at sbt.Load$$anonfun$loadFiles$1$2.apply(Load.scala:788)
at sbt.Load$$anonfun$loadFiles$1$2.apply(Load.scala:788)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at sbt.Load$.loadFiles$1(Load.scala:788)
at sbt.Load$.discoverProjects(Load.scala:799)
at sbt.Load$.discover$1(Load.scala:585)
at sbt.Load$.sbt$Load$$loadTransitive(Load.scala:633)
at sbt.Load$$anonfun$loadUnit$1.sbt$Load$$anonfun$$loadProjects$1(Load.scala:482)
at sbt.Load$$anonfun$loadUnit$1$$anonfun$40.apply(Load.scala:485)
at sbt.Load$$anonfun$loadUnit$1$$anonfun$40.apply(Load.scala:485)
at sbt.Load$.timed(Load.scala:1025)
at sbt.Load$$anonfun$loadUnit$1.apply(Load.scala:485)
at sbt.Load$$anonfun$loadUnit$1.apply(Load.scala:459)
at sbt.Load$.timed(Load.scala:1025)
at sbt.Load$.loadUnit(Load.scala:459)
at sbt.Load$$anonfun$25$$anonfun$apply$14.apply(Load.scala:311)
at sbt.Load$$anonfun$25$$anonfun$apply$14.apply(Load.scala:310)
at sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:91)
at sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:90)
at sbt.BuildLoader.apply(BuildLoader.scala:140)
at sbt.Load$.loadAll(Load.scala:365)
at sbt.Load$.loadURI(Load.scala:320)
at sbt.Load$.load(Load.scala:316)
at sbt.Load$.load(Load.scala:305)
at sbt.Load$$anonfun$4.apply(Load.scala:146)
at sbt.Load$$anonfun$4.apply(Load.scala:146)
at sbt.Load$.timed(Load.scala:1025)
at sbt.Load$.apply(Load.scala:146)
at sbt.Load$.defaultLoad(Load.scala:39)
at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:503)
at sbt.BuiltinCommands$.doLoadProject(Main.scala:503)
at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:495)
at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:495)
at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
at sbt.Command$.process(Command.scala:93)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:96)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:96)
at sbt.State$$anon$1.runCmd$1(State.scala:183)
at sbt.State$$anon$1.process(State.scala:187)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:96)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:96)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.MainLoop$.next(MainLoop.scala:96)
at sbt.MainLoop$.run(MainLoop.scala:89)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:68)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:63)
at sbt.Using.apply(Using.scala:24)
at sbt.MainLoop$.runWithNewLog(MainLoop.scala:63)
at sbt.MainLoop$.runAndClearLast(MainLoop.scala:46)
at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:30)
at sbt.MainLoop$.runLogged(MainLoop.scala:22)
at sbt.StandardMain$.runManaged(Main.scala:61)
at sbt.xMain.run(Main.scala:35)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:18)
at xsbt.boot.Boot$.runImpl(Boot.scala:56)
at xsbt.boot.Boot$.main(Boot.scala:18)
at xsbt.boot.Boot.main(Boot.scala)
[error] java.lang.NullPointerException
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?

There are compatibility issues between Scala and the latest versions of Java. If you uninstall Java 10 and install Java 8 instead, those errors will go away.
This is clearly indicated on the same site, getting-started-with-scala-and-sbt-on-the-command-line:
"Make sure you have the Java 8 JDK (also known as 1.8)."
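If you want to keep Java 10 around for other projects, you can also point just sbt at a Java 8 JDK instead of uninstalling; a sketch, assuming a typical Windows install path (adjust to wherever your JDK 8 lives):

set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_172
sbt test

The sbt launcher picks the JVM from JAVA_HOME, so this run uses Java 8 without touching the system-wide installation.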

Related

Spark error with spark-cassandra-connector

I'm trying to work with cassandra-mesos-spark and I would like to ask if someone can help me with this error. I tried Spark 2.2 with connector 1.6.11 and other combinations, but I cannot find out why I'm getting this.
Environment:
spark-2.3.0-bin-hadoop2.7.tgz
datastax:spark-cassandra-connector:2.0.7-s_2.11
Scala 2.11
Mesos cluster
Python application with pyspark
Code:
import sys
from pyspark import SparkContext, SparkConf
from pyspark.sql import SQLContext

# Configure the Mesos master, the Cassandra host, and the connector package.
sp_conf = SparkConf()
sp_conf.setAppName("spark_test")
sp_conf.setMaster("mesos://192.168.1.10:5050")
sp_conf.set("spark.local.dir", "/home/user/spark-temp")
sp_conf.set("spark.mesos.executor.home", "/home/user/spark")
sp_conf.set("spark.cassandra.connection.host", "192.168.1.51")
sp_conf.set("spark.jars.packages", "datastax:spark-cassandra-connector:2.0.7-s_2.11")
sp_conf.set("spark.mesos.coarse", "True")
sp_conf.set("spark.network.timeout", "800")

sc = SparkContext(conf=sp_conf)
sqlContext = SQLContext(sc)

sys.stdout.write("\rGetting rows...")
sys.stdout.flush()

# Read the Cassandra table through the connector's DataFrame source.
sqlContext.read \
    .format("org.apache.spark.sql.cassandra") \
    .options(table="opt_instruments", keyspace="fxinstrumentsdb") \
    .load().show()
ERROR:
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found datastax#spark-cassandra-connector;2.0.7-s_2.11 in spark-packages
found com.twitter#jsr166e;1.1.0 in spark-list
found org.joda#joda-convert;1.2 in spark-list
found commons-beanutils#commons-beanutils;1.9.3 in central
found commons-collections#commons-collections;3.2.2 in spark-list
found joda-time#joda-time;2.3 in spark-list
found io.netty#netty-all;4.0.33.Final in central
found org.scala-lang#scala-reflect;2.11.8 in spark-list
:: resolution report :: resolve 1338ms :: artifacts dl 22ms
:: modules in use:
com.twitter#jsr166e;1.1.0 from spark-list in [default]
commons-beanutils#commons-beanutils;1.9.3 from central in [default]
commons-collections#commons-collections;3.2.2 from spark-list in [default]
datastax#spark-cassandra-connector;2.0.7-s_2.11 from spark-packages in [default]
io.netty#netty-all;4.0.33.Final from central in [default]
joda-time#joda-time;2.3 from spark-list in [default]
org.joda#joda-convert;1.2 from spark-list in [default]
org.scala-lang#scala-reflect;2.11.8 from spark-list in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 8 | 0 | 0 | 0 || 8 | 0 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
0 artifacts copied, 8 already retrieved (0kB/31ms)
2018-04-07 19:28:45 WARN Utils:66 - Your hostname, themachine resolves to a loopback address: 127.0.1.1; using 192.168.1.10 instead (on interface enp1s0)
2018-04-07 19:28:45 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-04-07 19:28:46 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2018-04-07 19:28:47 WARN SparkConf:66 - In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
Warning: MESOS_NATIVE_LIBRARY is deprecated, use MESOS_NATIVE_JAVA_LIBRARY instead. Future releases will not support JNI bindings via MESOS_NATIVE_LIBRARY.
I0407 19:28:51.387593 5128 sched.cpp:232] Version: 1.5.0
I0407 19:28:51.436372 5120 sched.cpp:336] New master detected at master#192.168.1.10:5050
I0407 19:28:51.447155 5120 sched.cpp:351] No credentials provided. Attempting to register without authentication
I0407 19:28:51.464504 5119 sched.cpp:751] Framework registered with 3c2a29b3-d69f-4982-802e-88342d5c42fd-0038
[Stage 0:> (0 + 1) / 1]2018-04-07 19:30:56 WARN TaskSetManager:66 - Lost task 0.0 in stage 0.0 (TID 0, 10.8.0.6, executor 1): java.io.IOException: Exception during preparation of SELECT "tradeunitsprecision", "minimumtrailingstopdistance", "displayprecision", "maximumtrailingstopdistance", "marginrate", "piplocation", "name", "type", "minimumtradesize", "displayname", "maximumpositionsize", "maximumorderunits" FROM "fxinstrumentsdb"."opt_instruments" WHERE token("name") > ? AND token("name") <= ? ALLOW FILTERING: org/apache/spark/sql/catalyst/package$ScalaReflectionLock$
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:323)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.com$datastax$spark$connector$rdd$CassandraTableScanRDD$$fetchTokenRange(CassandraTableScanRDD.scala:339)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$17.apply(CassandraTableScanRDD.scala:367)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$17.apply(CassandraTableScanRDD.scala:367)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at com.datastax.spark.connector.util.CountingIterator.hasNext(CountingIterator.scala:12)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:253)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:247)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/catalyst/package$ScalaReflectionLock$
at org.apache.spark.sql.catalyst.ReflectionLock$.<init>(ReflectionLock.scala:5)
at org.apache.spark.sql.catalyst.ReflectionLock$.<clinit>(ReflectionLock.scala)
at com.datastax.spark.connector.types.TypeConverter$.<init>(TypeConverter.scala:75)
at com.datastax.spark.connector.types.TypeConverter$.<clinit>(TypeConverter.scala)
at com.datastax.spark.connector.types.BigIntType$.converterToCassandra(PrimitiveColumnType.scala:50)
at com.datastax.spark.connector.types.BigIntType$.converterToCassandra(PrimitiveColumnType.scala:46)
at com.datastax.spark.connector.types.ColumnType$.converterToCassandra(ColumnType.scala:231)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$11.apply(CassandraTableScanRDD.scala:312)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$11.apply(CassandraTableScanRDD.scala:312)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:312)
... 26 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.catalyst.package$ScalaReflectionLock$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 44 more
My first guess is that you did not package the Spark SQL dependency.
Hi @panosd, it seems the connector doesn't work with 2.3.0 yet; there is an open PR here. +1 it as well so that it can be brought to attention and merged ASAP.
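Until that PR is merged, one workaround (a sketch; the pairing below is one known-compatible combination, not the only one) is to run the same script against a Spark 2.2.x distribution, which still ships the org.apache.spark.sql.catalyst.package$ScalaReflectionLock$ class the connector's TypeConverter needs:

# unpack spark-2.2.1-bin-hadoop2.7 instead of 2.3.0, then run unchanged:
~/spark-2.2.1/bin/spark-submit your_app.py

(your_app.py stands in for the script above; the SparkConf already sets the master and the connector package.)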
I'm using HDFS now and everything is better and faster; afterwards I can save the whole set of calculations to the Cassandra DB for another part.

Scala Neo4j OGM - Will not scan annotated NodeEntity classes - IllegalArgumentException

Problem: Annotated classes are not being loaded into the class map.
My understanding of how to use the OGM (a minimal sketch follows this list):
1. Pass your domain (which includes your NodeEntity-annotated classes) to the SessionFactory.
2. Open a Session. At this point, behind the scenes, a class map of the annotated NodeEntity classes is built. This is referenced in the stacktrace on line 3 as "Building annotation class map"; you'll see it mapped 0 classes on the next line.
3. Pass an instance of an annotated NodeEntity class to session.save.
4. Your entity is now saved in Neo4j!
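A minimal Scala sketch of those steps, assuming OGM 3.x; the bolt endpoint, credentials, and the Person constructor are placeholders based on the sample project:

import org.neo4j.ogm.config.Configuration
import org.neo4j.ogm.session.SessionFactory

object OgmSketch extends App {
  // Placeholder connection details; adjust for your Neo4j instance.
  val config = new Configuration.Builder()
    .uri("bolt://localhost:7687")
    .credentials("neo4j", "password")
    .build()

  // The package passed here is the domain that is scanned for NodeEntity classes.
  val sessionFactory = new SessionFactory(config, "testdomain")
  val session = sessionFactory.openSession()

  // Saving relies on the class map built when the domain was scanned.
  session.save(new Person("Alice"))
}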
In this sample project, the domain is testdomain, and the only annotated NodeEntity, Person, is defined in the same file. I would expect this would eliminate any issues finding what classes to build into the persistent class map behind the scenes. But as seen in the log, the class map is not considering the domain I pass to SessionFactory.
This prevents any data from being saved to Neo.
You can clone the project and run the app yourself to reproduce the problem (instructions in README).
Stacktrace:
[info] Running testdomain.Test
2018-03-08 13:54:09 INFO DomainInfo:160 - Starting Post-processing phase
2018-03-08 13:54:09 INFO DomainInfo:126 - Building annotation class map
2018-03-08 13:54:09 INFO DomainInfo:139 - Building interface class map for 0 classes
2018-03-08 13:54:09 INFO DomainInfo:215 - Post-processing complete
[error] (run-main-0) java.lang.IllegalArgumentException: Class class testdomain.Person is not a valid entity class. Please check the entity mapping.
[error] java.lang.IllegalArgumentException: Class class testdomain.Person is not a valid entity class. Please check the entity mapping.
[error] at org.neo4j.ogm.session.delegates.SaveDelegate.save(SaveDelegate.java:88)
[error] at org.neo4j.ogm.session.delegates.SaveDelegate.save(SaveDelegate.java:40)
[error] at org.neo4j.ogm.session.Neo4jSession.save(Neo4jSession.java:469)
[error] at testdomain.Test$.delayedEndpoint$testdomain$Test$1(Test.scala:32)
[error] at testdomain.Test$delayedInit$body.apply(Test.scala:26)
[error] at scala.Function0.apply$mcV$sp(Function0.scala:34)
[error] at scala.Function0.apply$mcV$sp$(Function0.scala:34)
[error] at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
[error] at scala.App.$anonfun$main$1$adapted(App.scala:76)
[error] at scala.collection.immutable.List.foreach(List.scala:389)
[error] at scala.App.main(App.scala:76)
[error] at scala.App.main$(App.scala:74)
[error] at testdomain.Test$.main(Test.scala:26)
[error] at testdomain.Test.main(Test.scala)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
[error] at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
[error] at java.lang.reflect.Method.invoke(Unknown Source)
[error] at sbt.Run.invokeMain(Run.scala:93)
[error] at sbt.Run.run0(Run.scala:87)
[error] at sbt.Run.execute$1(Run.scala:65)
[error] at sbt.Run.$anonfun$run$4(Run.scala:77)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] at sbt.util.InterfaceUtil$$anon$1.get(InterfaceUtil.scala:10)
[error] at sbt.TrapExit$App.run(TrapExit.scala:252)
[error] at java.lang.Thread.run(Unknown Source)
[error] java.lang.RuntimeException: Nonzero exit code: 1
[error] at sbt.Run$.executeTrapExit(Run.scala:124)
[error] at sbt.Run.run(Run.scala:77)
[error] at sbt.Defaults$.$anonfun$bgRunTask$5(Defaults.scala:1169)
[error] at sbt.Defaults$.$anonfun$bgRunTask$5$adapted(Defaults.scala:1164)
[error] at sbt.internal.BackgroundThreadPool.$anonfun$run$1(DefaultBackgroundJobService.scala:366)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] at scala.util.Try$.apply(Try.scala:209)
[error] at sbt.internal.BackgroundThreadPool$BackgroundRunnable.run(DefaultBackgroundJobService.scala:289)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
[error] at java.lang.Thread.run(Unknown Source)
[error] (Compile / run) Nonzero exit code: 1
[error] Total time: 6 s, completed Mar 8, 2018 1:54:09 PM
I have searched high and low to try to resolve this issue. However, I am new to Scala, so this must be something simple I am missing, right?
-skyfer
This issue has been resolved with a modification to the Neo4j OGM library. The details are in this issue on their GitHub.

How to get rid of 'key not found: source' error and blocking deploying in Play framework 2.4.4

In the Play 2.4.4 framework, at a seemingly random time, I get the error 'key not found: SOURCE' at deployment. After this happens once there is no way to use that development environment again; I have to go back to a previously saved version of the project and try again. If I make the same code changes (for example, something simple like extending a table in a Play HTML page), the same may or may not happen. (I use IntelliJ IDEA version 15 Ultimate.)
After some research, this error message seems to be related to the generation of the *.template.scala files for the HTML pages of the Play framework.
Suggested older remedies talk about using the 'play clean update' command, but nowadays there seems to be only Activator, and I have not found a way to make it do the cleaning and updating.
Any idea why this is happening almost every 2 or 3 deployments? What can I do to reset the situation? Any suggestions are greatly appreciated.
Stack dump follows for information:
--- (Running the application, auto-reloading is enabled) ---
[info] p.c.s.NettyServer - Listening for HTTP on /0:0:0:0:0:0:0:0:9000
Server started, use Alt+D to stop
[info] Compiling 39 Scala sources and 1 Java source to E:\source\scalaIntelliJ\auctioneer\target\scala-2.11\classes...
java.util.NoSuchElementException: key not found: SOURCE
at scala.collection.MapLike$class.default(MapLike.scala:228)
at scala.collection.AbstractMap.default(Map.scala:58)
at scala.collection.MapLike$class.apply(MapLike.scala:141)
at scala.collection.AbstractMap.apply(Map.scala:58)
at play.twirl.compiler.GeneratedSource.source(TwirlCompiler.scala:129)
at play.twirl.compiler.GeneratedSource.sync(TwirlCompiler.scala:138)
at play.twirl.sbt.TemplateCompiler$$anonfun$syncGenerated$2.apply(TemplateCompiler.scala:38)
at play.twirl.sbt.TemplateCompiler$$anonfun$syncGenerated$2.apply(TemplateCompiler.scala:38)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at play.twirl.sbt.TemplateCompiler$.syncGenerated(TemplateCompiler.scala:38)
at play.twirl.sbt.TemplateCompiler$.compile(TemplateCompiler.scala:23)
at play.twirl.sbt.SbtTwirl$$anonfun$compileTemplatesTask$1.apply(SbtTwirl.scala:87)
at play.twirl.sbt.SbtTwirl$$anonfun$compileTemplatesTask$1.apply(SbtTwirl.scala:86)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:235)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] (compile:twirlCompileTemplates) java.util.NoSuchElementException: key not found: SOURCE
[error] application -
! #6po0l44eg - Internal server error, for (GET) [/] ->
play.sbt.PlayExceptions$UnexpectedException: Unexpected exception[NoSuchElementException: key not found: SOURCE]
at play.sbt.run.PlayReload$$anonfun$taskFailureHandler$1.apply(PlayReload.scala:51) ~[na:na]
at play.sbt.run.PlayReload$$anonfun$taskFailureHandler$1.apply(PlayReload.scala:44) ~[na:na]
at scala.Option.map(Option.scala:145) ~[scala-library-2.11.7.jar:na]
at play.sbt.run.PlayReload$.taskFailureHandler(PlayReload.scala:44) ~[na:na]
at play.sbt.run.PlayReload$.compileFailure(PlayReload.scala:40) ~[na:na]
at play.sbt.run.PlayReload$$anonfun$compile$2$$anonfun$apply$3.apply(PlayReload.scala:20) ~[na:na]
at play.sbt.run.PlayReload$$anonfun$compile$2$$anonfun$apply$3.apply(PlayReload.scala:20) ~[na:na]
at scala.util.Either$LeftProjection.map(Either.scala:377) ~[scala-library-2.11.7.jar:na]
at play.sbt.run.PlayReload$$anonfun$compile$2.apply(PlayReload.scala:20) ~[na:na]
at play.sbt.run.PlayReload$$anonfun$compile$2.apply(PlayReload.scala:18) ~[na:na]
Caused by: java.util.NoSuchElementException: key not found: SOURCE
at scala.collection.MapLike$class.default(MapLike.scala:228) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractMap.default(Map.scala:58) ~[scala-library-2.11.7.jar:na]
at scala.collection.MapLike$class.apply(MapLike.scala:141) ~[scala-library-2.11.7.jar:na]
at scala.collection.AbstractMap.apply(Map.scala:58) ~[scala-library-2.11.7.jar:na]
at play.twirl.compiler.GeneratedSource.source(TwirlCompiler.scala:129) ~[na:na]
at play.twirl.compiler.GeneratedSource.sync(TwirlCompiler.scala:138) ~[na:na]
at play.twirl.sbt.TemplateCompiler$$anonfun$syncGenerated$2.apply(TemplateCompiler.scala:38) ~[na:na]
at play.twirl.sbt.TemplateCompiler$$anonfun$syncGenerated$2.apply(TemplateCompiler.scala:38) ~[na:na]
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) ~[scala-library-2.11.7.jar:na]
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) ~[scala-library-2.11.7.jar:na]
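One thing to try (a hedged suggestion, not a confirmed fix): Activator is a thin wrapper around sbt, so the old play clean corresponds to the standard clean task, which you can run as:

activator clean compile

Deleting the target directory by hand, where the generated *.template.scala sources live, has a similar effect and forces Twirl to regenerate the templates on the next compile.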

Duplicate mappings when building docker package using sbt-native-packager

I'm using sbt-native-packager to build a Docker image of our Akka HTTP-based application in Scala. However, it has recently started throwing the following error when I run the sbt docker:publishLocal command:
[info] Loading project definition from ~/directory/project
[info] Set current project to fortytwo-api (in build file:~/directory/)
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[info] Wrote ~/directory/target/scala-2.11/fortytwo-api_2.11-1.0.pom
java.lang.RuntimeException: Duplicate mappings:
~/directory/target/docker/stage/opt/docker/lib/org.scalaz.scalaz-core_2.11-7.1.0.jar
from
~/.ivy2/maven-cache/org/scalaz/scalaz-core_2.11/7.1.0/scalaz-core_2.11-7.1.0.jar
~/.ivy2/cache/org.scalaz/scalaz-core_2.11/bundles/scalaz-core_2.11-7.1.0.jar
~/directory/target/docker/stage/opt/docker/lib/com.typesafe.config-1.2.1.jar
from
~/.ivy2/maven-cache/com/typesafe/config/1.2.1/config-1.2.1.jar
~/.ivy2/cache/com.typesafe/config/bundles/config-1.2.1.jar
~/directory/target/docker/stage/opt/docker/lib/com.google.protobuf.protobuf-java-2.5.0.jar
from
~/.ivy2/maven-cache/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
~/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
~/directory/target/docker/stage/opt/docker/lib/org.fusesource.leveldbjni.leveldbjni-all-1.7.jar
from
~/.ivy2/maven-cache/org/fusesource/leveldbjni/leveldbjni-all/1.7/leveldbjni-all-1.7.jar
~/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.7.jar
at scala.sys.package$.error(package.scala:27)
at sbt.Sync$.noDuplicateTargets(Sync.scala:67)
at sbt.Sync$$anonfun$apply$1.apply(Sync.scala:25)
at sbt.Sync$$anonfun$apply$1.apply(Sync.scala:22)
at com.typesafe.sbt.packager.Stager$.stageFiles(Stager.scala:26)
at com.typesafe.sbt.packager.Stager$.stage(Stager.scala:40)
at com.typesafe.sbt.packager.docker.DockerPlugin$$anonfun$projectSettings$17.apply(DockerPlugin.scala:117)
at com.typesafe.sbt.packager.docker.DockerPlugin$$anonfun$projectSettings$17.apply(DockerPlugin.scala:117)
at scala.Function3$$anonfun$tupled$1.apply(Function3.scala:35)
at scala.Function3$$anonfun$tupled$1.apply(Function3.scala:34)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:235)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] (docker:stage) Duplicate mappings:
[error] ~/directory/target/docker/stage/opt/docker/lib/org.scalaz.scalaz-core_2.11-7.1.0.jar
[error] from
[error] ~/.ivy2/maven-cache/org/scalaz/scalaz-core_2.11/7.1.0/scalaz-core_2.11-7.1.0.jar
[error] ~/.ivy2/cache/org.scalaz/scalaz-core_2.11/bundles/scalaz-core_2.11-7.1.0.jar
[error] ~/directory/target/docker/stage/opt/docker/lib/com.typesafe.config-1.2.1.jar
[error] from
[error] ~/.ivy2/maven-cache/com/typesafe/config/1.2.1/config-1.2.1.jar
[error] ~/.ivy2/cache/com.typesafe/config/bundles/config-1.2.1.jar
[error] ~/directory/target/docker/stage/opt/docker/lib/com.google.protobuf.protobuf-java-2.5.0.jar
[error] from
[error] ~/.ivy2/maven-cache/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
[error] ~/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
[error] ~/directory/target/docker/stage/opt/docker/lib/org.fusesource.leveldbjni.leveldbjni-all-1.7.jar
[error] from
[error] ~/.ivy2/maven-cache/org/fusesource/leveldbjni/leveldbjni-all/1.7/leveldbjni-all-1.7.jar
[error] ~/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.7.jar
It seems the relatively new sbt cached-resolution feature is causing the problem.
Adding this to your build:
updateOptions := updateOptions.value.withCachedResolution(false)
solved it for me.
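For context, a minimal build.sbt sketch showing where that setting goes (the project name is from the log above; the packaging archetype is an assumption):

// build.sbt
name := "fortytwo-api"
enablePlugins(JavaAppPackaging) // assumption: the usual archetype for a Docker-packaged app

// Cached resolution keeps a separate Maven-style cache (~/.ivy2/maven-cache), so the
// same jar can be staged from two cache locations, producing the duplicate mappings
// above; disabling it leaves one source per artifact.
updateOptions := updateOptions.value.withCachedResolution(false)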

Error while installing PlayFramework

Hi friends,
When I run the command "activator ui", it downloads a lot of files and finally throws an exception. Please find the error below.
Error Message:
======================================================
https://repo.typesafe.com/typesafe/releases/com/typesafe/sbt/client-all-2-11/0.3.5/client-all-2-11-0.3.5.jar ..
[SUCCESSFUL ] com.typesafe.sbt#client-all-2-11;0.3.5!client-all-2-11.jar (10156ms)
:: retrieving :: org.scala-sbt#boot-app
confs: [default]
91 artifacts copied, 0 already retrieved (79872kB/287ms)
Local repository: activator-launcher-local # file:////C:/Users/xxx/activator-1.3.6-minimal/repository
Play server process ID is 6124
[info] play - Application started (Prod)
Oops, cannot start the server.
org.jboss.netty.channel.ChannelException: Failed to bind to: /127.0.0.1:8888
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
at play.core.server.NettyServer$$anonfun$8.apply(NettyServer.scala:138)
at play.core.server.NettyServer$$anonfun$8.apply(NettyServer.scala:135)
at scala.Option.map(Option.scala:146)
at play.core.server.NettyServer.<init>(NettyServer.scala:135)
at play.core.server.NettyServer$.createServer(NettyServer.scala:252)
at play.core.server.NettyServer$$anonfun$main$3.apply(NettyServer.scala:289)
at play.core.server.NettyServer$$anonfun$main$3.apply(NettyServer.scala:284)
at scala.Option.map(Option.scala:146)
at play.core.server.NettyServer$.main(NettyServer.scala:284)
at activator.UIMain$$anonfun$run$1.apply$mcV$sp(UIMain.scala:106)
at activator.UIMain$$anonfun$run$1.apply(UIMain.scala:106)
at activator.UIMain$$anonfun$run$1.apply(UIMain.scala:106)
at activator.UIMain.withContextClassloader(UIMain.scala:217)
at activator.UIMain.run(UIMain.scala:106)
at activator.UIMain.run(UIMain.scala:86)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:18)
at xsbt.boot.Boot$.runImpl(Boot.scala:41)
at xsbt.boot.Boot$.main(Boot.scala:17)
at xsbt.boot.Boot.main(Boot.scala)
Caused by: java.net.BindException: Address already in use: bind
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Unknown Source)
at sun.nio.ch.Net.bind(Unknown Source)
at sun.nio.ch.ServerSocketChannelImpl.bind(Unknown Source)
at sun.nio.ch.ServerSocketAdaptor.bind(Unknown Source)
at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:391)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:315)
at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
I checked the previous questions, but they do not resemble my issue.
You have a process that is already using that port (8888, per your log). You need to kill that process to free it:
On a Mac:
lsof -i tcp:8888 to find the process. Then,
kill -9 <PID> to kill the process.
On Windows:
netstat -a -n -o | find "8888" to find the process. Then,
taskkill /f /pid <PID> to kill it.