Apache Ignite integration with scala-spark using sbt - scala

I am trying to integrate Ignite into Scala code and run the application using sbt. I cannot use any IDE for this.
Scala version - 2.11.0
Spark version - 2.3.0
Ignite version - 2.8.0
Sbt version - 1.3.3
I have tried adding the basic library dependency in build.sbt:
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
my complete build.sbt is:
scalaVersion := "2.11.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
but it is still not working; I am getting the following error:
$ sbt package
[info] Loading project definition from /home/testing/project
[info] Loading settings for project testing from build.sbt ...
[info] Set current project to testing (in build file:/home/testing/)
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:245)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$34(CoursierDependencyResolution.scala:214)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:214)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:52)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:102)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:69)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:115)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:115)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:96)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:150)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error] at sbt.Classpaths$.$anonfun$updateTask0$5(Defaults.scala:2946)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] Total time: 5 s, completed May 28, 2020 6:24:04 AM
I am working on Ubuntu:latest docker image.
I am sure that Spark and Ignite themselves are working, as I can also run PySpark just fine. Please help me out; being a newbie, I think I am making some minor mistake that is becoming a major issue here.

It seems that sbt tries to pull "ignite-spark_2.11", not just "ignite-spark":
"not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom"
use
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.8.0"
instead of
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
%% -> resolves the dependency with the Scala binary version appended to the artifact name (e.g. ignite-spark_2.11)
% -> resolves the artifact name as-is, with no Scala version appended
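For reference, here is a minimal build.sbt sketch with the corrected Ignite coordinate; the Spark artifacts really are cross-published per Scala version, so they keep %%:
scalaVersion := "2.11.0"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"   % "2.3.0",  // resolves spark-core_2.11
  "org.apache.spark"  %% "spark-sql"    % "2.3.0",  // resolves spark-sql_2.11
  "org.apache.ignite" %  "ignite-spark" % "2.8.0"   // published without a Scala suffix
)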

Related

intellij: errors with scala, spark

I have a Scala project using Spark libraries, and it works fine most of the time (using IntelliJ). But sometimes it starts giving errors on IntelliJ launch:
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] Total time: 1 s, completed 17 Sep 2022, 15:13:49
[info] shutting down sbt server
build.sbt is:
/*
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.8"

lazy val root = (project in file("."))
  .settings(
    name := "spark-learning"
  )
*/

// Name of the package
name := "spark-learning"
// Version of our package
version := "1.0"
// Version of Scala
scalaVersion := "2.12.14"
// Spark library dependencies
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.0.0-preview2",
  "org.apache.spark" %% "spark-sql" % "3.0.0-preview2"
)
What causes these issues all of a sudden? And how can I get rid of them?
One weird thing in the error messages is the path where it looks for dependencies: ...ivy2\localorg.apache.spark\....
There should be a \ after local.
Any chance your SBT repositories configuration could be messed up? I'm not sure where it lives on Windows; it's typically /etc/sbt/repositories on Linux, but maybe IntelliJ has its own setting.
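For comparison, a stock sbt repositories file (the Linux path above; the Windows/IntelliJ location is the open question here) typically contains just the predefined resolver names:
[repositories]
  local
  maven-central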

How do I add an SBT plugin resolver on CircleCI?

I am trying to automate CI/CD of a small Scala project using CircleCI. The project is built using sbt, and tested using the ScalaTest library.
As per the ScalaTest installation instructions' recommendation, I am using the SuperSafe compiler plugin, which required me to add a resolver to the global file ~/.sbt/1.0/global.sbt:
resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"
I can successfully compile and test my project locally; however, on CircleCI the build fails with this error:
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.artima.supersafe#supersafe_2.12.8;1.1.7: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] com.artima.supersafe:supersafe_2.12.8:1.1.7 (Defaults.scala#L3331)
[warn] +- filesystem:filesystem_2.12:0.1
[error] sbt.librarymanagement.ResolveException: unresolved dependency: com.artima.supersafe#supersafe_2.12.8;1.1.7: not found
[error] at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:332)
[error] at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error] at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:239)
[error] at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error] at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error] at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error] at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error] at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error] at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error] at xsbt.boot.Using$.withResource(Using.scala:10)
[error] at xsbt.boot.Using$.apply(Using.scala:9)
[error] at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error] at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error] at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error] at xsbt.boot.Locks$.apply(Locks.scala:28)
[error] at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error] at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:238)
[error] at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error] at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:45)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:93)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:106)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:106)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:89)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:120)
[error] at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2561)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:40)
[error] at sbt.std.Transform$$anon$4.work(System.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:269)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:278)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:269)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: com.artima.supersafe#supersafe_2.12.8;1.1.7: not found
This error is to be expected when the required resolver hasn't been added in SBT -- see e.g. this issue. I am new to CircleCI and don't know where its SBT globals dir would be located, or how to modify the file.
Instead I've tried to add the resolver to my project's ./project/plugins.sbt file directly, but this has not fixed the issue.
The SBT and CircleCI config files look as follows:
./build.sbt
name := "my-project-name"
version := "0.1"
scalaVersion := "2.12.8"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.8"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % "test"
./project/build.properties
sbt.version = 1.2.8
./project/plugins.sbt
resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"
addSbtPlugin("com.artima.supersafe" % "sbtplugin" % "1.1.7")
./.circleci/config.yml
(the default Scala config provided by CircleCI)
# Scala CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/sample-config/ for more details
#
version: 2
jobs:
  build:
    docker:
      # specify the version you desire here
      - image: circleci/openjdk:8-jdk

      # Specify service dependencies here if necessary
      # CircleCI maintains a library of pre-built images
      # documented at https://circleci.com/docs/2.0/circleci-images/
      # - image: circleci/postgres:9.4

    working_directory: ~/repo

    environment:
      # Customize the JVM maximum heap limit
      JVM_OPTS: -Xmx3200m
      TERM: dumb

    steps:
      - checkout

      # Download and cache dependencies
      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "build.sbt" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-

      - run: cat /dev/null | sbt test:compile

      - save_cache:
          paths:
            - ~/.m2
          key: v1-dependencies--{{ checksum "build.sbt" }}

      # run tests!
      - run: cat /dev/null | sbt test:test
I want a successful CircleCI build, which will require a way to add the resolver, either in the project's own files or by declaring it in the SBT globals file of the CircleCI container.
Consider the open issue SBT isn't using resolvers defined in project/plugins.sbt #4103. Try scoping the resolver to ThisBuild and putting it in both build.sbt and plugins.sbt, like so:
// someApp/build.sbt
resolvers in ThisBuild += "Artima Maven Repository" at "http://repo.artima.com/releases"
and
// someApp/project/plugins.sbt
resolvers in ThisBuild += "Artima Maven Repository" at "http://repo.artima.com/releases"
addSbtPlugin("com.artima.supersafe" % "sbtplugin" % "1.1.7")
This seems to have worked on this example repo.
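If scoping alone doesn't help, another option (a sketch, untested) is to have CircleCI write the resolver into the container's global sbt file before compiling, mirroring the local ~/.sbt/1.0/global.sbt setup, e.g. as an extra step in config.yml:
      - run:
          name: Add Artima resolver to the global sbt config
          command: |
            mkdir -p ~/.sbt/1.0
            echo 'resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"' > ~/.sbt/1.0/global.sbt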

Calling clojure code from a scala sbt task

I am trying to call some Clojure code from an sbt task.
My build.sbt looks like this:
lazy val aTask = taskKey[Unit]("a task")

libraryDependencies ++= Seq(
  "org.clojure" % "clojure" % "1.9.0"
)

import clojure.java.api.Clojure
import clojure.lang.IFn

aTask := {
  val plus: IFn = Clojure.`var`("clojure.core", "+")
  println(plus.invoke(1, 4))
}
Contents of project/build.sbt
resolvers += Resolver.mavenLocal

libraryDependencies ++= Seq(
  "org.clojure" % "clojure" % "1.9.0"
)
As shown above, I have also added the Clojure dependency in my project's project/build.sbt.
I am getting the following error when calling the task:
[error] java.lang.ExceptionInInitializerError
[error] at clojure.lang.Namespace.<init>(Namespace.java:34)
[error] at clojure.lang.Namespace.findOrCreate(Namespace.java:176)
[error] at clojure.lang.Var.intern(Var.java:148)
[error] at clojure.java.api.Clojure.var(Clojure.java:82)
[error] at clojure.java.api.Clojure.<clinit>(Clojure.java:96)
[error] at $2d5a9b65ddee7e6a09cc$.$anonfun$$sbtdef$1(build.sbt:20)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] at sbt.std.Transform$$anon$3.$anonfun$apply$2(System.scala:46)
[error] at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:271)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] Caused by: java.io.FileNotFoundException: Could not locate clojure/core__init.class or clojure/core.clj on classpath.
[error] at clojure.lang.RT.load(RT.java:463)
[error] at clojure.lang.RT.load(RT.java:426)
[error] at clojure.lang.RT.doInit(RT.java:468)
[error] at clojure.lang.RT.<clinit>(RT.java:336)
Any pointers on what I could try would be helpful.
I think there is an issue with the class loaders set up by sbt. Clojure's RT class loads Clojure namespaces/classes using the class loader API. If sbt configures the class loader hierarchy and class loading strategy (e.g. parent first) in a way that prevents RT's class loader from finding Clojure's classes, then it will fail with the error you are getting.
Unfortunately, I don't know the sbt internals well enough to determine how the class loaders get configured. Maybe another question would help in the investigation: How to display the classpath used for the run task?
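Building on that hypothesis, one experiment (a sketch, untested) is to pin the thread's context class loader to the loader that loaded the Clojure API classes before calling into Clojure, since Clojure's RT consults the context class loader by default:
aTask := {
  val previous = Thread.currentThread().getContextClassLoader
  // make Clojure's RT resolve clojure/core.clj from the same classpath
  // that provided the clojure.jar classes
  Thread.currentThread().setContextClassLoader(classOf[IFn].getClassLoader)
  try {
    val plus: IFn = Clojure.`var`("clojure.core", "+")
    println(plus.invoke(1, 4))
  } finally {
    Thread.currentThread().setContextClassLoader(previous)
  }
}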

Embedded Cassandra with Spark and DataStax Driver Error

We are using the following dependencies for our project:
val cassandraConnector = "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0"
val sparkSql = "org.apache.spark" % "spark-sql_2.11" % "2.2.0"
val phantomDsl = "com.outworkers" % "phantom-dsl_2.11" % "2.15.5"
val cassandraUnit = "org.cassandraunit" % "cassandra-unit" % "3.3.0.2" % "test"
While running the test cases using the command sbt compile test, we are getting the following error:
[error] java.lang.IncompatibleClassChangeError: Class com.datastax.driver.core.DefaultResultSetFuture does not implement the requested interface com.google.common.util.concurrent.ListenableFuture
[error] at com.google.common.util.concurrent.Futures.addCallback(Futures.java:1776)
[error] at com.outworkers.phantom.builder.query.execution.PromiseInterface$$anon$1.fromGuava(ExactlyOncePromise.scala:56)
[error] at com.outworkers.phantom.builder.query.execution.PromiseInterface$$anon$1.fromGuava(ExactlyOncePromise.scala:63)
[error] at com.outworkers.phantom.builder.query.execution.GuavaAdapter$class.fromGuava(ExecutableStatements.scala:44)
[error] at com.outworkers.phantom.builder.query.execution.PromiseInterface$$anon$1.fromGuava(ExactlyOncePromise.scala:39)
[error] at com.outworkers.phantom.builder.query.execution.ExecutableStatements$$anonfun$future$1.apply(ExecutableStatements.scala:112)
[error] at com.outworkers.phantom.builder.query.execution.ExecutableStatements$$anonfun$future$1.apply(ExecutableStatements.scala:110)
[error] at scala.collection.immutable.List.foreach(List.scala:381)
[error] at com.outworkers.phantom.builder.query.execution.ExecutableStatements.future(ExecutableStatements.scala:110)
[error] at com.outworkers.phantom.ops.DbOps.truncateAsync(DbOps.scala:104)
[error] at com.outworkers.phantom.ops.DbOps.truncate(DbOps.scala:94)
[error] at com.knoldus.cassandra.CassandraDatabaseCluster$class.afterAll(CassandraDatabaseCluster.scala:29)
[error] at com.knoldus.model.PredicateHashingSuite.afterAll(PredicateHashingSuite.scala:7)
[error] at org.scalatest.BeforeAndAfterAll$$anonfun$1.apply$mcV$sp(BeforeAndAfterAll.scala:225)
[error] at org.scalatest.Status$$anonfun$withAfterEffect$1.apply(Status.scala:379)
[error] at org.scalatest.Status$$anonfun$withAfterEffect$1.apply(Status.scala:375)
[error] at org.scalatest.FailedStatus$.whenCompleted(Status.scala:497)
[error] at org.scalatest.Status$class.withAfterEffect(Status.scala:375)
[error] at org.scalatest.FailedStatus$.withAfterEffect(Status.scala:469)
[error] at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:223)
[error] at com.knoldus.model.PredicateHashingSuite.run(PredicateHashingSuite.scala:7)
[error] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[error] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
[error] at sbt.TestRunner.runTest$1(TestFramework.scala:102)
[error] at sbt.TestRunner.run(TestFramework.scala:113)
[error] at sbt.TestFramework$$anon$2$$anonfun$$lessinit$greater$1.$anonfun$apply$1(TestFramework.scala:258)
[error] at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:229)
[error] at sbt.TestFramework$$anon$2$$anonfun$$lessinit$greater$1.apply(TestFramework.scala:258)
[error] at sbt.TestFramework$$anon$2$$anonfun$$lessinit$greater$1.apply(TestFramework.scala:258)
[error] at sbt.TestFunction.apply(TestFramework.scala:267)
[error] at sbt.Tests$.$anonfun$toTask$1(Tests.scala:276)
[error] at sbt.std.Transform$$anon$3.$anonfun$apply$2(System.scala:44)
[error] at sbt.std.Transform$$anon$4.work(System.scala:64)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:266)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
Is there something wrong with the dependencies?
How can we get rid of this error?
Thanks
That's really interesting; however, it's nothing to do with phantom. It's to do with the Guava library dependency, which somehow ends up clashing with what is on the classpath.
In short, the problem comes from a binary incompatibility/exclusion: Spark or something else on your classpath overrides the version of Guava that phantom expects.
If you print out the evictions, I can help you further.
We also offer phantom-sbt as an integrated SBT plugin for Cassandra, which I would suspect as the root cause here, as well as a Docker integration via SBT (pro-only feature). Maybe file a GitHub issue for phantom, and we can follow up there more easily as soon as we know which versions clash.
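To print those evictions, sbt's built-in evicted task is one way; and if Guava turns out to be the clash, forcing a single version build-wide is a common workaround (the Guava version below is illustrative, not verified against these libraries):
// in the sbt shell, run `evicted` to list dependency versions that lost
// out during resolution, then pin one Guava version for the whole build
dependencyOverrides += "com.google.guava" % "guava" % "19.0"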

Play Framework migration from 2.3 to 2.4

In plugins.sbt change
From: addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.6")
To: addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.6")
In build.properties change
From: sbt.version=0.13.5
To: sbt.version=0.13.8
And I get this error:
[error] sbt.IncompatiblePluginsException: Binary incompatibility in plugins detected.
[error] Note that conflicts were resolved for some dependencies:
[error] org.apache.commons:commons-compress
[error] org.tukaani:xz
[error] org.codehaus.plexus:plexus-utils
[error] com.google.guava:guava
[error] org.codehaus.plexus:plexus-classworlds
[error] com.typesafe.akka:akka-actor_2.10
[error] com.typesafe:config
[error] org.slf4j:slf4j-api
[error] org.apache.commons:commons-lang3
[error] org.fusesource.leveldbjni:leveldbjni
[error] org.webjars:webjars-locator
[error] com.typesafe.sbt:sbt-web
[error] com.typesafe.sbt:sbt-js-engine
[error] Use 'last' for the full log.
Does someone know how to fix that?
The problem was with the play2-war-plugin plugin.
Just change its version from 1.3-beta1 to 1.4-beta1 and the project compiles successfully.
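Put together, the migrated plugins.sbt would then look something like this (the play2-war coordinates are the ones the plugin usually publishes under; double-check them for your setup):
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.6")
addSbtPlugin("com.github.play2war" % "play2-war-plugin" % "1.4-beta1")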