I'm quite new to Scala, Akka and SBT, so this issue is giving me a headache! I'm working through the book Akka in Action, where the author provides examples on GitHub:
https://github.com/RayRoestenburg/akka-in-action
On a clean clone of the repository, I was trying to have a look at the example in chapter-cluster.
As described in the book, the first thing to do is to start a seed node on the local machine with:
sbt -DHOST=127.0.0.1 -DPORT=2551
I tried installing another Scala version and running sbt with root privileges, but without success. In my understanding, SBT should take care of all the versioning and download the specified packages.
The error output looks like this:
[info] Loading global plugins from /home/sfink/.sbt/0.13/plugins
[info] Loading project definition from /home/sfink/IdeaProjects/akka-in-action/chapter-cluster/project
[info] Updating {file:/home/sfink/IdeaProjects/akka-in-action/chapter-cluster/project/}chapter-cluster-build...
[info] Resolving org.scala-sbt#compiler-interface;0.13.1 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.scala-lang#scala-library;2.10.4: configuration not found in org.scala-lang#scala-library;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.scalactic#scalactic_2.10;2.2.1 compile
[warn] :: org.scala-lang#scala-reflect;2.10.4: configuration not found in org.scala-lang#scala-reflect;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.scalactic#scalactic_2.10;2.2.1 compile
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
sbt.ResolveException: unresolved dependency: org.scala-lang#scala-library;2.10.4: configuration not found in org.scala-lang#scala-library;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.scalactic#scalactic_2.10;2.2.1 compile
unresolved dependency: org.scala-lang#scala-reflect;2.10.4: configuration not found in org.scala-lang#scala-reflect;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.scalactic#scalactic_2.10;2.2.1 compile
at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:213)
at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:122)
at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:121)
at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
at xsbt.boot.Using$.withResource(Using.scala:11)
at xsbt.boot.Using$.apply(Using.scala:10)
at xsbt.boot.Locks$GlobalLock.withFileLock(Locks.scala:102)
at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
at xsbt.boot.Locks$.apply0(Locks.scala:38)
at xsbt.boot.Locks$.apply(Locks.scala:28)
at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
at sbt.IvySbt.withIvy(Ivy.scala:101)
at sbt.IvySbt.withIvy(Ivy.scala:97)
at sbt.IvySbt$Module.withModule(Ivy.scala:116)
at sbt.IvyActions$.update(IvyActions.scala:121)
at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1161)
at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1159)
at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1182)
at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1180)
at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1184)
at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1179)
at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
at sbt.Classpaths$.cachedUpdate(Defaults.scala:1187)
at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1152)
at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1130)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
at sbt.std.Transform$$anon$4.work(System.scala:64)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:244)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
[error] (*:update) sbt.ResolveException: unresolved dependency: org.scala-lang#scala-library;2.10.4: configuration not found in org.scala-lang#scala-library;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.scalactic#scalactic_2.10;2.2.1 compile
[error] unresolved dependency: org.scala-lang#scala-reflect;2.10.4: configuration not found in org.scala-lang#scala-reflect;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.scalactic#scalactic_2.10;2.2.1 compile
Update 1:
This is the content of the build.sbt file:
name := "words-cluster"
version := "1.0"
scalaVersion := "2.11.7"
organization := "com.manning"
libraryDependencies ++= {
val akkaVersion = "2.4.14"
Seq(
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"com.typesafe.akka" %% "akka-remote" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster" % akkaVersion,
"com.typesafe.akka" %% "akka-multi-node-testkit" % akkaVersion % "test",
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test",
"org.scalatest" %% "scalatest" % "3.0.0" % "test",
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"ch.qos.logback" % "logback-classic" % "1.0.10"
)
}
// Assembly settings
mainClass in Global := Some("aia.cluster.words.Main")
assemblyJarName in assembly := "words-node.jar"
Any idea what is going wrong?
Try this as your build.sbt:
name := "words-cluster"
version := "1.0"
scalaVersion := "2.11.8"
organization := "com.manning"
libraryDependencies ++= {
val akkaVersion = "2.4.14"
Seq(
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"com.typesafe.akka" %% "akka-remote" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster" % akkaVersion,
"com.typesafe.akka" %% "akka-multi-node-testkit" % akkaVersion % "test",
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test",
"org.scalatest" %% "scalatest" % "3.0.1" % "test" exclude("org.scala-lang.modules", "scala-xml_2.11"),
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"ch.qos.logback" % "logback-classic" % "1.1.7"
)
}
// Assembly settings
mainClass in Global := Some("aia.cluster.words.Main")
assemblyJarName in assembly := "words-node.jar"
I updated the Scala, Logback, and ScalaTest versions.
It turns out the sbt version I was using was the wrong one! After upgrading to sbt 0.13.15, compilation ran properly.
Thanks for helping!
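In case it helps anyone else: the sbt launcher version can be pinned per project so that every clone builds with the same sbt. A minimal sketch (assuming the project does not already define this file):
# project/build.properties -- pins the sbt version used for this build
sbt.version=0.13.15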
Related
I am new to Scala and SBT. I am using Kafka streaming and storing the data in a Cassandra DB. While trying to build a fat jar with the sbt assembly command, I get the error below.
How can I resolve this issue and build the fat jar?
build.sbt
organization := "com.example"
name := "cass-conn"
version := "0.1"
scalaVersion := "2.11.8"
val sparkVersion = "2.2.0"
val connectorVersion = "2.0.7"
val kafka_stream_version = "1.6.3"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
"com.datastax.spark" %% "spark-cassandra-connector" % connectorVersion ,
"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0",
"org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided",
)
plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
SBT version : 1.0.3
Error
[error] 1 error was encountered during merge
[error] java.lang.RuntimeException: deduplicate: different file contents found in the following:
[error] C:\Users\gnana\.ivy2\cache\org.apache.spark\spark-streaming-kafka-0-10_2.11\jars\spark-streaming-kafka-0-10_2.11-2.2.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\gnana\.ivy2\cache\org.apache.spark\spark-tags_2.11\jars\spark-tags_2.11-2.2.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\gnana\.ivy2\cache\org.spark-project.spark\unused\jars\unused-1.0.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] at sbtassembly.Assembly$.applyStrategies(Assembly.scala:141)
[error] at sbtassembly.Assembly$.x$1$lzycompute$1(Assembly.scala:25)
[error] at sbtassembly.Assembly$.x$1$1(Assembly.scala:23)
[error] at sbtassembly.Assembly$.stratMapping$lzycompute$1(Assembly.scala:23)
[error] at sbtassembly.Assembly$.stratMapping$1(Assembly.scala:23)
[error] at sbtassembly.Assembly$.inputs$lzycompute$1(Assembly.scala:67)
[error] at sbtassembly.Assembly$.inputs$1(Assembly.scala:57)
[error] at sbtassembly.Assembly$.apply(Assembly.scala:84)
[error] at sbtassembly.Assembly$.$anonfun$assemblyTask$1(Assembly.scala:249)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:42)
[error] at sbt.std.Transform$$anon$4.work(System.scala:64)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:266)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] C:\Users\gnana\.ivy2\cache\org.apache.spark\spark-streaming-kafka-0-10_2.11\jars\spark-streaming-kafka-0-10_2.11-2.2.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\gnana\.ivy2\cache\org.apache.spark\spark-tags_2.11\jars\spark-tags_2.11-2.2.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\gnana\.ivy2\cache\org.spark-project.spark\unused\jars\unused-1.0.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] Total time: 91 s, completed Mar 11, 2018 6:15:45 PM
You need to write a merge strategy in your build.sbt that tells sbt-assembly which UnusedStubClass.class to pick:
organization := "com.example"
name := "cass-conn"
version := "0.1"
scalaVersion := "2.11.8"
val sparkVersion = "2.2.0"
val connectorVersion = "2.0.7"
val kafka_stream_version = "1.6.3"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
"com.datastax.spark" %% "spark-cassandra-connector" % connectorVersion ,
"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0",
"org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided",
)
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
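MergeStrategy.first is fine here because UnusedStubClass is, as the name suggests, an unused placeholder, so it does not matter which copy ends up in the jar. After adding the strategy, rebuild the fat jar, for example:
sbt clean assembly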
Also check your Java version: I had the same issue with newer Java versions and fixed it by downgrading to Java 8.
I am using a CDH cluster with Spark 2.1 and Scala 2.11.8.
I use sbt 1.0.2.
While running assembly, I get the following error:
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
I tried to override the version mismatch using dependencyOverrides and force(), but neither worked.
Error message from sbt assembly
[error] Modules were resolved with conflicting cross-version suffixes in {file:/D:/Tools/scala_ide/test_workspace/test/NewSparkTest/}newsparktest:
[error]    org.scala-lang.modules:scala-xml _2.11, _2.12
[error]    org.scala-lang.modules:scala-parser-combinators _2.11, _2.12
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error] at scala.sys.package$.error(package.scala:27)
[error] at sbt.librarymanagement.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:39)
[error] at sbt.librarymanagement.ConflictWarning$.apply(ConflictWarning.scala:19)
[error] at sbt.Classpaths$.$anonfun$ivyBaseSettings$64(Defaults.scala:1971)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:42)
[error] at sbt.std.Transform$$anon$4.work(System.scala:64)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:266)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (*:update) Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error] Total time: 413 s, completed Oct 12, 2017 3:28:02 AM
build.sbt
name := "newtest"
version := "0.0.2"
scalaVersion := "2.11.8"
sbtPlugin := true
val sparkVersion = "2.1.0"
mainClass in (Compile, run) := Some("com.testpackage.sq.newsparktest")
assemblyJarName in assembly := "newtest.jar"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided",
"org.apache.spark" % "spark-sql_2.11" % "2.1.0" % "provided",
"com.databricks" % "spark-avro_2.11" % "3.2.0",
"org.apache.spark" % "spark-hive_2.11" % "2.1.0" % "provided")
libraryDependencies +=
"log4j" % "log4j" % "1.2.15" excludeAll(
ExclusionRule(organization = "com.sun.jdmk"),
ExclusionRule(organization = "com.sun.jmx"),
ExclusionRule(organization = "javax.jms")
)
resolvers += "SparkPackages" at "https://dl.bintray.com/spark-packages/maven/"
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs # _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
plugins.sbt
dependencyOverrides += ("org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4")
dependencyOverrides += ("org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)
tl;dr Remove sbtPlugin := true from build.sbt (that is for sbt plugins not applications).
You should also remove dependencyOverrides from plugins.sbt.
You should change spark-core_2.11 and the other Spark dependencies in libraryDependencies to be as follows:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"
The change is to use %% (two percent signs) and remove the Scala version suffix from the artifact name, e.g. spark-core above.
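Putting the two changes together, the Spark part of libraryDependencies would look roughly like this (a sketch reusing the versions already in the question):
// Sketch: same artifacts and versions as in the question, but declared with %%
// so sbt appends the Scala binary suffix (_2.11) automatically.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-hive" % "2.1.0" % "provided",
  "com.databricks" %% "spark-avro" % "3.2.0"
)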
I need to plot my dataframe using WISP. I am using Apache Spark in Scala. However, it seems that WISP cannot accept a dataframe.
+-----+--------------------+------------------+
|label| features| prediction|
+-----+--------------------+------------------+
| 101|[1.497846976E9,10...|101.22752534884378|
| 101|[1.497846976E9,10...|101.22752534884378|
| 101|[1.497846976E9,10...|101.22752534884378|
| 101|[1.497846976E9,10...|101.22752534884378|
| 101|[1.497846976E9,10...|101.22752534884378|
+-----+--------------------+------------------+
I also tried to change my dataframe to a Seq using the following code, but it doesn't work.
import org.apache.spark.sql.functions.{collect_list, collect_set}
val label=predictions.groupBy($"label").agg(collect_list($"label").alias("label"))
val predicted= predictions.groupBy($"prediction").agg(collect_list($"prediction").alias("prediction"))
line(predicted)
It shows me the following error:
Error:(157, 10) type mismatch;
found : org.apache.spark.sql.DataFrame
(which expands to) org.apache.spark.sql.Dataset[org.apache.spark.sql.Row]
required: com.quantifind.charts.repl.IterablePair[?,?,?,?]
line(predicted)
Is there any trick to plot a dataframe like this using WISP? Thanks in advance.
UPDATE:
Following the first answer, when I tried to add the Vegas libraries, I got an error:
Error:Error while importing SBT project:<br/>...<br/><pre>[warn] ==== MapR Repository: tried
[warn] http://repository.mapr.com/maven/com/github/aishfenton/vegas-spark_2.10_2.11/0.2.0/vegas-spark_2.10_2.11-0.2.0.pom
[info] Resolving org.scala-lang#scala-compiler;2.11.8 ...
[info] Resolving org.scala-lang.modules#scala-xml_2.11;1.0.4 ...
[info] Resolving org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4 ...
[info] Resolving jline#jline;2.12.1 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.github.aishfenton#vegas-spark_2.10_2.11;0.2.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] com.github.aishfenton:vegas-spark_2.10_2.11:0.2.0 (/Users/saeedtkh/Desktop/ML_Alpha/build.sbt#L15-34)
[warn] +- ml:ml_2.11:1.0
[trace] Stack trace suppressed: run 'last *:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: com.github.aishfenton#vegas-spark_2.10_2.11;0.2.0: not found
[error] (*:update) sbt.ResolveException: unresolved dependency: com.github.aishfenton#vegas-spark_2.10_2.11;0.2.0: not found
[error] Total time: 27 s, completed Sep 22, 2017 1:06:23 PM
See complete log in file:/Users/saeedtkh/Library/Logs/IntelliJIdea2017.1/sbt.last.log
My build.sbt file is:
name := "ML"
version := "1.0"
scalaVersion := "2.11.8"
retrieveManaged := true
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "MapR Repository" at "http://repository.mapr.com/maven/"
libraryDependencies ++= Seq(
"co.theasi" %% "plotly" % "0.2.0",
"org.apache.commons" % "commons-csv" % "1.1",
"org.apache.spark" %% "spark-core" % "2.0.2",
"org.apache.spark" %% "spark-core" % "2.0.2",
"org.apache.spark" %% "spark-sql" % "2.0.2",
"org.apache.spark" %% "spark-hive" % "2.0.2",
"org.apache.spark" %% "spark-streaming" % "2.0.2",
"org.apache.spark" %% "spark-mllib" % "2.0.2",
"org.apache.spark" %% "spark-mllib" % "2.0.2",
"org.scalanlp" %% "breeze" % "0.11.2",
"org.scalanlp" %% "breeze-natives" % "0.11.2",
"org.scalanlp" %% "breeze-viz" % "0.11.2",
"com.quantifind" %% "wisp" % "0.0.4"
// https://mvnrepository.com/artifact/com.github.aishfenton/vegas-spark_2.10
//"com.github.aishfenton" %% "vegas-spark_2.10" % "0.2.0"
)
What exactly are you trying to plot? Have you tried using Vegas, a plotting library for Scala?
An example for getting histograms of a dataframe column:
Add the following lines to your build file.
"org.vegas-viz" %% "vegas" % "0.3.9",
"org.vegas-viz" %% "vegas-spark" % "0.3.9"
Example code:
import vegas._
import vegas.render.WindowRenderer._
import vegas.sparkExt._
val plot = Vegas("approval date").
withDataFrame(castedDf).
mark(Bar).
encodeX("columnName", Quant, bin=Bin(maxbins=20.0), sortOrder=SortOrder.Desc).
show
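As an aside, if you want to stay with WISP, its charts work on local collections, so one option (a rough sketch, not tested against your data) is to collect the prediction column to the driver before plotting:
// Sketch: pull the prediction column out of the DataFrame as a local Seq[Double].
// Assumes `predictions` is the DataFrame from the question and is small enough to collect.
import com.quantifind.charts.Highcharts._

val localPredictions: Seq[Double] =
  predictions.select("prediction").rdd.map(_.getDouble(0)).collect().toSeq

line(localPredictions) // plots the predicted values against their index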
After adding
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
to build.sbt and refreshing the project, I got this message:
SBT project import
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * org.scala-lang.modules:scala-xml_2.11:(1.0.5, 1.0.4)
Changing the above to
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "3.0.0" % "test"
exclude("org.scala-lang", "scala-reflect")
exclude("org.scala-lang.modules", "scala-xml_2.11")
)
solves the issue. However, instead of excluding scala-xml_2.11 version 1.0.5 from ScalaTest, I would like to force the compiler to use scala-xml_2.11 version 1.0.5 instead of version 1.0.4. (I researched the versions at https://mvnrepository.com.) Thus I tried replacing scalaVersion := "2.11.8" with
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-reflect" % "2.11.8",
"org.scala-lang.modules" % "scala-xml_2.11" % "1.0.5"
)
This however results in
SBT project import
[warn] Binary version (2.11) for dependency org.scala-lang#scala-reflect;2.11.8
[warn] in default#myproject$sources_javadoc_2.10;1.0 differs from Scala binary version in project (2.10).
[warn] Binary version (2.11) for dependency org.scala-lang#scala-library;2.11.8
[warn] in default#myproject$sources_javadoc_2.10;1.0 differs from Scala binary version in project (2.10).
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * org.scala-lang:scala-library:(2.11.8, 2.10.4)
[warn] * org.scala-lang:scala-reflect:(2.11.8, 2.10.4)
[warn] [FAILED ] com.artima.supersafe#supersafe_2.10.4;1.1.0!supersafe_2.10.4.jar(src): (0ms)
[warn] ==== local: tried
[warn] /home/user/.ivy2/local/com.artima.supersafe/supersafe_2.10.4/1.1.0/srcs/supersafe_2.10.4-sources.jar
[warn] ==== activator-local: tried
[warn] /Development/Activator/activator-dist-1.3.10/repository/com.artima.supersafe/supersafe_2.1...
What am I supposed to do?
Edit: Here is what else I tried that did not work:
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-reflect" % scalaVersion.value,
"org.scala-lang.modules" %% "scala-xml" % "1.0.5"
)
// ScalaTest
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * org.scala-lang.modules:scala-xml_2.11:(1.0.5, 1.0.4)
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-reflect" % scalaVersion.value,
"org.scala-lang.modules" %% "scala-xml_2.11" % "1.0.5"
)
// ScalaTest
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
Error:Error while importing SBT project:...
[info] Resolving org.scala-sbt#run;0.13.8 ...
[info] Resolving org.scala-sbt#task-system;0.13.8 ...
[info] Resolving org.scala-sbt#tasks;0.13.8 ...
[info] Resolving org.scala-sbt#tracking;0.13.8 ...
[info] Resolving org.scala-sbt#cache;0.13.8 ...
[info] Resolving org.scala-sbt#testing;0.13.8 ...
[info] Resolving org.scala-sbt#test-agent;0.13.8 ...
[info] Resolving org.scala-sbt#test-interface;1.0 ...
[info] Resolving org.scala-sbt#main-settings;0.13.8 ...
[info] Resolving org.scala-sbt#apply-macro;0.13.8 ...
[info] Resolving org.scala-sbt#command;0.13.8 ...
[info] Resolving org.scala-sbt#logic;0.13.8 ...
[info] Resolving org.scala-sbt#precompiled-2_8_2;0.13.8 ...
[info] Resolving org.scala-sbt#precompiled-2_9_2;0.13.8 ...
[info] Resolving org.scala-sbt#precompiled-2_9_3;0.13.8 ...
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[trace] Stack trace suppressed: run 'last *:ssExtractDependencies' for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.scala-lang.modules#scala-xml_2.11_2.11;1.0.5: not found
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: org.scala-lang.modules#scala-xml_2.11_2.11;1.0.5: not found
[error] Total time: 4 s, completed 01.10.2016 17:46:55
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-reflect" % scalaVersion.value,
"org.scala-lang.modules" %% "scala-xml" % "1.0.5"
)
// ScalaTest
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "3.0.0" % "test"
exclude("org.scala-lang", "scala-reflect")
exclude("org.scala-lang.modules", "scala-xml")
)
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * org.scala-lang.modules:scala-xml_2.11:(1.0.5, 1.0.4)
My build.sbt
name := "MyProject"
version := "0.1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-reflect" % scalaVersion.value,
"org.scala-lang.modules" %% "scala-xml" % "1.0.5"
)
// ScalaTest
//libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.0"
//libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "3.0.0" % "test"
exclude("org.scala-lang", "scala-reflect")
exclude("org.scala-lang.modules", "scala-xml_2.11")
)
The way to exclude a specific transitive dependency is this:
Run sbt evicted to figure out which of the project dependencies is pulling in the outdated library; let's assume the problematic library is com.typesafe.slick.
Add the following exclude (the parentheses are important):
("com.typesafe" %% "slick" % "3.1.1").exclude("org.scala-lang.modules", "scala-xml_2.11")
Add this to libraryDependencies as you normally would.
This will prevent sbt from including any version of scala-xml that was coming in as a transitive dependency of Slick.
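In other words, the excluded module sits in libraryDependencies like any other entry; a minimal sketch (with Slick standing in for whatever sbt evicted points at):
libraryDependencies ++= Seq(
  // the parentheses around the ModuleID matter: exclude() is called on the whole dependency
  ("com.typesafe.slick" %% "slick" % "3.1.1").exclude("org.scala-lang.modules", "scala-xml_2.11"),
  "org.scalatest" %% "scalatest" % "3.0.0" % "test"
)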
Don't substitute; you need both scalaVersion and libraryDependencies.
That said, use
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-reflect" % scalaVersion.value,
"org.scala-lang.modules" %% "scala-xml" % "1.0.5"
)
to avoid bugs when you eventually change scalaVersion.
By removing scalaVersion you get the default scalaVersion := "2.10.4" (with your version/settings of SBT, at least), but your libraryDependencies still require 2.11.
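If you would rather force scala-xml 1.0.5 everywhere (which is what the question asks for) instead of excluding it, dependencyOverrides is the usual sbt mechanism; a sketch:
scalaVersion := "2.11.8"

// Keep the dependency declarations as they are and just pin the transitive version.
dependencyOverrides += "org.scala-lang.modules" %% "scala-xml" % "1.0.5"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"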
I'm having a little problem with this template from Typesafe: when I try to import it in IntelliJ I get the following error message:
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: io.spray#sbt-revolver;0.7.2: not found
[warn] :: com.typesafe.sbt#sbt-aspectj;0.10.1: not found
[warn] :: com.typesafe.sbteclipse#sbteclipse-plugin;2.5.0: not found
[warn] :: com.timushev.sbt#sbt-updates;0.1.7: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes.
[warn] io.spray:sbt-revolver:0.7.2 (sbtVersion=0.13, scalaVersion=2.10)
[warn] com.typesafe.sbt:sbt-aspectj:0.10.1 (sbtVersion=0.13, scalaVersion=2.10)
[warn] com.typesafe.sbteclipse:sbteclipse-plugin:2.5.0 (sbtVersion=0.13, scalaVersion=2.10)
[warn] com.timushev.sbt:sbt-updates:0.1.7 (sbtVersion=0.13, scalaVersion=2.10)
[warn]
[warn] Note: Unresolved dependencies path:
[warn] io.spray:sbt-revolver:0.7.2 (sbtVersion=0.13, scalaVersion=2.10) (C:\Users\jlopesde\akka-spray-websocket\project\plugins.sbt#L1-2)
[warn] +- default:akka-spray-websocket-build:0.1-SNAPSHOT (sbtVersion=0.13, scalaVersion=2.10)
[warn] com.typesafe.sbt:sbt-aspectj:0.10.1 (sbtVersion=0.13, scalaVersion=2.10) (C:\Users\jlopesde\akka-spray-websocket\project\plugins.sbt#L3-4)
[warn] +- default:akka-spray-websocket-build:0.1-SNAPSHOT (sbtVersion=0.13, scalaVersion=2.10)
[warn] com.typesafe.sbteclipse:sbteclipse-plugin:2.5.0 (sbtVersion=0.13, scalaVersion=2.10) (C:\Users\jlopesde\akka-spray-websocket\project\plugins.sbt#L5-6)
[warn] +- default:akka-spray-websocket-build:0.1-SNAPSHOT (sbtVersion=0.13, scalaVersion=2.10)
[warn] com.timushev.sbt:sbt-updates:0.1.7 (sbtVersion=0.13, scalaVersion=2.10) (C:\Users\jlopesde\akka-spray-websocket\project\plugins.sbt#L7-8)
[warn] +- default:akka-spray-websocket-build:0.1-SNAPSHOT (sbtVersion=0.13, scalaVersion=2.10)
sbt.ResolveException: unresolved dependency: io.spray#sbt-revolver;0.7.2: not found
unresolved dependency: com.typesafe.sbt#sbt-aspectj;0.10.1: not found
unresolved dependency: com.typesafe.sbteclipse#sbteclipse-plugin;2.5.0: not found
unresolved dependency: com.timushev.sbt#sbt-updates;0.1.7: not found
[error] (*:update) sbt.ResolveException: unresolved dependency: io.spray#sbt-revolver;0.7.2: not found
[error] unresolved dependency: com.typesafe.sbt#sbt-aspectj;0.10.1: not found
[error] unresolved dependency: com.typesafe.sbteclipse#sbteclipse-plugin;2.5.0: not found
[error] unresolved dependency: com.timushev.sbt#sbt-updates;0.1.7: not found
I know there is a problem importing it; probably the Spray or Scala versions aren't correct, but I don't know how to fix it. Could you give me some tips for dealing with problems like this?
Thanks
My build.sbt:
organization := "cua.li"
version := "0.4"
scalaVersion := "2.11.5"
libraryDependencies ++= {
val akkaV = "2.3.9"
val sprayV = "1.3.2"
val kamonV = "0.3.5"
Seq(
"com.wandoulabs.akka" %% "spray-websocket" % "0.1.4" withSources() withJavadoc,
"io.spray" %% "spray-json" % "1.3.1" withSources() withJavadoc,
"io.spray" %% "spray-can" % sprayV withSources() withJavadoc,
"io.spray" %% "spray-routing" % sprayV withSources() withJavadoc,
"com.typesafe.akka" %% "akka-actor" % akkaV withSources() withJavadoc,
"com.typesafe.akka" %% "akka-slf4j" % akkaV withSources() withJavadoc,
/*
"org.aspectj" % "aspectjweaver" % "1.8.4" withSources() withJavadoc,
"io.kamon" %% "kamon-core" % kamonV withSources() withJavadoc,
"io.kamon" %% "kamon-spray" % kamonV withSources() withJavadoc,
"io.kamon" %% "kamon-statsd" % kamonV withSources() withJavadoc,
"io.kamon" %% "kamon-log-reporter" % kamonV withSources() withJavadoc,
"io.kamon" %% "kamon-system-metrics" % kamonV withSources() withJavadoc,
"io.kamon" %% "kamon-testkit" % kamonV % "test" withSources() withJavadoc,
// */
"com.typesafe.akka" %% "akka-testkit" % akkaV % "test" withSources() withJavadoc,
"io.spray" %% "spray-testkit" % sprayV % "test" withSources() withJavadoc,
"org.scalatest" %% "scalatest" % "2.2.3" % "test",
"junit" % "junit" % "4.12" % "test",
"org.specs2" %% "specs2" % "2.4.15" % "test",
"ch.qos.logback" % "logback-classic" % "1.1.2"
)
}
scalacOptions ++= Seq("-deprecation", "-encoding", "UTF-8", "-feature", "-target:jvm-1.7", "-unchecked",
"-Ywarn-adapted-args", "-Ywarn-value-discard", "-Xlint")
javacOptions ++= Seq("-Xlint:deprecation", "-Xlint:unchecked", "-source", "1.7", "-target", "1.7", "-g:vars")
doc in Compile <<= target.map(_ / "none")
publishArtifact in (Compile, packageSrc) := false
logBuffered in Test := false
Keys.fork in Test := false
parallelExecution in Test := false
seq(Revolver.settings: _*)
import com.typesafe.sbt.SbtAspectj._
aspectjSettings
fork in run := true
javaOptions <++= AspectjKeys.weaverOptions in Aspectj
You are probably missing the repositories needed to download the dependencies.
Try putting this before libraryDependencies:
resolvers ++= Seq(
"Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/",
"Spray repository" at "http://repo.spray.io/",
"Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases"
)
I've had the same issue, but only with sbt-revolver. I managed to solve it by doing the following:
My SBT is set up to use my Artifactory instance in order to retrieve dependencies.
1) In Artifactory I added the repository below to the "Remote Repositories" and made it part of the "remote-repos" virtual repository.
http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases
2) In my ~/.sbt/repositories file, I made sure my Artifactory repository is configured with the correct URL format (see below). I took the format from SBT's documentation.
artifactory-ivy: http://my.artifactory.com:8081/artifactory/repo/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
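For context, a complete ~/.sbt/repositories would look something like this (a sketch: my.artifactory.com is the placeholder host from above, and the maven-style line is optional depending on how your virtual repository is set up):
[repositories]
  local
  artifactory-ivy: http://my.artifactory.com:8081/artifactory/repo/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
  artifactory-maven: http://my.artifactory.com:8081/artifactory/repo/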