Why does sbt give a "multiple dependencies" warning with Akka and ScalaTest dependencies?

I've just added ScalaTest to build.sbt so it now looks as follows:
name := "appname"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.4.1",
  "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)
After that, during the SBT project import I receive the warning message:
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * org.scala-lang:scala-reflect:(2.11.2, 2.11.7)
[warn] * org.scala-lang.modules:scala-xml_2.11:(1.0.2, 1.0.4)
I also tried changing the ScalaTest line to:
"org.scalatest" %% "scalatest" % "2.2.4" % "test"
but the warning remains the same as above.
How can I deal with this issue, given that I haven't written "reflect" or "xml" anywhere in my project? I am using the newest version of both Akka and ScalaTest, and Scala version 2.11.

One solution is to explicitly add one of the versions proposed by sbt. All the warnings go away when libraryDependencies is:
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.4.1",
  "org.scalatest" %% "scalatest" % "2.2.4" % "test",
  "org.scala-lang" % "scala-reflect" % "2.11.7",
  "org.scala-lang.modules" %% "scala-xml" % "1.0.4"
)
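Alternatively, you can pin the versions without making scala-reflect and scala-xml direct dependencies of your project. A minimal sketch, assuming sbt 0.13.x, where dependencyOverrides is a Set (in sbt 1.x it is a Seq):
// Force one version for each conflicting transitive dependency;
// unlike libraryDependencies, this does not add them to the project
// as direct dependencies.
dependencyOverrides ++= Set(
  "org.scala-lang" % "scala-reflect" % "2.11.7",
  "org.scala-lang.modules" %% "scala-xml" % "1.0.4"
)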

In your particular case this is related to sbt issue 1933, and you can ignore it for now. You can also explicitly specify the version of the dependency you need to silence the warnings.

This problem is fixed in sbt 0.13.12.
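To pick up the fix, point your project at that release in project/build.properties, e.g.:
sbt.version = 0.13.12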

Related

SBT: cannot resolve dependency that used to work before

My build.sbt looks like this:
import sbt._
name := "spark-jobs"
version := "0.1"
scalaVersion := "2.11.8"
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
// additional libraries
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-streaming_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.2.0"
)
assemblyMergeStrategy in assembly := {
  // discard META-INF entries, which typically collide across jars
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  // for everything else, keep the first occurrence
  case x => MergeStrategy.first
}
This used to work until I decided to see what happens if I added another % "provided" at the end of spark-streaming_2.11. That failed to resolve the dependency, so I reverted the change, but I keep getting the exception even after that. Now my build.sbt looks exactly like it did when everything worked. Still, it gives me this exception:
[error] (*:update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-streaming_2.11;2.2.0: org.apache.spark#spark-parent_2.11;2.2.0!spark-parent_2.11.pom(pom.original) origin location must be absolute: file:/home/aswin/.m2/repository/org/apache/spark/spark-parent_2.11/2.2.0/spark-parent_2.11-2.2.0.pom
sbt's behavior is a bit confusing to me. Could someone explain why this happens? Any good blogs or resources for understanding how sbt works under the hood are also welcome.
Here is my project/assembly.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")
project/build.properties:
sbt.version = 1.0.4
project/plugins.sbt:
resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
Thank you!
If you are in the sbt console, just run the reload command and try again. After you update your dependencies or sbt plugins, you need to reload the project so that the changes take effect.
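For example, at the sbt prompt:
> reload
> update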
By the way, instead of hard-coding the Scala version into your dependencies, you can just use the %% operator and sbt will fetch the appropriate artifact for your declared Scala version.
// additional libraries
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
)
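For instance, with scalaVersion := "2.11.8", the following two declarations request the same artifact, since %% appends the Scala binary version to the artifact name:
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
"org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided"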

Issue importing Akka packages

I have the following build.sbt file:
name := "akkaHttp"
version := "1.0"
scalaVersion := "2.12.0"
resolvers ++= Seq("Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/",
Resolver.bintrayRepo("hseeberger", "maven"))
libraryDependencies ++= {
  val AkkaVersion = "2.3.9"
  val AkkaHttpVersion = "2.0.1"
  val Json4sVersion = "3.2.11"
  Seq(
    "com.typesafe.akka" %% "akka-slf4j" % AkkaVersion,
    "com.typesafe.akka" %% "akka-http-experimental" % AkkaHttpVersion,
    "ch.qos.logback" % "logback-classic" % "1.1.2",
    "org.json4s" %% "json4s-native" % Json4sVersion,
    "org.json4s" %% "json4s-ext" % Json4sVersion,
    "de.heikoseeberger" %% "akka-http-json4s" % "1.4.2"
  )
}
With it, these dependencies cannot be resolved:
Error:Unresolved dependencies:
com.typesafe.akka#akka-slf4j_2.12;2.3.9: not found
com.typesafe.akka#akka-http-experimental_2.12;2.0.1: not found
org.json4s#json4s-native_2.12;3.2.11: not found
org.json4s#json4s-ext_2.12;3.2.11: not found
de.heikoseeberger#akka-http-json4s_2.12;1.4.2: not found
So what should I add to it such that the imports work?
Notice the _2.12 appended to the artifacts? Based on your scalaVersion, SBT tried to download the dependencies built against Scala 2.12.x, but couldn't find any. Try using Scala version 2.11.8.
Different versions of Scala can be binary incompatible, so libraries are cross-built against those different versions and published to the repositories. It looks like that hasn't happened yet for the libraries above.
Note that Scala 2.12 is indeed not binary compatible with Scala 2.11.
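As a sketch, the smallest change to the build.sbt above that should make these dependencies resolve is downgrading the Scala version:
// Akka 2.3.9, akka-http-experimental 2.0.1, and json4s 3.2.11 have
// 2.11 builds published, but no 2.12 builds:
scalaVersion := "2.11.8"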

Scala sbt: Multiple dependencies in sbt

I am new to Scala and am following this tutorial to create a Scala sbt project:
https://www.youtube.com/watch?v=Ok7gYD1VbNw
After adding
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
to build.sbt and refreshing the project, I got this message:
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * org.scala-lang:scala-reflect:(2.11.2, 2.11.7)
[warn] * org.scala-lang.modules:scala-xml_2.11:(1.0.2, 1.0.4)
Also, in build.sbt the word 'scalatest' is highlighted in red, which means it is an unresolved dependency.
Should I change something in the project settings, like the Scala SDK?
Best regards!
You could try adding these dependencies explicitly:
libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-reflect" % "2.11.7",
  "org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4"
)
This forces the build to choose a concrete version of each library. It solved the problem for me.
I was able to resolve this by excluding these from the scalatest dependency.
libraryDependencies ++= Seq(
  "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
    exclude("org.scala-lang", "scala-reflect")
    exclude("org.scala-lang.modules", "scala-xml_2.11")
)

module not found: io.spray#sbt-revolver;0.7.2 using IntelliJ 14.1.4 with sbt 0.13.5

I have tried many things but can't seem to get this to work.
I have the following in my build.sbt
name := "MyTestApp"
version := "0.1-SNAPSHOT"
scalaVersion := "2.10.3"
resolvers ++= Seq("spray repo" at "http://repo.spray.io",
"Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases/",
"Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
"Plugin Releases" at
"http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases")
libraryDependencies ++= Seq(
  "io.spray" %% "spray-routing" % "1.3.2",
  "io.spray" %% "spray-can" % "1.3.2",
  "com.typesafe.akka" %% "akka-actor" % "2.2.3",
  "com.typesafe.akka" %% "akka-slf4j" % "2.2.3",
  "org.slf4j" % "slf4j-simple" % "1.6.4",
  "io.spray" % "sbt-revolver" % "0.7.2"
)
I have also added the following to my plugins.sbt file
addSbtPlugin("io.spray" % "sbt-revolver" % "0.7.2")
I am using scala 2.10 with sbt version 0.13.5
My build log indicates it is trying to resolve from the following location:
==== Plugin Releases: tried
[warn] http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/io/spray/sbt-revolver/0.7.2/sbt-revolver-0.7.2.pom
Browsing the repository, it seems it would resolve if it were looking in:
http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/io.spray/sbt-revolver/scala_2.10/sbt_0.13/0.7.2/
Any ideas as to why io.spray has turned into io/spray in the URL?
Any help is deeply appreciated.
Use this:
Resolver.url("Plugin Releases", url("http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)
instead of:
"Plugin Releases" at "http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases"
This makes sbt use Ivy-style patterns, which match the directory layout of that repository (io.spray/sbt-revolver/scala_2.10/sbt_0.13/... rather than the Maven-style io/spray/...).
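Putting it together, project/plugins.sbt would look something like this sketch, assembled from the snippets above:
// project/plugins.sbt
resolvers += Resolver.url(
  "Plugin Releases",
  url("http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/")
)(Resolver.ivyStylePatterns)

addSbtPlugin("io.spray" % "sbt-revolver" % "0.7.2")
Note that sbt-revolver is an sbt plugin, so it belongs here via addSbtPlugin, not in the libraryDependencies of build.sbt.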

Mllib dependency error

I'm trying to build a very simple Scala standalone app using MLlib, but I get the following error when trying to build the program:
object mllib is not a member of package org.apache.spark
Then I realized that I have to add MLlib as a dependency, as follows:
version := "1"
scalaVersion :="2.10.4"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "org.apache.spark" %% "spark-mllib" % "1.1.0"
)
But here I got an error that says:
unresolved dependency spark-core_2.10.4;1.1.1 : not found
so I had to modify it to:
"org.apache.spark" % "spark-core_2.10" % "1.1.1",
But there is still an error that says:
unresolved dependency spark-mllib;1.1.1 : not found
Does anyone know how to add the MLlib dependency in the .sbt file?
As @lmm pointed out, you can instead include the libraries as:
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.1.0",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
In sbt, %% appends your Scala version to the artifact name; you are building with Scala 2.10.4, whereas the Spark artifacts are published against 2.10 in general.
It should be noted that if you are going to build an assembly jar to deploy your application, you may wish to mark spark-core as provided, e.g.:
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.1.0" % "provided",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
since the spark-core package will be on the executor's classpath anyway.
Here is another way to add the dependency to your build.sbt file if you're using the Databricks sbt-spark-package plugin:
sparkComponents ++= Seq("sql", "hive", "mllib")
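For context, a minimal sketch of the corresponding build.sbt fragment, assuming the sbt-spark-package plugin is enabled and that sparkVersion is the plugin's setting for selecting the Spark release (an assumption, not part of the original answer):
// requires addSbtPlugin for sbt-spark-package in project/plugins.sbt
sparkVersion := "1.1.0"  // assumed plugin setting
sparkComponents ++= Seq("sql", "hive", "mllib")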