My build.sbt file (sbt version is 0.13.8):
lazy val commonSettings = Seq(
  version := "1.0.0",
  scalaVersion := "2.11.6"
)

resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "myapp",
    libraryDependencies ++= Seq(
      "com.typesafe.play" % "play-json" % "2.3.4",
      "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test",
      "junit" % "junit" % "4.12" % "test"
    )
  )

scalacOptions ++= Seq("-unchecked", "-feature", "-deprecation")
I get this error when trying to compile my project:
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: com.typesafe.play#play-json_2.11;2.3.4: not found
[error] Total time: 0 s, completed Apr 17, 2015 5:59:28 PM
How can I get the play-json library for Scala 2.11.6?
You need to tell sbt which Scala version it should use.
You can either be explicit:
"com.typesafe.play" % "play-json_2.11" % "2.3.4",
Or use %% (see the sbt documentation) to have sbt append the scalaVersion suffix for you:
"com.typesafe.play" %% "play-json" % "2.3.4",
If you browse the published versions of com.typesafe.play's play-json, you'll find there is no 2.3.4 release; try 2.4.0-M3 instead:
"com.typesafe.play" %% "play-json" % "2.4.0-M3"
Mind the double %% so that scalaVersion is used to resolve the dependency properly.
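For completeness, here is a minimal build.sbt sketch with the fix applied (the surrounding settings are taken from the question above):

scalaVersion := "2.11.6"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix, so this resolves play-json_2.11
  "com.typesafe.play" %% "play-json" % "2.4.0-M3"
)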
Related
Sorry, I am fairly new to Scala and sbt. Here is my build.sbt file:
name := "test_stream"
version := "0.1"
scalaVersion := "2.12.10"
resolvers in ThisBuild += Resolver.bintrayRepo("streetcontxt", "maven")
mainClass in Compile := Some("basepackage.Main")
enablePlugins(JavaAppPackaging)
enablePlugins(DockerPlugin)
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-stream" % "2.6.1",
  "com.amazonaws" % "aws-java-sdk-s3" % "1.11.693",
  "com.streetcontxt" %% "kcl-akka-stream" % "2.0.3",
  "me.maciejb.snappyflows" %% "snappy-flows" % "0.2.0",
  "org.xerial.snappy" % "snappy-java" % "1.1.7.3",
  "org.apache.hadoop" % "hadoop-common" % "2.10.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
And I get the following error:
sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-common;2.10.0: Resolution failed several times for dependency: org.apache.hadoop#hadoop-common;2.10.0 {compile=[default(compile)]}::
Try removing ~/.sbt and ~/.ivy2 and running sbt again.
My sbt file looks as follows:
organization := "scala"
name := "MyProject"
version := "1.0"
scalaVersion := "2.12.1"
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "2.20.0"
libraryDependencies += "commons-net" % "commons-net" % "3.6"
libraryDependencies += "commons-validator" % "commons-validator" % "1.6.0"
When I run sbt compile I get this:
sbt.librarymanagement.ResolveException: unresolved dependency: commons-validator#commons-validator;1.6.0: not found
However, when I change the Scala version to 2.11.7, sbt compiles fine. What am I missing? How can I make it work with 2.12.1?
According to mvnrepository (https://mvnrepository.com/artifact/commons-validator/commons-validator), the artifact is published as version 1.6, not 1.6.0. Use:
libraryDependencies += "commons-validator" % "commons-validator" % "1.6"
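Applied to the build above, only the version string changes; commons-validator is a plain Java library, so the single % (no Scala suffix) is already correct:

libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "2.20.0"
libraryDependencies += "commons-net" % "commons-net" % "3.6"
// "1.6" is the published version; there is no "1.6.0" artifact
libraryDependencies += "commons-validator" % "commons-validator" % "1.6"

(That it appeared to work under 2.11.7 was most likely a locally cached artifact; a dependency declared with a single % carries no Scala suffix, so the Scala version does not affect its resolution.)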
I am building an sbt multi-project build, which has a common module and a logic module, with logic.dependsOn(common).
In common, Spark SQL 2.2.1 ("org.apache.spark" %% "spark-sql" % "2.2.1") is introduced. Spark SQL is also used in logic, but there I get compilation errors saying "object spark is not a member of package org.apache".
Now, if I add the Spark SQL dependency to logic as "org.apache.spark" %% "spark-sql" % "2.2.1", it works. However, if I add it as "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided, I get the same error.
I don't understand why this happens: why can't the dependency be inherited transitively from common to logic?
Here is the root sbt file:
lazy val commonSettings = Seq(
  organization := "...",
  version := "0.1.0",
  scalaVersion := "2.11.12",
  resolvers ++= Seq(
    clojars,
    maven_local,
    novus,
    twitter,
    spark_packages,
    artima
  ),
  test in assembly := {},
  assemblyMergeStrategy in assembly := {...}
)
lazy val root = (project in file(".")).aggregate(common, logic)
lazy val common = (project in file("common")).settings(commonSettings:_*)
lazy val logic = (project in file("logic")).dependsOn(common).settings(commonSettings:_*)
Here is the logic module's sbt file:
libraryDependencies ++= Seq(
  spark_sql.exclude("io.netty", "netty"),
  embedded_elasticsearch % "test",
  scalatest % "test"
)

dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5",
  "org.json4s" %% "json4s-jackson" % "3.2.11"
)
assemblyJarName in assembly := "***.jar"
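A common pattern here (my suggestion, not from the original post) is to declare the Provided dependency once in commonSettings: provided-scoped dependencies are not carried over transitively through dependsOn, so each module that compiles against Spark needs the dependency on its own classpath. A minimal sketch:

// hypothetical: keep the Spark dependency in one place
lazy val sparkSql = "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided

lazy val commonSettings = Seq(
  scalaVersion := "2.11.12",
  // every module mixing in commonSettings compiles against Spark directly,
  // instead of relying on transitive resolution through dependsOn(common)
  libraryDependencies += sparkSql
)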
The runtime classpath, according to show runtime:fullClasspath, contains only target/scala-2.11/classes and ~/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar.
compile:fullClasspath, on the other hand, contains all the libraryDependencies jar locations under ~/.ivy2/cache. Why is this? I am getting java.lang.NoClassDefFoundError on sbt run.
build.sbt:
name := "my-server"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= List(
  "com.typesafe.slick" %% "slick" % "3.1.0" % "provided",
  "com.twitter.finatra" %% "finatra-http" % "2.1.0" % "provided",
  "com.roundeights" %% "hasher" % "1.2.0" % "provided",
  "com.twitter" %% "util-logging" % "6.29.0" % "provided"
)

resolvers += "Twitter" at "http://maven.twttr.com"
resolvers ++= Seq("RoundEights" at "http://maven.spikemark.net/roundeights")
sbt run results:
Exception in thread "main" java.lang.NoClassDefFoundError: com/twitter/logging/Logger
sbt version 0.13.8
Removing "provided" was the fix here - I was using it incorrectly to resolve ambiguous subversions of dependencies (credit to pfn from freenode #scala)
The following is the core of the project build file for a Scalatra/Spark project:
val ScalaVersion = "2.11.6"
val ScalatraVersion = "2.4.0-RC2-2"

// ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }

lazy val project = Project(
  "keywordsservlet",
  file("."),
  settings = ScalatraPlugin.scalatraSettings ++ scalateSettings ++ Seq(
    organization := Organization,
    name := Name,
    version := Version,
    scalaVersion := ScalaVersion,
    resolvers += Classpaths.typesafeReleases,
    resolvers += "Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
    libraryDependencies ++= Seq(
      // "org.scala-lang" % "scala-reflect" % ScalaVersion,
      "org.apache.spark" % "spark-core_2.11" % "1.4.1",
      "org.scalatra" %% "scalatra" % ScalatraVersion,
      "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
      "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
      "ch.qos.logback" % "logback-classic" % "1.1.2" % "runtime",
      "org.eclipse.jetty" % "jetty-webapp" % "9.2.10.v20150310" % "container",
      "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided"
    )
  )
)
Here is the sbt output; notice that it is loading a 2.10 target!
$ sbt
[info] Loading project definition from /shared/keywords/project
[info] Updating {file:/shared/keywords/project/}keywords-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /shared/keywords/project/target/scala-2.10/sbt-0.13/classes...
[info] Set current project to KeywordsServlet (in build file:/shared/keywords/)
So what is happening here?
There is a difference between the version of Scala that your project uses and the version of Scala that sbt itself uses.
sbt 0.13 can compile Scala 2.9, 2.10, and 2.11 (and 2.12). However, when it compiles your build.sbt or Build.scala files, sbt 0.13 uses Scala 2.10, because sbt 0.13 is itself built on 2.10.
Similarly, all of the plugins that sbt uses are compiled with 2.10. (sbt 0.12, by contrast, used Scala 2.9.)
So the scala-2.10 in the log refers only to the build definition under /shared/keywords/project; your application sources will still be compiled with Scala 2.11.6.
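To make the distinction concrete, here is a minimal illustration (the file contents are assumptions for illustration, not from the original project):

// project/build.properties pins the sbt version, and with it the Scala
// version used to compile the build definition (2.10 for sbt 0.13):
//   sbt.version=0.13.8

// build.sbt is itself compiled with Scala 2.10 by sbt 0.13, but it
// configures your application sources to be compiled with Scala 2.11.6:
scalaVersion := "2.11.6"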