I'm configuring SLF4J within an SBT application and the Test vs Runtime scopes are working differently than I'd expect.
The setup I want:
tests (sbt test): use slf4j-simple as the implementation
bundle/production runtime (sbt run): use log4j-slf4j-impl
Relevant build.sbt (sbt 0.13) section:
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.25" % Test,
libraryDependencies += "org.apache.logging.log4j" % "log4j-api" % 2.8.2 % Runtime,
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % 2.8.2 % Runtime,
libraryDependencies += "org.apache.logging.log4j" % "log4j-slf4j-impl" % 2.8.2 % Runtime
The error I'm getting is that there are two SLF4J bindings present on the test classpath: the log4j one and the simple one.
I'm wondering how the Runtime dependencies can be excluded from the Test scope, or if this is the wrong approach here.
To distill the question: I want to use a few different jars at runtime vs test that are mutually exclusive. How can this be done in sbt 0.13?
The issue is that the Test configuration extends Runtime (which in turn extends Compile), so anything you add to Runtime also ends up on the Test classpath.
You can try to exclude log4j-slf4j-impl from the Test classpath like this:
fullClasspath.in(Test) := fullClasspath.in(Test).value.filterNot(_.data.getName.contains("log4j-slf4j-impl"))
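Putting it together, a build.sbt sketch for sbt 0.13 might look like this (coordinates and versions taken from the question; the filter simply drops the log4j binding from the test classpath so only slf4j-simple is bound during sbt test):

libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-simple" % "1.7.25" % Test,
  "org.apache.logging.log4j" % "log4j-api" % "2.8.2" % Runtime,
  "org.apache.logging.log4j" % "log4j-core" % "2.8.2" % Runtime,
  "org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.8.2" % Runtime
)

// Remove the log4j binding from the test classpath; slf4j-simple remains the only binding for `sbt test`.
fullClasspath in Test := (fullClasspath in Test).value.filterNot(_.data.getName.contains("log4j-slf4j-impl"))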
Related
I have a library that I wish to use both in my unit tests and in the main Scala code itself.
In sbt, when I add the library dependency with the "test" configuration, it is available for tests but I cannot use it in the main code. If I leave the configuration off, or set it to "compile", it is not available to import in the unit tests.
libraryDependencies ++= Seq(
  "org.scalacheck" %% "scalacheck" % "1.14.0",
  "org.scalatest" %% "scalatest" % "3.0.6" % "test",
  "org.scalactic" %% "scalactic" % "3.0.6" % "test")
The main problem is that I expose an abstract class I want to use all over the place in other code: abstract class UnitSpec extends FlatSpec with Matchers with ScalaCheckDrivenPropertyChecks, and also use it in the tests of the library itself. If I add "test" to ScalaCheck, it cannot be found in the main code of the library. If I leave it as is, it cannot import org.scalatestplus.scalacheck.ScalaCheckDrivenPropertyChecks. This used to be OK and work fine with 3.0.5 and GeneratorDrivenPropertyChecks, but that has been deprecated.
Is there a way to achieve what I want? I tried "test->compile" but that also doesn't do what I had hoped...
You can combine configurations. To have a library in both compile and test, you just add both configurations.
// wrong (see Update 2 below): libraryDependencies += "<organization>" %% "<module>" % "<version>" % "compile->compile" % "test->compile"
The syntax roughly means: the project's configuration depends on (->) the named configuration of the library dependency.
Update
You can also add the dependency twice with different configurations.
libraryDependencies += "<organization>" %% "<module>" % "<version>",
libraryDependencies += "<organization>" %% "<module>" % "<version>" % "test"
Update 2
I think the syntax in the first example is not what I meant to provide.
libraryDependencies += "<organization>" %% "<module>" % "<version>" % "compile->compile;test->compile"
At least that is what I use in my libraryDependencies.
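For example, applied to the ScalaTest coordinate from the question above (purely as an illustration), the combined mapping reads: our compile configuration and our test configuration both depend on the library's compile configuration.

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.6" % "compile->compile;test->compile"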
So you need a trait from the ScalaTest JAR in non-test code. I am not sure why it worked before, but it would make sense to me to simply remove % "test" from the scalatest dependency. That makes it available in compile, and everything available in compile is available in test too.
As for Scalactic, I think the main use case for it as a separate dependency is when you need it in compile but only use ScalaTest in test (or don't use it at all). If both are needed only for tests (or both for compile), ScalaTest will bring Scalactic in with it.
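As a sketch (versions copied from the question; this is just the suggestion above applied to that dependency block), the libraryDependencies could become:

libraryDependencies ++= Seq(
  "org.scalacheck" %% "scalacheck" % "1.14.0",
  // No % "test": ScalaTest is on the compile classpath, so the shared UnitSpec
  // base class can extend its traits from main code; tests still see it because
  // Test extends Compile. Scalactic is omitted because ScalaTest pulls it in transitively.
  "org.scalatest" %% "scalatest" % "3.0.6"
)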
I tried "test->compile" but that also doesn't do what I had hoped...
"test->compile" is the same as "test":
A configuration without a mapping (no "->") is mapped to "default" or "compile". The -> is only needed when mapping to a different configuration than those.
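In other words, for a typical Maven-published library, either of these declarations (not both at once) behaves the same, here reusing the ScalaCheck coordinate from the question as an example:

libraryDependencies += "org.scalacheck" %% "scalacheck" % "1.14.0" % "test"
// is equivalent to:
// libraryDependencies += "org.scalacheck" %% "scalacheck" % "1.14.0" % "test->compile"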
When reading the build.sbt of many web applications, one often sees dependencies marked as "provided"; see e.g. the sbt-assembly documentation:
"org.apache.spark" %% "spark-core" % "0.8.0-incubating" % "provided"
I was unable to find any mention of it in the sbt documentation; however, the Maven documentation says the following about provided:
provided
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime
Sometimes, however, I have also seen "container" in the same position, as in this build.sbt. Is this the same thing?
val tomcatVersion = "7.0.53"
libraryDependencies ++= Seq(
"org.apache.tomcat.embed" % "tomcat-embed-core" % tomcatVersion % "container",
"org.apache.tomcat.embed" % "tomcat-embed-logging-juli" % tomcatVersion % "container",
"org.apache.tomcat.embed" % "tomcat-embed-jasper" % tomcatVersion % "container",
"org.apache.tomcat" % "tomcat-catalina" % tomcatVersion % "provided",
"org.apache.tomcat" % "tomcat-coyote" % tomcatVersion % "provided"
)
That fourth element associates the dependency with a configuration, establishing a configuration dependency. The concept originates with Ivy, which sbt uses internally.
The "container" configuration is defined by
xsbt-web-plugin version 0.9, which is brought into the project you reference here.
It is being used to establish the container/hosting runtime for sbt container:start.
As an aside - that runtime would necessarily provide the runtime libraries corresponding to the "provided" configuration, which were used during the compile phase but not included in the transitive dependencies for the resulting artifacts.
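For completeness, a minimal sketch of how a custom Ivy configuration such as "container" can be declared in an sbt build (this only illustrates the mechanism; it is not the exact definition used by xsbt-web-plugin):

// Declare the custom configuration and register it with Ivy.
lazy val Container = config("container")

ivyConfigurations += Container

// Dependencies can then be scoped to it, as in the build above.
libraryDependencies += "org.apache.tomcat.embed" % "tomcat-embed-core" % "7.0.53" % Container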
I have a Spark Streaming Scala project which uses the Apache NiFi receiver. The project runs fine under Eclipse/Scala IDE, and now I want to package it for deployment.
When I add it as
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
sbt assumes it's a cross-built Scala library and tries to resolve it with a Scala version suffix.
How do I add the NiFi receiver and all its dependencies to the project's sbt file?
Also, is it possible to point dependencies to local directories instead of having sbt resolve them?
Thanks in advance.
Here are the contents of my sbt file:
name := "NiFi Spark Test"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
Double %% is used to append the Scala version as a suffix to the Maven artifact name. It is required because different Scala compiler versions produce incompatible bytecode. If you would like to use a Java library from Maven, you should use a single % character:
libraryDependencies += "org.apache.nifi" % "nifi-spark-receiver" % "0.3.0"
I also found that I can put libraries the project depends on into the lib folder and they will be picked up during assembly.
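Regarding pointing sbt at local jars: unmanaged dependencies are picked up from the lib/ directory by default, and that directory can be changed if needed. A minimal sketch (the path below is just an example):

// By default sbt scans <project>/lib for unmanaged jars; this overrides the location.
unmanagedBase := baseDirectory.value / "local-jars"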
In my build.sbt file I have the following:
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.10" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.3.1"
I just let it download all the libraries automatically. I'm adding graphx, spark-core, and the Scala SDK to one of my project modules, but when I try to compile I'm getting:
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term hadoop
in package org.apache which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term io
in value org.apache.hadoop which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term compress
in value org.apache.io which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
The weird thing is if I download graphx/mllib directly from the maven repositories it seems to compile. Any ideas?
Another possible source of error is an incorrect scalac version setting in the project. Right-click the project -> Open Module Settings -> Global Libraries, and change/add the scala-sdk version appropriate to your project.
Please add the Hadoop dependency, something like (single %, since the Hadoop artifacts are plain Java libraries without a Scala version suffix):
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.1"
You may need to add other hadoop modules depending on your app.
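If it is unclear which modules are required, one option worth trying (the version here is only an example and should match your cluster) is the aggregate hadoop-client artifact, which pulls in the common client-side Hadoop modules transitively:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.1"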
I am trying to use ScalaTest, but IntelliJ cannot recognize:
import org.scalatest._
Here is my build.sbt file, located in the same directory as my scalatest.jar file.
scalaVersion := "2.11.2"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
Thanks
So you have by convention two source folders:
src/main/scala/...
src/test/scala/...
The first is shown in blue, the second in green in IntelliJ IDEA. Library dependencies in sbt are associated with one of these, so
"org.foo" % "bar_2.11" % "1.2.3"
is a main dependency, available to main sources (and also to test sources, because test depends on main). And
"org.foo" % "bar_2.11" % "1.2.3" % "test"
is a test dependency, available only to test sources. The idea is that these are libraries which are not required for your product, but only to run the unit tests.
In your example, Scala-Test is only available to test sources, so trying to import it from main sources will fail.
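If the import is also needed from main sources (not just tests), a sketch would be to drop the "test" qualifier so ScalaTest lands on the compile classpath:

// Available to both src/main/scala and src/test/scala, since test depends on main.
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.4"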