There is a similar question here, but that solution does not work in sbt 1.x.
In the sbt documentation it is well described how to exclude transitive dependencies when adding a library through libraryDependencies:
libraryDependencies += "log4j" % "log4j" % "1.2.15" exclude("javax.jms", "jms")
or preventing transitive dependencies:
libraryDependencies += "org.apache.felix" % "org.apache.felix.framework" % "1.8.0" intransitive()
but my question is how (and whether) it can be done when declaring dependsOn dependencies between submodules in a multi-module project, like this:
lazy val core = project.dependsOn(util)
How would I do something like the following (invalid code below) to prevent a transitive dependency from being brought in via util:
lazy val core = project.dependsOn(util exclude("javax.jms", "jms"))
and, more importantly, how do I exclude a transitive dependency on another submodule of the same multi-module project from being brought in via util (where sub3 is another submodule declared in the same build.sbt):
lazy val core = project.dependsOn(util exclude sub3)
The way to do it is to use the excludeDependencies setting key.
A short example:
excludeDependencies ++= Seq(
  ExclusionRule("commons-logging", "commons-logging")
)
If you happen to define your dependencies as vals (as I do), you might find it useful to define the excludes based on those dependencies. To do so, you need this simple method:
def excl(m: ModuleID): InclExclRule = InclExclRule(m.organization, m.name)
and it allows for easy exclusions:
val theLib = "com.my.lib" % "artifact" % "version"

lazy val `projectA` = (project in file("projectA"))
  .settings(
    ...
    libraryDependencies ++= Seq(
      theLib
    )
  )

lazy val `projectB` = (project in file("projectB"))
  .settings(
    ...
    libraryDependencies ++= Seq(
      ...
    ),
    excludeDependencies ++= Seq(
      excl(theLib)
    )
  )
  .dependsOn(projectA)
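A note on Scala artifacts: an exclusion has to match the artifact name exactly, which for `%%`-published libraries includes the Scala binary-version suffix. In sbt 1.x, excludeDependencies also accepts ModuleID-style syntax, so the suffix can be handled with `%%`. A sketch (the akka coordinates are purely illustrative, not from the answer above):

```scala
// build.sbt (sbt 1.x) — coordinates are illustrative examples
excludeDependencies ++= Seq(
  // plain Java artifact: organization and exact artifact name
  ExclusionRule("commons-logging", "commons-logging"),
  // Scala artifact: %% appends the Scala binary-version suffix (_2.12, _2.13, ...)
  "com.typesafe.akka" %% "akka-actor"
)
```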
Related
I have multiple projects that are independent of each other.
They share multiple libraries (reactivemongo, redis cache, akka stream, etc.).
I want to build a "parent" SBT project so that all of the "child" projects inherit the shared libraries with the same versions.
Can this be done in SBT? Can someone share a code example or documentation?
Any help is appreciated :), thanks.
EDIT:
To be more specific:
I have two repositories on GitHub (child1, child2).
I want to create a third repository called "parent", which will include a single build.sbt that the other repositories inherit from.
Something like this should work:
lazy val commonSettings = libraryDependencies ++= Seq(
  "org.reactivemongo" %% "reactivemongo" % "0.16.3"
)

lazy val moduleA = (project in file("moduleA"))
  .settings(commonSettings)

lazy val moduleB = (project in file("moduleB"))
  .settings(commonSettings)

lazy val root = (project in file("."))
  .aggregate(moduleA, moduleB)
Have a look at https://www.scala-sbt.org/1.x/docs/Multi-Project.html for more.
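Within a single build, an alternative to threading commonSettings through every module is scoping keys to ThisBuild, which makes them the default for all subprojects, and sharing version numbers through ordinary vals. A minimal sketch (organization and versions are illustrative). Note that this only shares settings inside one build, not across separate repositories; for the multi-repository case the usual route is publishing the shared settings as an sbt plugin:

```scala
// build.sbt — ThisBuild-scoped keys become defaults for every subproject
ThisBuild / organization := "com.example"  // hypothetical
ThisBuild / scalaVersion := "2.12.8"

// share versions through a val instead of copy-pasting them
val reactiveMongoVersion = "0.16.3"

lazy val moduleA = (project in file("moduleA"))
  .settings(libraryDependencies += "org.reactivemongo" %% "reactivemongo" % reactiveMongoVersion)

lazy val moduleB = (project in file("moduleB"))
  .settings(libraryDependencies += "org.reactivemongo" %% "reactivemongo" % reactiveMongoVersion)

lazy val root = (project in file("."))
  .aggregate(moduleA, moduleB)
```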
A multi-project build using sbt:
lazy val global = project
  .in(file("."))
  .settings(settings)
  .aggregate(
    common,
    project1,
    project2
  )

lazy val common = project
  .settings(
    name := "common",
    settings,
    libraryDependencies ++= commonDependencies
  )

lazy val project1 = project
  .settings(
    name := "multi1",
    settings,
    libraryDependencies ++= commonDependencies ++ Seq(
      "org.apache.parquet" % "parquet-avro" % "1.7.0",
      "org.apache.kafka" % "kafka-clients" % "0.10.1.0"
    )
  )
  .dependsOn(common)

lazy val project2 = project
  .settings(
    name := "multi2",
    settings,
    libraryDependencies ++= commonDependencies ++ Seq(
      "org.scalikejdbc" %% "scalikejdbc" % "2.0.0"
    )
  )
  .dependsOn(common)
// illustrative compiler flags; the original snippet referenced
// compilerOptions without defining it
lazy val compilerOptions = Seq(
  "-unchecked",
  "-deprecation",
  "-feature"
)

lazy val settings = Seq(
  scalacOptions ++= compilerOptions,
  resolvers ++= Seq(
    "Local Maven Repository" at "file://" + Path.userHome.absolutePath +
      "/.m2/repository",
    Resolver.sonatypeRepo("releases"),
    Resolver.sonatypeRepo("snapshots")
  )
)

lazy val commonDependencies = Seq(
  "org.slf4j" % "slf4j-simple" % "1.7.25",
  "com.zaxxer" % "HikariCP" % "2.5.1",
  "com.oracle" % "ojdbc6" % "11.2.0.4"
)
Please refer to https://github.com/pbassiner/sbt-multi-project-example for more info.
Hope it helps!
I have this build.sbt file:
import sbt.Keys.libraryDependencies

lazy val scalatestVersion = "3.0.4"
lazy val scalaMockTestSupportVersion = "3.6.0"
lazy val typeSafeConfVersion = "1.3.2"
lazy val scalaLoggingVersion = "3.7.2"
lazy val logbackClassicVersion = "1.2.3"

lazy val commonSettings = Seq(
  organization := "com.stulsoft",
  version := "0.0.1",
  scalaVersion := "2.12.4",
  scalacOptions ++= Seq(
    "-feature",
    "-language:implicitConversions",
    "-language:postfixOps"),
  libraryDependencies ++= Seq(
    "com.typesafe.scala-logging" %% "scala-logging" % scalaLoggingVersion,
    "ch.qos.logback" % "logback-classic" % logbackClassicVersion,
    "com.typesafe" % "config" % typeSafeConfVersion,
    "org.scalatest" %% "scalatest" % scalatestVersion % "test",
    "org.scalamock" %% "scalamock-scalatest-support" % scalaMockTestSupportVersion % "test"
  )
)

unmanagedJars in Compile += file("lib/opencv-331.jar")

lazy val pimage = project.in(file("."))
  .settings(commonSettings)
  .settings(
    name := "pimage"
  )

parallelExecution in Test := true
It works fine if I use sbt run, but I cannot run it from IntelliJ.
I receive the error:
java.lang.UnsatisfiedLinkError: no opencv_java331 in java.library.path
I can add the library manually (File -> Project Structure -> Libraries -> + the necessary dir).
My question is: is it possible to set up build.sbt so that it automatically creates an IntelliJ project with the specified library?
I would suggest the following:
Drag and drop the dependency into /lib, which should be in the root directory of your project; if it's not there, create it.
Run the commands:
sbt reload
sbt update
Lastly, you could try: File -> Project Structure -> Modules -> mark all the modules (usually 1 to 3) and delete them (don't worry, this won't delete your files) -> hit the green plus sign and select Import Module -> select the root directory of your project, and it should refresh.
If none of these help, I'm out of ideas.
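One more thing worth checking: UnsatisfiedLinkError refers to a native library, so putting the jar on the classpath is not enough by itself; the JVM also needs java.library.path to point at the directory holding the native binary (opencv_java331.dll/.so). A sketch, assuming the native file sits in lib/ (the path and the forking setup are assumptions, adapt to your layout):

```scala
// build.sbt — run in a forked JVM so javaOptions actually apply,
// and point java.library.path at the directory with the OpenCV native binary
fork in run := true
javaOptions in run += s"-Djava.library.path=${(baseDirectory.value / "lib").getAbsolutePath}"
```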
I want to override a project's dependencies in a certain task.
I have an sbt multi-project build that uses Spark.
lazy val core = // Some Project

val sparkLibs = Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1"
)

val sparkLibsProvided = Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"
)

lazy val main = Project(
  id = "main",
  base = file("main-project"),
  settings = sharedSettings
).settings(
  name := "main",
  libraryDependencies ++= sparkLibs,
  dependencyOverrides ++= Set(
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
  )
).dependsOn(core)
When I try to make a fat jar to submit to my YARN cluster, I use the sbt-assembly task (https://github.com/sbt/sbt-assembly). But in this case I want to use sparkLibsProvided instead of sparkLibs, something like:
lazy val sparkProvided = (project in assembly).settings(
  dependencyOverrides ++= sparkLibsProvided.toSet
)
How can I properly override this dependency?
You can create a new project which is dedicated to building your Spark uber jar with the provided flag:
lazy val sparkUberJar = (project in file("spark-project"))
  .settings(sharedSettings: _*)
  .settings(
    libraryDependencies ++= sparkLibsProvided,
    dependencyOverrides ++= Set(
      "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
    )
  )
And when you assemble, switch to that project first in the sbt shell:
project sparkUberJar
assembly
This can be easily achieved by using the key provided specifically for what you want:
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter {
    // match on the jar file name; note that the file name of a %%
    // dependency includes the Scala suffix, e.g. spark-core_2.11-1.6.1.jar
    _.data.getName.startsWith("spark-core")
  }
}
This approach is considered hacky, however, and it would be better to split your configuration into subprojects, as the official documentation also warns:
If you need to tell sbt-assembly to ignore JARs, you're probably doing it wrong. assembly task grabs deps JARs from your project's classpath. Try fixing the classpath first.
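If you do keep a single project with Spark marked "provided", a commonly used companion trick (a sketch, not part of the answer above) is to re-add the provided dependencies to the classpath of the run task only, so sbt run still works locally while assembly leaves Spark out:

```scala
// build.sbt — sketch: restore the full (provided-inclusive) compile
// classpath for `run`, while `assembly` still excludes provided deps
run in Compile := Defaults
  .runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
  .evaluated
```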
I would like to import Maven libraries, either with Maven's XML file or SBT's Scala file. I guess there already are similar questions out there, but I couldn't quite find any. Thank you!
You just treat remote Maven repositories normally, unless you also want to utilize your local .m2/repository. See below for an example Build.scala using both:
object myBuild extends Build {
  lazy val mainProject = Project(
    id = "root",
    base = file("."),
    settings = Project.defaultSettings ++ Seq(
      name := "Root project",
      scalaVersion := "2.11.4",
      version := "0.1",
      resolvers ++= Seq(remoteMavenRepo, localMavenRepo),
      libraryDependencies ++= List(
        mavenLibrary1, mavenLibrary2
      )
    )
  )

  val remoteMavenRepo = "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
  val localMavenRepo = "Local Maven" at Path.userHome.asFile.toURI.toURL + ".m2/repository"

  // if a library follows the Scala version suffix convention, we use %%
  val mavenLibrary1 = "com.typesafe.slick" %% "slick" % "2.0.2"
  // if it's a Java library with no Scala version suffix, we use %
  val mavenLibrary2 = "joda-time" % "joda-time" % "2.4"
}
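For what it's worth, the same definition can be written as a plain build.sbt (the Build trait used above was deprecated in sbt 0.13.12 and removed in sbt 1.x). A sketch with the same illustrative coordinates:

```scala
// build.sbt — equivalent of the Build.scala definition above
name := "Root project"
scalaVersion := "2.11.4"
version := "0.1"

resolvers ++= Seq(
  "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/",
  "Local Maven" at Path.userHome.asFile.toURI.toURL + ".m2/repository"
)

libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "2.0.2",  // Scala library: %%
  "joda-time" % "joda-time" % "2.4"           // Java library: %
)
```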
I have an sbt build file that uses one plugin and three dependencies:
scalaVersion := "2.10.4"

val reflect = Def.setting { "org.scala-lang" % "scala-reflect" % "2.10.4" }
val compiler = Def.setting { "org.scala-lang" % "scala-compiler" % "2.10.4" }

lazy val macrosSettings = Project.defaultSettings ++ Seq(
  addCompilerPlugin("org.scala-lang.plugins" % "macro-paradise_2.10.4-SNAPSHOT" % "2.0.0-SNAPSHOT"),
  libraryDependencies ++= {
    import Dependencies._
    Seq(play_json, specs2, reflect.value)
  }
)

lazy val Macros = Project(id = "IScala-Macros", base = file("macros"), settings = macrosSettings)
However, the compiler gave me the following error when compiling IScala-Macros:
[warn] :: org.scala-lang#scala-compiler;2.10.4-SNAPSHOT: not found
[warn] :: org.scala-lang#scala-library;2.10.4-SNAPSHOT: not found
[warn] :: org.scala-lang#scala-reflect;2.10.4-SNAPSHOT: not found
This seems like a bug, as I don't want them to resolve to 2.10.4-SNAPSHOT but only 2.10.4. Is it a bug in sbt? If not, where does this SNAPSHOT come from?
There are a couple of issues in this build.sbt build definition, so I highly recommend reading the Macro Paradise documentation, where you can find a link to a project that serves as an end-to-end example; in a nutshell, working with macro paradise is as easy as adding two lines to your build (granted you've already set up sbt to use macros).
As to the issues in this build: I don't see a reason to use Def.setting for the dependencies reflect and compiler as written, and moreover I'm unsure about the dependency in addCompilerPlugin. Use the version below, where Def.setting is used to refer to the value of the scalaVersion setting. I still think addCompilerPlugin should follow the sample project mentioned above.
import Dependencies._

scalaVersion := "2.10.4"

val reflect = Def.setting {
  "org.scala-lang" % "scala-reflect" % scalaVersion.value
}

val compiler = Def.setting {
  "org.scala-lang" % "scala-compiler" % scalaVersion.value
}

lazy val macrosSettings = Project.defaultSettings ++ Seq(
  addCompilerPlugin("org.scala-lang.plugins" % "macro-paradise_2.10.4-SNAPSHOT" % "2.0.0-SNAPSHOT"),
  libraryDependencies ++= Seq(
    play_json,
    specs2,
    reflect.value
  )
)

lazy val Macros = Project(id = "IScala-Macros", base = file("macros"), settings = macrosSettings)
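As an aside: the snapshot coordinates above reflect the pre-release state of macro paradise at the time this question was asked. The form later documented for released versions uses a full-Scala-version cross build instead of baking the Scala version into the artifact name (the version shown is illustrative):

```scala
// build.sbt — macro paradise compiler plugin resolved per full Scala version
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full)
```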