Given a multimodule build.sbt:
ThisBuild / organization := "com.mycompany"
ThisBuild / version := "1.0.0"
ThisBuild / scalaVersion := "2.12.7"
// more global settings
The ThisBuild scope is repeated on every single line. Is there a way to do the following?
ThisBuild {
organization := "com.mycompany"
version := "1.0.0"
scalaVersion := "2.12.7"
}
Probably the closest you can get is:
inThisBuild(Seq(
organization := "com.mycompany",
version := "1.0.0",
scalaVersion := "2.12.7"
))
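A related sketch (not part of the original answer): since inThisBuild just produces a sequence of settings, you can also factor the settings into a value first, which makes them reusable elsewhere in the build:

lazy val buildSettings = Seq(
  organization := "com.mycompany",
  version := "1.0.0",
  scalaVersion := "2.12.7"
)

// equivalent to the repeated ThisBuild / ... lines above
inThisBuild(buildSettings)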
Related
I want multiple sbt projects with the exact same root so I can build the same code with different settings. I've tried something similar to what I have below, but sbt only recognizes the first project (root).
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
lazy val root = (project in file("."))
.settings(
name := "Scala Seed Project",
scalaVersion := "2.13.6"
)
lazy val root2 = (project in file("."))
.settings(
name := "Scala Seed Project",
scalaVersion := "2.12.12"
)
This isn't a perfect answer, but I found the suggestion below on this page, and it does seem to work for this simple example.
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
lazy val root = (project in file("target/root"))
.settings(
name := "Scala Seed Project",
scalaVersion := "2.13.6",
Compile / scalaSource := baseDirectory.value / ".." / ".." / "src" / "main" / "scala",
)
lazy val root2 = (project in file("target/root2"))
.settings(
name := "Scala Seed Project",
scalaVersion := "2.12.12",
Compile / scalaSource := baseDirectory.value / ".." / ".." / "src" / "main" / "scala",
)
I don't love this solution because it requires dummy directories and an otherwise unnecessary redefinition of scalaSource for each configuration (although I've only included compilation in the example above).
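One way to reduce that duplication (a sketch building on the snippet above; sharedSrc is an illustrative name, not from the original): factor the redirected-source settings into a single Seq and reuse it in each dummy project:

// shared settings pointing both configurations back at the real sources
def sharedSrc = Seq(
  Compile / scalaSource := baseDirectory.value / ".." / ".." / "src" / "main" / "scala",
  Test / scalaSource := baseDirectory.value / ".." / ".." / "src" / "test" / "scala"
)

lazy val root = (project in file("target/root"))
  .settings(name := "Scala Seed Project", scalaVersion := "2.13.6")
  .settings(sharedSrc)

lazy val root2 = (project in file("target/root2"))
  .settings(name := "Scala Seed Project", scalaVersion := "2.12.12")
  .settings(sharedSrc)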
I am using SBT 1.8.0 for my Spark Scala project in the IntelliJ IDEA 2017.1.6 IDE. I want to create a parent project along with its child project modules. So far this is what I have in my build.sbt:
lazy val parent = Project("spark-etl-parent",file("."))
.settings(
name := "spark-etl-parent_1.0",
scalaVersion := "2.11.1",
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided")
)
lazy val etl = Project("spark-etl-etl",file("etl"))
.dependsOn(parent)
.settings(
name := "spark-etl-etl_1.0",
version := "1.0",
scalaVersion := "2.11.1"
)
lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
.dependsOn(parent)
.settings(
name := "spark-etl-redshiftBasin_1.0",
version := "1.0",
scalaVersion := "2.11.1"
)
lazy val s3Basin = Project("spark-etl-s3Basin",file("s3Basin"))
.dependsOn(parent)
.settings(
name := "spark-etl-s3Basin_1.0",
version := "1.0",
scalaVersion := "2.11.1"
)
Now I am able to import any class from the spark-streaming or spark-hive library dependencies in the parent module, but I am not able to import and use them in any of the child modules. Only if I explicitly add them as a library dependency in a child module am I able to use them.
I am looking for something similar to the <dependencies> tag in a Maven pom.xml.
Will it make a difference if I use a separate build.sbt for each of the child modules?
Also, if I do .aggregate(etl) in the parent config, it shows an error because etl is declared later. But if I define etl before parent, I am not able to do .dependsOn(parent) in the etl config.
Please help me with a solution to fix these issues.
My multi-module project uses the parent project only for building everything and delegating run to the 'server' project:
lazy val petstoreRoot = project.in(file(".")).
aggregate(sharedJvm, sharedJs, server, client)
.settings(organizationSettings)
.settings(
publish := {}
, publishLocal := {}
, publishArtifact := false
, isSnapshot := true
, run := {
(run in server in Compile).evaluated
}
)
I grouped the settings (e.g. the dependencies) in another file, e.g.:
lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
"org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
...
, "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
Now each sub-module just adds whatever is needed, e.g.:
lazy val server = (project in file("server"))
.settings(scalaJSProjects := Seq(client))
.settings(sharedSettings(Some("server"))) // shared dependencies used by all
.settings(serverSettings)
.settings(serverDependencies)
.settings(jvmSettings)
.enablePlugins(PlayScala, BuildInfoPlugin)
.dependsOn(sharedJvm)
You can find the whole project here: https://github.com/pme123/scala-adapters
See the project/Settings file for the dependencies.
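For a rough idea of the shape of such a file (a minimal sketch with illustrative names and versions, not the exact contents of that repository):

// project/Settings.scala
import sbt._
import sbt.Keys._

object Settings {
  // settings shared by every module
  val organizationSettings: Seq[Def.Setting[_]] = Seq(
    organization := "com.example",
    organizationName := "example"
  )

  // dependencies shared by every module
  val sharedDependencies: Seq[Def.Setting[_]] = Seq(
    libraryDependencies ++= Seq(
      "org.scalatest" %% "scalatest" % "3.0.5" % Test
    )
  )
}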
Using provided->provided in the dependsOn helped me solve a similar problem:
So something like:
lazy val etl = Project("spark-etl-etl",file("etl"))
.dependsOn(parent % "compile->compile;test->test;provided->provided")
.settings(
name := "spark-etl-etl_1.0",
version := "1.0",
scalaVersion := "2.11.1"
)
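For context: in an ivy-style configuration mapping like a->b, configuration a of the depending project is wired to configuration b of the dependency. So compile->compile shares the parent's compile classpath, test->test shares its test classpath, and provided->provided is what makes the parent's "provided" Spark artifacts visible when compiling the child modules.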
My objective is to write an SBT plugin which can be used by both the 0.13.x and 1.x versions of SBT. Based on this thread and this documentation, I wrote the following build.sbt for my plugin project:
lazy val foo = (project in file(".")).settings(
name := "foo",
sbtPlugin := true,
organization := "com.bar",
version := "1.0.0",
scalaVersion:= "2.12.4",
sbtVersion in Global := "1.0.0",
crossSbtVersions := Seq("0.13.17", "1.0.0"),
libraryDependencies ++= Seq(
"com.typesafe" % "config" % "1.3.3"
),
scalaCompilerBridgeSource := {
val sv = appConfiguration.value.provider.id.version
("org.scala-sbt" % "compiler-interface" % sv % "component").sources
}
)
When I do sbt +publishLocal I see:
[info] Packaging /Users/user1/IdeaProjects/fulfillment-sbt/target/scala-2.12/sbt-0.13/foo-1.0.0-javadoc.jar ...
[info] Done packaging.
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/poms/foo.pom
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/jars/foo.jar
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/srcs/foo-sources.jar
[info] published foo to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/docs/foo-javadoc.jar
[info] published ivy to /Users/user1/.ivy2/local/com.bar/foo/scala_2.12/sbt_0.13/1.0.0/ivys/ivy.xml
[success] Total time: 9 s, completed Apr 4, 2018 11:12:38 AM
But it didn't publish for the 1.x version of SBT. What can I do so that it publishes for both versions of SBT?
I went to the SBT Gitter channel and had a conversation there with the creators of SBT. Based on that conversation I created a working example. I am listing it here so that it helps someone cross-publish SBT plugins in the future.
project/build.properties
sbt.version=0.13.17
build.sbt
lazy val p = (project in file(".")).settings(
name := "sbt-crosspublish",
sbtPlugin := true,
organization := "com.abhi",
version := "1.0.0",
crossScalaVersions := Seq("2.10.6", "2.12.0"),
crossSbtVersions := Seq("0.13.17", "1.0.0"),
libraryDependencies ++= Seq(
"com.typesafe" % "config" % "1.3.3"
),
scalaCompilerBridgeSource := {
val sv = appConfiguration.value.provider.id.version
("org.scala-sbt" % "compiler-interface" % sv % "component").sources
}
)
And finally, in order to cross-publish SBT plugins, one has to run:
sbt ^publishLocal
Wow, I didn't know about the ^. sbt +publishLocal is for cross-publishing normal binaries, not plugins; for cross-publishing SBT plugins, we must run sbt ^publishLocal.
One thing to note is that the scalaCompilerBridgeSource setting is only needed if you are working on SBT 0.13.17. If you upgrade the plugin project to SBT 1.1.x, the code is simplified.
project/build.properties
sbt.version=1.1.2
build.sbt
lazy val p = (project in file(".")).settings(
name := "sbt-crosspublish",
sbtPlugin := true,
organization := "com.abhi",
version := "1.0.0",
crossScalaVersions := Seq("2.10.6", "2.12.0"),
crossSbtVersions := Seq("0.13.17", "1.0.0"),
libraryDependencies ++= Seq(
"com.typesafe" % "config" % "1.3.3"
)
)
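For reference (as documented in sbt's plugin cross-building guide; the version number below just mirrors the crossSbtVersions above): the ^ prefix runs a task for every declared sbt version, while the ^^ command switches to a single one:

# publish the plugin for each entry in crossSbtVersions
sbt ^publishLocal
# or publish for one specific sbt version only
sbt "^^ 0.13.17" publishLocal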
I want to use a library that I cloned from GitHub onto my machine and modified, and I would like to test my code against it. What can I change in my build.sbt file
name := "Actoverse Demo"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.4.20"
)
lazy val root = project.in(file(".")).dependsOn(actoversePlugin)
lazy val actoversePlugin = RootProject(file("/Users/USERNAME/Desktop/Bo/Actoverse-Scala/src/main/scala/actoverse"))
so that it uses my local version of the library instead?
You can change this:
RootProject(uri("https://github.com/45deg/Actoverse-Scala.git"))
to this:
RootProject(file("whateverPath"))
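So in this case (a sketch assuming the clone lives at the directory shown below; note that RootProject must point at the root of the cloned build, i.e. the directory containing its build.sbt, not at a source subdirectory like src/main/scala/actoverse):

// reference the local clone of the library
lazy val actoversePlugin = RootProject(file("/Users/USERNAME/Desktop/Bo/Actoverse-Scala"))

lazy val root = project.in(file(".")).dependsOn(actoversePlugin)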
More info here
I'm trying to use the JForex-3 SDK from Scala / SBT.
My build.sbt looks like:
name := "tmp"
version := "1.0"
scalaVersion := "2.12.1"
resolvers += "Dukascopy" at "https://www.dukascopy.com/client/jforexlib/publicrepo/"
libraryDependencies ++= Seq(
"com.dukascopy.dds2" % "DDS2-jClient-JForex" % "3.1.2",
"com.dukascopy.api" % "JForex-API" % "2.13.30"
)
When importing com.dukascopy.api.system, only "tester" is available. I cannot figure out what happened to the rest: https://www.dukascopy.com/client/javadoc3/
Can someone help here?
Downgrading the version of the first library dependency solves the problem. Downgrade it to version 3.0.18:
name := "tmp"
version := "1.0"
scalaVersion := "2.12.1"
resolvers += "Dukascopy" at "https://www.dukascopy.com/client/jforexlib/publicrepo/"
libraryDependencies ++= Seq(
"com.dukascopy.dds2" % "DDS2-jClient-JForex" % "3.0.18",
"com.dukascopy.api" % "JForex-API" % "2.13.30"
)