I have an sbt build with two duplicated project configurations. See the example:
lazy val MyProjectOne = Project(id = "OneId", base = file("path/OneId"))
.dependsOn(moduleOne)
.settings(plugin.settings: _*)
.settings(defaultSettings: _*)
.settings(webSettings: _*)
.settings(libraryDependencies ++= commonTests)
lazy val MyProjectTwo = Project(id = "TwoId", base = file("path/TwoId"))
.dependsOn(moduleOne)
.settings(plugin.settings: _*)
.settings(defaultSettings: _*)
.settings(webSettings: _*)
.settings(libraryDependencies ++= commonTests)
It is obvious that MyProjectOne and MyProjectTwo differ only in their id and base properties.
Is there a way to refactor the sbt build like this?
lazy val template = Project()
.dependsOn(moduleOne)
.settings(plugin.settings: _*)
.settings(defaultSettings: _*)
.settings(webSettings: _*)
.settings(libraryDependencies ++= commonTests)
// Just as an example:
lazy val MyProjectOne = Project(id = "OneId", base = file("path/OneId")).extends(template)
lazy val MyProjectTwo = Project(id = "TwoId", base = file("path/TwoId")).extends(template)
How can I do that with sbt?
Also, with Maven I can define a parent POM for this case. Is there an analog in sbt?
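In case it helps, one way this kind of duplication is commonly factored out (a sketch built only from the values in the question; templateSettings and templateProject are illustrative names, not existing sbt API) is to collect the shared settings once and wrap the project construction in a small helper function:
// shared settings collected in one place (names below are illustrative)
lazy val templateSettings: Seq[Def.Setting[_]] =
  plugin.settings ++
  defaultSettings ++
  webSettings ++
  Seq(libraryDependencies ++= commonTests)

// helper that builds a project which differs only in id and base
def templateProject(id: String, base: String): Project =
  Project(id = id, base = file(base))
    .dependsOn(moduleOne)
    .settings(templateSettings: _*)

lazy val MyProjectOne = templateProject("OneId", "path/OneId")
lazy val MyProjectTwo = templateProject("TwoId", "path/TwoId")
This plays roughly the role of a Maven parent POM: the shared configuration lives in one place and each concrete project supplies only what actually differs.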
Related
I have a multi-project build with a build.sbt that looks as follows:
import lmcoursier.CoursierConfiguration
import lmcoursier.definitions.Authentication
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.12"
val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"
val adoMavenRepos = Vector(
MavenRepository(adoRepoIdWithView, s"https://adoMavenHost/adoOrganization/adoProject/_packaging/${adoRepoIdWithView.replace("#", "%40")}/maven/v1")
)
val adoAuthentication =
Authentication(user = adoMavenUsername, password = adoMavenPassword)
.withOptional(false)
.withHttpsOnly(true)
.withPassOnRedirect(false)
val coursierConfiguration = {
val initial =
CoursierConfiguration()
.withResolvers(adoMavenRepos)
.withClassifiers(Vector("", "sources"))
.withHasClassifiers(true)
adoMavenRepos.foldLeft(initial) {
case (conf, repo) ⇒
conf.addRepositoryAuthentication(repo.name, adoAuthentication)
}
}
lazy val mainSettings = Seq(
organization := "org.some",
csrConfiguration := coursierConfiguration,
updateClassifiers / csrConfiguration := coursierConfiguration
)
lazy val root = (project in file("."))
.settings(mainSettings: _*)
.settings(
name := "sbt-test",
).aggregate(core, util)
lazy val core = (project in file("core"))
.settings(mainSettings: _*)
.settings(
name := "core",
).dependsOn(util)
lazy val util = (project in file("util"))
.settings(mainSettings: _*)
.settings(
name := "util"
)
For some reason, coursier attempts to download the util package externally during the core/update task. This is not what I want, since it should be resolved internally as part of the build. The package is not added to libraryDependencies, so I'm baffled why it would attempt the download.
The above example will fail because the Azure DevOps credentials and Maven repository are incorrect, but it does show the attempt to download util.
It seems somehow related to this GitHub issue.
The default CoursierConfiguration constructor sets the interProjectDependencies property to an empty Vector, so coursier no longer knows that util is built by this project and tries to resolve it externally. To fix this, don't build a CoursierConfiguration from scratch; instead, add the extra resolvers on top of sbt's existing csrConfiguration task value using .withResolvers.
This is what the solution looks like applied to my question, largely based on this GitHub comment:
val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"
val adoMavenHost = "pkgs.dev.azure.com"
val adoMavenRepos = Vector(
MavenRepository(adoRepoIdWithView, s"https://$adoMavenHost/adoOrganization/adoProject/_packaging/$adoRepoIdWithView/maven/v1")
)
lazy val mainSettings = Seq(
organization := "org.some",
csrConfiguration := {
val resolvers = csrResolvers.value ++ adoMavenRepos
val conf = csrConfiguration.value.withResolvers(resolvers.toVector)
val adoCredentialsOpt = credentials.value.collectFirst { case creds: DirectCredentials if creds.host == adoMavenHost => creds }
val newConfOpt = adoCredentialsOpt.map { adoCredentials =>
val auths =
resolvers
.collect {
case repo: MavenRepository if repo.root.startsWith(s"https://$adoMavenHost/") => {
repo.name ->
Authentication(adoCredentials.userName, adoCredentials.passwd)
}
}
auths.foldLeft(conf) { case (conf, (repoId, auth)) => conf.addRepositoryAuthentication(repoId, auth) }
}
newConfOpt.getOrElse(conf)
},
updateClassifiers / csrConfiguration := csrConfiguration.value // reuse the configuration computed above
)
I have a multi-project sbt build using Build.scala.
I want to cross-compile one subproject against Scala 2.12 and 2.11, but the other subprojects have dependencies that don't support 2.12 yet, so I just want 2.11.
How can I do this in sbt?
I've tried this:
val scalaVer = "2.11.8"
val scalaVer12 = "2.12.0"
lazy val basicSettings = Seq(
scalaVersion := scalaVer
)
lazy val root = (project in file("."))
.settings(basicSettings: _*)
.aggregate(scalajack, scalajack_dynamodb, scalajack_mongo)
lazy val scalajack = project.in(file("core"))
.settings(basicSettings: _*)
.settings(Seq(crossScalaVersions := Seq(scalaVer, scalaVer12)))
lazy val scalajack_dynamodb = project.in(file("dynamodb"))
.settings(basicSettings: _*)
.dependsOn( scalajack )
lazy val scalajack_mongo = project.in(file("mongo"))
.settings(basicSettings: _*)
.dependsOn( scalajack )
This builds everything correctly but ignores my wish to have a 2.12 version of the 'core' subproject.
I build packages with:
sbt> + package
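For reference, one commonly used arrangement (a sketch, assuming an sbt version whose + command honours each subproject's own crossScalaVersions, e.g. sbt 1.2.1 or newer) is to give every project an explicit crossScalaVersions and list both Scala versions only on the core project:
val scalaVer   = "2.11.8"
val scalaVer12 = "2.12.0"

lazy val basicSettings = Seq(
  scalaVersion := scalaVer,
  // by default a project is cross-built only against 2.11
  crossScalaVersions := Seq(scalaVer)
)

lazy val root = (project in file("."))
  .settings(basicSettings: _*)
  .aggregate(scalajack, scalajack_dynamodb, scalajack_mongo)

lazy val scalajack = project.in(file("core"))
  .settings(basicSettings: _*)
  // only this subproject also gets a 2.12 build
  .settings(crossScalaVersions := Seq(scalaVer, scalaVer12))

lazy val scalajack_dynamodb = project.in(file("dynamodb"))
  .settings(basicSettings: _*)
  .dependsOn(scalajack)

lazy val scalajack_mongo = project.in(file("mongo"))
  .settings(basicSettings: _*)
  .dependsOn(scalajack)
With that in place, sbt> + package should produce 2.11 artifacts for every module and a 2.12 artifact only for core.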
I have a multi-module sbt project. When I change some source code in a module, the other modules don't see the changes in IntelliJ.
When I try to navigate to a declaration, instead of going to the source it jumps to the compiled jar file.
It works fine when I remove the jar from the library dependencies in the project settings, I think because it then recompiles, so it works until the next change. Plain sbt compiles work fine, but I suspect the problem is in the Build.scala settings: the project dependencies may have ordering issues. Here are the dependencies:
lazy val root = Project(id = "xx-main", base = file("."), settings = commonSettings)
.aggregate(utils, models, commons, dao, te)
.dependsOn(utils, models, commons, dao)
lazy val utils = Project(id = "xx-utils", base = file("xx-utils"))
.settings(commonSettings: _*)
lazy val commons = Project(id = "xx-commons", base = file("xx-commons"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val models =
Project(id = "xx-models", base = file("xx-models"), settings = commonSettings)
.dependsOn(utils)
lazy val dao = Project(id = "xx-dao", base = file("xx-dao"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val te = Project(id = "xx-te", base = file("xx-te"))
.settings(commonSettings: _*)
.dependsOn(utils, models, dao, commons)
I am defining multiple JVM/JS cross projects. Each one contains some common JVM/JS Scala code that I want to extract into a general common project that each project can depend on. Could someone recommend the best way to define my Build.scala files for the general and dependent projects?
CrossProject supports the normal dependsOn operation you are used to. So you can:
// the call to settings is needed for an implicit conversion to kick in
lazy val common = crossProject.settings()
lazy val p1 = crossProject.dependsOn(common)
lazy val p2 = crossProject.dependsOn(common)
lazy val commonJVM = common.jvm
lazy val commonJS = common.js
lazy val p1JVM = p1.jvm
lazy val p1JS = p1.js
lazy val p2JVM = p2.jvm
lazy val p2JS = p2.js
There is a full example on GitHub.
You can create multi-project builds.
Let's say you have a project structure like this:
root
  project/Build.scala
  project1
    src/
    project1.sbt
  project2
    src/
    project2.sbt
  projectN
    src/
    projectN.sbt
You can easily define dependencies in Build.scala
lazy val root = Project(id = "Main-Project",
base = file(".")) aggregate(project1, project2,..)
lazy val project2 = Project(id = "project2",
base = file("project1")).dependsOn(project1)
...
I ended up arriving at the solution below.
lazy val common = crossProject.in(file(".")).
settings(
).
jvmSettings(
).
jsSettings(
)
lazy val commonJVM = common.jvm
lazy val commonJS = common.js
...
lazy val p1 = crossProject.in(file(".")).
settings(
).
jvmSettings(
).
jsSettings(
).
jvmConfigure(_.dependsOn(ProjectRef(uri("../common"), "commonJVM"))).
jsConfigure(_.dependsOn(ProjectRef(uri("../common"), "commonJS")))
lazy val p1JVM = p1.jvm.
settings(...
lazy val p1JS = p1.js.
settings(...
I am converting a single-project build.sbt to a multi-project build.sbt, which is always a PITA. There is this obscure syntax to make plugin settings available, e.g. before:
seq(appbundle.settings: _*)
How do I do this with sub-projects? E.g.:
lazy val views = Project(
id = "views",
base = file("views"),
dependencies = Seq(core),
settings = commonSettings ++ Seq(
seq(appbundle.settings: _*), // ???
name := "views",
description := ...
)
)
This just gives me an error:
found : Seq[sbt.Def.SettingsDefinition]
required: Seq[sbt.Def.Setting[_]]
settings = commonSettings ++ Seq(
^
Add them using ++ to the overall settings
lazy val views = Project(
id = "views",
base = file("views"),
dependencies = Seq(core),
settings = commonSettings ++ appbundle.settings ++ Seq(
name := "views",
description := ...
)
)
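As a side note, with the newer project macro and chained .settings calls the same thing can be written without the settings = ... constructor argument, which avoids mixing the two Seq types altogether (a sketch using the names from the question):
lazy val views = project.in(file("views"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(appbundle.settings: _*)
  .settings(
    name := "views",
    description := "..." // whatever the real description is
  )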