I am defining multiple JVM/JS cross projects. Each one contains some common JVM/JS Scala code that I want to extract into a general common project that each project can depend on. Could someone recommend the best way to define the build.scala files for the general and dependent projects?
CrossProject supports the normal dependsOn operation you are used to. So you can:
// the call to settings() is needed for an implicit conversion to kick in
lazy val common = crossProject.settings()
lazy val p1 = crossProject.dependsOn(common)
lazy val p2 = crossProject.dependsOn(common)
lazy val commonJVM = common.jvm
lazy val commonJS = common.js
lazy val p1JVM = p1.jvm
lazy val p1JS = p1.js
lazy val p2JVM = p2.jvm
lazy val p2JS = p2.js
There is a full example on GitHub.
You can create a multi-project build.
Let's say you have a project structure like this:
root
project/Build.scala
project1
src/
project1.sbt
project2
src/
project2.sbt
projectN
src/
projectN.sbt
You can then easily define the dependencies in Build.scala:
lazy val root = Project(id = "Main-Project",
base = file(".")) aggregate(project1, project2,..)
lazy val project2 = Project(id = "project2",
base = file("project2")).dependsOn(project1)
...
I ended up arriving at the solution below.
lazy val common = crossProject.in(file(".")).
settings(
).
jvmSettings(
).
jsSettings(
)
lazy val commonJVM = common.jvm
lazy val commonJS = common.js
...
lazy val p1 = crossProject.in(file(".")).
settings(
).
jvmSettings(
).
jsSettings(
).
jvmConfigure(_.dependsOn(ProjectRef(uri("../common"), "commonJVM"))).
jsConfigure(_.dependsOn(ProjectRef(uri("../common"), "commonJS")))
lazy val p1JVM = p1.jvm.
settings(...
lazy val p1JS = p1.js.
settings(...
Related
I have a root project that depends on subproject1, and subproject1 depends on subproject2.
Does that imply that I can use subproject2's source code directly in root?
lazy val root =
Project(id = "root", base = file(".")).dependsOn(sub1)
lazy val sub1 =
Project(id = "sub1").dependsOn(sub2)
lazy val sub2 =
Project(id = "sub2")
Yes.
This can easily be checked.
build.sbt
name := "sbtdemo"
version := "0.1"
ThisBuild / scalaVersion := "2.13.4"
lazy val root =
Project(id = "root", base = file(".")).dependsOn(sub1)
lazy val sub1 =
Project(id = "sub1", base = file("sub1")).dependsOn(sub2)
lazy val sub2 =
Project(id = "sub2", base = file("sub2"))
sub2/src/main/scala/App.scala
object App {
def foo() = println("foo")
}
src/main/scala/Main.scala
object Main {
def main(args: Array[String]): Unit = {
App.foo() // foo
}
}
Yes. From the sbt documentation on Classpath dependencies:
lazy val core = project.dependsOn(util)
Now code in core can use classes from util. This also creates an ordering between the projects when compiling them; util must be updated and compiled before core can be compiled.
I have a multi-module sbt project. When I change some source code in one module, the other modules don't see the changes in IntelliJ.
When I try to navigate, it goes to the declaration in the compiled jar file instead of navigating to the source.
It works fine when I remove the jar from the library dependencies in the project settings, I think because everything is then recompiled, so it works until the next change. Compiling with sbt works fine, so I suspect the Build.scala settings: the project dependencies may have an ordering issue. Here are the dependencies:
lazy val root = Project(id = "xx-main", base = file("."), settings = commonSettings)
.aggregate(utils, models, commons, dao, te)
.dependsOn(utils, models, commons, dao)
lazy val utils = Project(id = "xx-utils", base = file("xx-utils"))
.settings(commonSettings: _*)
lazy val commons = Project(id = "xx-commons", base = file("xx-commons"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val models =
Project(id = "xx-models", base = file("xx-models"), settings = commonSettings)
.dependsOn(utils)
lazy val dao = Project(id = "xx-dao", base = file("xx-dao"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val te = Project(id = "xx-te", base = file("xx-te"))
.settings(commonSettings: _*)
.dependsOn(utils, models, dao, commons)
I have the following project definition (simplified):
object B extends Build {
lazy val root = (project in file("."))
.aggregate(commons, processor)
lazy val commons = (project in file("commons"))
lazy val processor = (project in file("processor"))
.enablePlugins(BuildInfoPlugin, BuildTag)
}
and the BuildTag plugin (also simplified to show the issue at hand):
object BuildTag extends AutoPlugin {
override def requires = BuildInfoPlugin
override lazy val buildSettings = Seq(
packageOptions in (Compile, packageBin) += {
Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
}
)
}
When I load the project, I get an error like:
Reference to undefined setting:
{.}/compile:buildInfoPackage from {.}/compile:packageBin::packageOptions
It looks like sbt is trying to reference the setting outside of the scope where the plugin is using it. Why might that be and how can I fix it?
The problem here is not the multi-module nature of the build; it is reproducible in a single-module project as well.
However, instead of
override lazy val buildSettings = ...
you need to override projectSettings so that the buildInfoPackage setting can be resolved.
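A minimal sketch of the corrected plugin (assuming the standard sbt-buildinfo plugin and the same simplified setup as above):
import sbt._
import Keys._
import sbtbuildinfo.BuildInfoPlugin
import sbtbuildinfo.BuildInfoPlugin.autoImport._

object BuildTag extends AutoPlugin {
  override def requires = BuildInfoPlugin
  // projectSettings are added to every project that enables the plugin,
  // so buildInfoPackage is defined in the scope where it is referenced
  override lazy val projectSettings = Seq(
    packageOptions in (Compile, packageBin) += {
      Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
    }
  )
}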
I'm developing a system to discover subprojects at compile time. This works (see here). Now the only issue is that the subprojects' routes files are being ignored.
I know that the normal way to include a subproject's routes file is to hardcode a reference to it in the main routes file, but that would defeat my goal of dynamic subprojects.
I bet there's a way, in Build.scala, to discover a routes file and append it to the main routes file, but I'm a beginner and have no idea how to do it. Could you please help me out?
Alternatively, if there's no way to do it at compile time, maybe there's a way to load it at runtime? I know there's an API to intercept requests, so if we can read the routes we could implement dynamic routing that way. Is that a good idea?
Your submodules could implement their own routing DSL; see the example in the API docs. Alternatively, you could hook into the compile task in your root project and append all the routes to the main routes file programmatically.
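A minimal sketch of the routing-DSL approach, assuming Play 2.4+ (where the SIRD string-interpolating router DSL is available); the SubProjectRouter name and the /items route are purely illustrative:
import play.api.mvc._
import play.api.routing.{ Router, SimpleRouter }
import play.api.routing.sird._

// Each submodule exposes its routes as code instead of a conf/routes file
object SubProjectRouter extends SimpleRouter {
  def routes: Router.Routes = {
    case GET(p"/items/$id") => Action { Results.Ok(s"item $id") }
  }
}
Since Router.Routes is just a partial function, the root project can compose the submodules' routers programmatically (for example with orElse and Router.from) instead of listing them in the main routes file.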
In the end I had to write fragments of the routes file (one per sub-project, using a different extension such as subproject.routes) and then concatenate them all into a single routes file. The same has to be done for the application.conf file.
I did this in Build.scala:
import sbt._
import Keys._
import play._
import java.io._
object Build extends Build {
val commonSettings: Seq[Setting[_]] = Seq(
scalaVersion := "2.11.1"
)
IO.copyFile(file("conf/base.routes"), file("conf/routes"))
IO.copyFile(file("conf/base.application.conf"), file("conf/application.conf"))
lazy val libFolder = file("base-lib")
lazy val baseLib = processModule(libFolder)
lazy val modules = (file("modules") * DirectoryFilter).get.map { dir =>
processModule(dir).dependsOn(baseLib)
}
lazy val root = (project in file("."))
.enablePlugins(PlayJava)
.settings(
name := "mainProject",
version := "1.0"
)
.dependsOn(modules map (m => m: ClasspathDependency): _*)
.aggregate(modules map (m => m: ProjectReference): _*)
override lazy val projects = root +: baseLib +: modules
def processModule(dir: File): Project = {
val p = Project(dir.getName, dir).enablePlugins(PlayJava).settings(commonSettings: _*)
val mf = new File(dir, "conf/" + dir.getName + ".r")
val r = IO.read(mf)
IO.append(file("conf/routes"), r.toString)
val cf = new File(dir, "conf/" + dir.getName + ".application.conf")
val c = IO.read(cf)
IO.append(file("conf/application.conf"), c.toString)
p
}
}
I have an sbt build with two duplicated project configurations. See the example:
lazy val MyProjectOne = Project(id = "OneId", base = file("path/OneId"))
.dependsOn(moduleOne)
.settings(plugin.settings: _*)
.settings(defaultSettings: _*)
.settings(webSettings: _*)
.settings(libraryDependencies ++= commonTests)
lazy val MyProjectTwo = Project(id = "TwoId", base = file("path/TwoId"))
.dependsOn(moduleOne)
.settings(plugin.settings: _*)
.settings(defaultSettings: _*)
.settings(webSettings: _*)
.settings(libraryDependencies ++= commonTests)
It is obvious that MyProjectOne and MyProjectTwo differ only in the id and base properties.
Is there a way to refactor the sbt build like this:
lazy val template = Project()
.dependsOn(moduleOne)
.settings(plugin.settings: _*)
.settings(defaultSettings: _*)
.settings(webSettings: _*)
.settings(libraryDependencies ++= commonTests)
// Just as an example:
lazy val MyProjectOne = Project(id = "OneId", base = file("path/OneId")).extends(template)
lazy val MyProjectTwo = Project(id = "TwoId", base = file("path/TwoId")).extends(template)
How can I do that with sbt?
Also
With Maven I can define a parent project POM for that case. Is there an analog in sbt?
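For what it's worth, a common pattern (a minimal sketch reusing the identifiers from the question, such as moduleOne, plugin.settings, defaultSettings, webSettings and commonTests) is to factor the shared configuration into a helper function or a shared Seq[Setting[_]], which plays roughly the role of a Maven parent POM:
// Helper that applies the configuration shared by both projects
def templateProject(id: String, path: String): Project =
  Project(id = id, base = file(path))
    .dependsOn(moduleOne)
    .settings(plugin.settings: _*)
    .settings(defaultSettings: _*)
    .settings(webSettings: _*)
    .settings(libraryDependencies ++= commonTests)

lazy val MyProjectOne = templateProject("OneId", "path/OneId")
lazy val MyProjectTwo = templateProject("TwoId", "path/TwoId")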