I'm brand new to sbt (using 1.x) and trying to get this setup working as intended, but I seem to be missing something.
I have a nested directory structure as follows:
ProjectRoot/
  AggregationPoint/
    DockerSubproject/
Each directory has its own build.sbt, and the root is something like this:
lazy val root = (project in file("."))
.settings(name...)
.aggregate(aggregationPoint, fizz, buzz)
lazy val aggregationPoint = (project in file("AggregationPoint"))
.settings(commonSettings)
.enablePlugins(DockerPlugin, GitVersioning)
// other projects using commonSettings are omitted to avoid distraction
lazy val commonSettings = Seq(...)
DockerSubproject builds an image, and that part works fine. However, the configuration for tagging the image correctly (defined in commonSettings) isn't being passed from ProjectRoot, through AggregationPoint, into DockerSubproject. If I remove AggregationPoint from the equation and point at DockerSubproject directly, it works, so I know the config values themselves are correct and that DockerSubproject picks them up.
AggregationPoint/build.sbt looks something like this:
lazy val aggregationPoint = (project in file("."))
.settings(name...)
.aggregate(dockerSubproject)
lazy val dockerSubproject = (project in file("DockerSubproject"))
.enablePlugins(DockerPlugin, GitVersioning)
Do I need to access the passed-in settings somehow and explicitly call .settings(...) again? Is what I want to do even possible? Is there a best-practices way of handling this sort of thing?
Thank you for your time.
Related
In my build.sbt (nothing fancy) ....
val common: Project =
project
.in(file("common"))
.enablePlugins(...)
.settings(libraryDependencies ++= ...)
val root =
project
.in(file("."))
.enablePlugins(...)
.dependsOn(common)
.aggregate(
common,
....
)
.settings(...)
Problem
common does not compile before root, so the compilation fails, complaining that it cannot find the classes defined in common.
FYI
I have tried a multitude of things that I came across while searching for information on this problem (Google, documentation, GitHub issues, etc.). No luck.
sbt v1.4.9 (Play project sbt-play v2.7.9)
The build.sbt is much bigger than what you see above (dependencies, tasks, etc.). Otherwise, I don't think there's anything special or tricky about it.
Help much appreciated!
To avoid initialisation problems, try declaring projects as lazy vals:
lazy val common = ...
lazy val root = ...
instead of strict vals:
val common = ...
val root = ...
As a side note, use dependsOn to establish ordering between subprojects, not aggregate, because aggregate does not modify classpaths.
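A minimal sketch of the difference, based on the build in the question:

```scala
// aggregate only forwards task execution: running `compile` on root
// also runs it on common, but root's classpath is unchanged.
// dependsOn puts common's classes on root's classpath and forces
// common to compile before root.
lazy val common = project.in(file("common"))

lazy val root = project
  .in(file("."))
  .dependsOn(common) // classpath dependency: fixes the missing classes
  .aggregate(common) // task aggregation: convenient, but does not order compilation
```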
I'd agree with Mario Galic on using lazy val. In fact, I'd recommend using lazy val at all times in build.sbt.
If there is a cycle, like common referring back to root, one technique is to reference the project with LocalProject("project name"), for example LocalProject("root"):
lazy val common = (project in file("common"))
.settings(
Test / test := (LocalProject("root") / Test / test).value
)
To compile the same source code with both Scala.js and Scala on the JVM, the documentation says to use crossProject; for instance:
lazy val foo = crossProject.in(file(".")).
settings(
name := "foo",
version := "0.1-SNAPSHOT"
)
lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
However, it looks like the same goal can be achieved by setting up the modules manually:
lazy val fooSettings = Seq(
name := "foo",
version := "0.1-SNAPSHOT",
scalaSource in Compile := baseDirectory.value / ".." / "shared" / "src" / "main" / "scala"
)
lazy val fooJVM = project.in(file("jvm"))
.settings(fooSettings: _*)
lazy val fooJS = project.in(file("js"))
.settings(fooSettings: _*)
.enablePlugins(ScalaJSPlugin)
Does crossProject do something important, or is it just a more convenient way to set things up?
It looks like, in your manual setup, you have separate jvm and js subdirectories under your "shared" area -- it doesn't look like they're the same code at all.
CrossProject is about letting you have a single directory structure, with the same code, that is compiled for both sides. Possibly with some small subdirectories that are specific to one side or another, but in general the focus is that most of the code is shared -- the jvm and js subdirectories, if present (it depends on the mode), are usually just shims to adapt the common code to those sides.
All that said, it is really just to make it more convenient -- I believe you could achieve the same results without CrossProject. But in the common case where you're sharing a lot of code, it's significantly more convenient...
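To make the shared-code layout concrete, here is a rough sketch of the default (Full) crossProject directory structure and build, in the same style as the question's snippet (project name and settings are illustrative):

```scala
// Directory layout that a Full crossProject expects by default:
//   shared/src/main/scala  -> compiled for BOTH the JVM and JS sides
//   jvm/src/main/scala     -> JVM-only code (e.g. file IO shims)
//   js/src/main/scala      -> JS-only code (e.g. DOM shims)
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := "0.1-SNAPSHOT"
  ).
  jvmSettings(/* settings applied only to the JVM side */).
  jsSettings(/* settings applied only to the JS side */)

lazy val fooJVM = foo.jvm
lazy val fooJS  = foo.js
```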
I'm working with a huge project with lots of subprojects, some of them with subprojects of their own. On top of that, I'd like some of them to be dynamic - given a List somewhere in the project build, I'd like to create one project for each of the elements.
For those reasons, having to define a lazy val for each project in build.sbt is very cumbersome. Is there another way to declare projects, like an addProject-style method we could call anywhere? Is there some sbt plugin that helps with that?
sbt uses macros to turn top-level vals into projects, so I don't think you will be able to escape that part. However, you can define all of your build configuration in Project => Project functions (note that you also get composability "for free" via function composition):
def myConf: Project => Project =
_.enablePlugins(ScalaJSPlugin)
.settings(scalaVersion := "2.12.0")
Then simply use project.configure(myConf) for single-line project definitions:
lazy val subProject1 = project.configure(myConf)
lazy val subProject2 = project.configure(myConf)
lazy val subProject3 = project.configure(myConf)
lazy val subProject4 = project.configure(myConf)
...
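Since these are ordinary Project => Project functions, they compose; a sketch with hypothetical configuration names (configure also accepts several transformations as varargs):

```scala
// Hypothetical building blocks, layered per project
def baseConf: Project => Project =
  _.settings(scalaVersion := "2.12.0")

def jsConf: Project => Project =
  _.enablePlugins(ScalaJSPlugin)

lazy val subProjectJvm = project.configure(baseConf)
lazy val subProjectJs  = project.configure(baseConf, jsConf)
```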
I am trying to accomplish something like this:
lazy val customFlag = settingKey[Boolean]("My custom flag")
lazy val depOne = project ...
lazy val depTwo = project ...
lazy val myproject = project
.settings(
customFlag := false)
.dependsOn(if (customFlag) depOne else depTwo)
The idea being that I could then use set customFlag := true in the sbt console to change whether myproject depends on sub-project one or two.
I have a hunch at this point that the answer is that this is not possible. But it would be nice to get confirmation or an alternative to accomplish something similar.
No, it's not possible to use a setting key in dependsOn. Project dependencies are fixed when the build is loaded, before any settings are evaluated.
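One workaround (my assumption, not part of the answer above) is to branch on a JVM system property, which is read at build load time; changing it requires a reload rather than a set:

```scala
// Pass -DuseDepOne=true when launching sbt; after changing the
// property, run `reload` so the build is re-evaluated with the
// new dependency.
lazy val myproject = project
  .dependsOn(
    if (sys.props.get("useDepOne").contains("true")) depOne else depTwo
  )
```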
I have a simple multi project, whereby the root aggregates projects a and b. The root project loads this plugin I'm writing that is supposed to allow easy integration with the build system in our company.
lazy val a = project in file("a")
lazy val b = project in file("b")
Now, I'd like to define some settings in the plugin that don't make sense in the root project and can have different values for each sub-project. However, if I just define them like
object P extends Plugin {
val kind = settingKey[Kind]("App or Lib?")
val ourResolver = settingKey[Resolver]("...")
override def projectSettings = Seq(
// I want this to only be defined in a and b, where `kind` is defined
ourResolver <<= kind { k => new OurInternalResolver(k) }
)
}
then sbt will complain that ourResolver cannot be defined because kind is not defined in the root project.
Is there a way to specify a scope for this setting (ourResolver), such that the setting would be defined in every aggregated project except the root project?
Or do I have to make it a SettingKey[Option[_]] and set it to None by default?
Edit: I have quite a large set of settings which progressively depend on kind and then ourResolver and so on, and these settings should only be defined where (read: "in the project scopes where") kind is defined. Added ourResolver to example code to reflect this.
However, if I just define them like....
There's nothing really magical about setting keys. A key is just a string entry tied to a type, so you can define your key the way you're doing now just fine.
Is there a way to specify a scope for this setting, such that the setting would be defined in every aggregated project except the root project?
A setting consists of the following four things:
scope (project, config, task)
a key
dependencies
an ability to evaluate itself (def evaluate(Settings[Scope]): T)
If you did not specify otherwise, your settings are already scoped to a particular project in the build definition.
lazy val a = (project in file("a")).
settings(commonSettings: _*).
settings(
name := "a",
kind := Kind.App,
libraryDependencies ++= aDeps(scalaVersion.value)
)
lazy val b = (project in file("b")).
settings(commonSettings: _*).
settings(
name := "b",
kind := Kind.Lib,
libraryDependencies ++= bDeps(scalaVersion.value)
)
lazy val root = (project in file(".")).
settings(commonSettings: _*).
settings(
name := "foo",
publishArtifact := false
).
aggregate(a, b)
In the above, the kind := Kind.App setting is scoped to project a. So that would answer your question literally.
sbt will complain that kind is not defined in the root project.
This is the part where I'm not clear on what's going on. Does this happen when you load the build, or when you type kind into the sbt shell? If you're seeing it at startup, that likely means you have a task or setting that's trying to depend on the kind key. Don't load that setting, or reimplement it as something that doesn't use kind.
Another way to avoid this issue may be to give up on root aggregation: switch into each subproject and run tasks there, or construct explicit aggregation using ScopeFilter.
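The ScopeFilter alternative could look roughly like this, reusing the a and b projects from the question (testAll is a hypothetical task name):

```scala
// Explicitly aggregate `test` across a and b instead of relying on
// the root project's aggregation.
lazy val testAll = taskKey[Unit]("Run tests in a and b")

// Select the Test configuration of exactly these two projects
lazy val abTests = ScopeFilter(inProjects(a, b), inConfigurations(Test))

lazy val root = (project in file(".")).
  settings(
    testAll := {
      // .all runs the task in every scope matched by the filter
      val _ = test.all(abTests).value
    }
  )
```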
I've managed to ultimately do this by leveraging Def.derive, a method that creates DerivedSettings. These settings are then automatically expanded by sbt in every scope where their dependencies are defined.
Therefore I can now write my plugin as:
object P extends Plugin {
val kind = settingKey[Kind]("App or Lib?")
val ourResolver = settingKey[Resolver]("...")
def derivedSettings = Seq(
// I want this to only be defined in a and b, where `kind` is defined
ourResolver <<= kind { k => new OurInternalResolver(k) }
) map (s => Def.derive(s, trigger = _ != streams.key))
override def projectSettings = derivedSettings ++ Seq( /* other stuff */ )
}
And if I define kind in a and b's build.sbt, then ourResolver will also end up being defined in those scopes.