I am building compound API docs using the sbt-unidoc plugin. I am linking to various projects on GitHub using sbt's RootProject constructor with URIs.
One of these projects has mutually exclusive sub-projects:
val foo = RootProject(uri("git://github.com/Foo/Bar.git#v1.0.0"))
That is, in this project foo, there are two sub-modules foo-db1 and foo-db2, which contain the same types but are built against different dependency library versions.
Now when I try to run unidoc, it fails because a type is defined twice from unidoc's perspective. From the documentation, I can see there is a filter function:
unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject -- inProjects(app)
But how do I create an identifier for a sub-module from my RootProject?
In other words, how would I do this:
unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject --
  inProjects(foo).SELECT_SUB_PROJECT(foo-db1)
?
The answer is found in this entry: Instead of using RootProject, one creates a number of ProjectRef instances.
import UnidocKeys._
val fooURI = uri("git://github.com/Foo/Bar.git#v1.0.0")
val fooDb1 = ProjectRef(fooURI, "foo-db1") // sub-module I don't want
val fooDb2 = ProjectRef(fooURI, "foo-db2") // sub-module I want
val root = (project in file("."))
  .settings( ... )
  .settings(unidocSettings: _*)
  .settings(site.settings ++ ghpages.settings: _*)
  .settings(
    unidocProjectFilter in (ScalaUnidoc, unidoc) :=
      inAnyProject -- inProjects(fooDb1)
  )
  .aggregate(fooDb2)
Note that putting only fooDb2 in the aggregate and leaving fooDb1 out does not have any effect. It seems that unidoc always includes all sub-modules regardless, so one has to use unidocProjectFilter to explicitly remove projects, even if they don't appear in the aggregate.
Related
In my build.sbt (nothing fancy) ....
val common: Project =
  project
    .in(file("common"))
    .enablePlugins(...)
    .settings(libraryDependencies ++= ...)

val root =
  project
    .in(file("."))
    .enablePlugins(...)
    .dependsOn(common)
    .aggregate(
      common,
      ....
    )
    .settings(...)
Problem
common does not compile before root, so the compilation fails, complaining that it cannot find those classes (in common).
FYI
I have tried a multitude of things that I came across while searching for information on this problem (Google, documentation, GitHub issues, etc.). No luck.
sbt v1.4.9 (Play project sbt-play v2.7.9)
The build.sbt is much bigger than what you see above (dependencies, tasks, etc.). Otherwise, I don't think there's anything special or tricky about it.
Help much appreciated!
To avoid initialisation problems, try declaring projects as lazy vals
lazy val common = ...
lazy val root = ...
instead of strict vals:
val common = ...
val root = ...
As a side note, use dependsOn to establish ordering between subprojects, not aggregate, because aggregate does not modify classpaths.
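For instance, a minimal sketch of the difference, reusing the common/root names from the question:

lazy val common = project.in(file("common"))

// dependsOn puts common's compiled classes on root's classpath and forces
// common to build first; aggregate only forwards invoked tasks (clean,
// test, ...) to common and establishes no classpath relationship.
lazy val root = project
  .in(file("."))
  .dependsOn(common)
  .aggregate(common)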
I'd agree with Mario Galic on using lazy val. In fact, I'd recommend using lazy val at all times in build.sbt.
If there is a cycle, like common referring back to root, one technique you can use is LocalProject("project name"), like LocalProject("root"):
lazy val common = (project in file("common"))
  .settings(
    Test / test := (LocalProject("root") / Test / test).value
  )
To compile the same source code with Scala.js and Scala JVM, the documentation says to use crossProject, for instance:
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := "0.1-SNAPSHOT"
  )

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
However, it looks like the same goal can be achieved by setting up the modules manually:
lazy val fooSettings = Seq(
  name := "foo",
  version := "0.1-SNAPSHOT",
  scalaSource in Compile := baseDirectory.value / ".." / "shared" / "src" / "main" / "scala"
)

lazy val fooJVM = project.in(file("jvm"))
  .settings(fooSettings: _*)

lazy val fooJS = project.in(file("js"))
  .settings(fooSettings: _*)
  .enablePlugins(ScalaJSPlugin)
Does crossProject do something important, or is it just a more convenient way to set things up?
It looks like, in your manual setup, you have separate jvm and js subdirectories under your "shared" area -- it doesn't look like they're the same code at all.
CrossProject is about letting you have a single directory structure, with the same code, that is compiled for both sides. Possibly with some small subdirectories that are specific to one side or another, but in general the focus is that most of the code is shared -- the jvm and js subdirectories, if present (it depends on the mode), are usually just shims to adapt the common code to those sides.
All that said, it is really just to make it more convenient -- I believe you could achieve the same results without CrossProject. But in the common case where you're sharing a lot of code, it's significantly more convenient...
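For reference, a sketch of the layout crossProject compiles from in its default (Full) mode, assuming the sbt-scalajs 0.6-era API used in the question:

// Directory layout picked up by crossProject (CrossType.Full):
//   shared/src/main/scala  -- code compiled for both platforms
//   jvm/src/main/scala     -- JVM-only shims
//   js/src/main/scala      -- JS-only shims
lazy val foo = crossProject.crossType(CrossType.Full).in(file(".")).
  settings(
    name := "foo",
    version := "0.1-SNAPSHOT"
  )

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js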
I'm working with a huge project with lots of subprojects, some of them with subprojects of their own. On top of that, I'd like some of them to be dynamic - given a List somewhere in the project build, I'd like to create one project for each of the elements.
For those reasons, having to define a lazy val for each project in build.sbt is very cumbersome. Is there another way to declare projects, like an addProject-like method we could call anywhere? Is there some sbt plugin that helps with that?
sbt uses macros to turn top-level vals into projects, so I don't think you will be able to escape that part. However, you can define all of your build logic in Project => Project functions (note that you also get composability "for free" with function composition):
def myConf: Project => Project =
  _.enablePlugins(ScalaJSPlugin)
    .settings(scalaVersion := "2.12.0")
Then simply use project.configure(myConf) for single-line project definitions:
lazy val subProject1 = project.configure(myConf)
lazy val subProject2 = project.configure(myConf)
lazy val subProject3 = project.configure(myConf)
lazy val subProject4 = project.configure(myConf)
...
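And because these are plain Project => Project functions, they compose; a sketch (otherConf is a made-up name):

def otherConf: Project => Project =
  _.settings(organization := "com.example")

// andThen chains the two transformations; configure also accepts varargs
lazy val subProject5 = project.configure(myConf andThen otherConf)
lazy val subProject6 = project.configure(myConf, otherConf)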
Scopes matter in sbt. And I'm completely OK with that. But there are also delegation rules that allow you to build a hierarchical structure of settings. I'd like to use them to bring extra settings to more specific scopes.
import sbt._
import Keys._

object TestBuild extends Build {
  val sourceExample = settingKey[Seq[String]]("example source for setting dependency")
  val targetExample = settingKey[Seq[String]]("example of a dependent setting")

  override lazy val settings = super.settings ++ Seq(
    sourceExample := Seq("base"),
    targetExample := "extended" +: sourceExample.value,
    sourceExample in Test += "testing"
  )
}
The example gives me unexpected output:
> show compile:sourceExample
[info] List(base)
> show test:sourceExample
[info] List(base, testing)
> show compile:targetExample
[info] List(extended, base)
> show test:targetExample
[info] List(extended, base)
I expect test:targetExample to be List(extended, base, testing), not List(extended, base). Once I got the result, I immediately figured out why exactly it works as shown: test:targetExample delegates to *:targetExample and gets the calculated value, but not the rule for calculating it in the nested scope.
This behavior brings two difficulties for me when writing my own plugin. As a plugin developer, I have the extra work of defining the same rules in every scope. And as a user, I have to memorize the scope definitions of internal tasks to use them correctly.
How can I overcome this inconvenience? I'd like to introduce settings with call-by-name semantics instead of call-by-value. What tricks might work for that?
P.S. libraryDependencies in Test looks much more concise than using % test.
I should make clear that I perfectly understand that sbt derives values just as described in the documentation. It works as its creator intended it to work.
But why should I obey these rules? I find them completely counter-intuitive. sbt introduces inheritance semantics that actually work unlike how inheritance is usually defined. When you write
trait A { lazy val x : Int = 5 }
trait B extends A { lazy val y : Int = x * 2}
trait C extends A { override lazy val x : Int = 3 }
you expect (new B with C).y to be 6, not 10. Knowing that it would actually be 10 allows you to use this kind of inheritance correctly, but leaves you with the desire to find more conventional means of implementing inheritance. You may even write your own implementation based on a name-to-value dictionary. And you may proceed further according to the tenth rule of programming.
So I'm searching for a hack that would bring the inheritance semantics in accordance with the common one. As a starting point, I may suggest writing a command that scans all settings and pushes them from parents to children explicitly, and then invoking this command automatically each time sbt runs.
But that seems too dirty to me, so I'm curious whether there is a more graceful way to achieve similar semantics.
The reason for the "incorrect" value is that targetExample depends on sourceExample in Compile scope as in:
targetExample := "extended" +: sourceExample.value
If it should use the sourceExample value from the Test scope, use in Test to be explicit about your wish, as follows:
targetExample := "extended" +: (sourceExample in Test).value
Use inspect to see the dependency chain.
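For example, to trace where test:targetExample gets its value:
> inspect test:targetExample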
BTW, I strongly advise using build.sbt file for such a simple build definition.
You could have default settings and reuse them in different configurations, as described in Plugins Best Practices - Playing nice with configurations. I believe this should give you semantics similar to what you're looking for.
You can define your base settings and reuse them in different configurations:
import sbt._
import Keys._

object TestBuild extends Build {
  val sourceExample = settingKey[Seq[String]]("example source for setting dependency")
  val targetExample = settingKey[Seq[String]]("example of a dependent setting")

  override lazy val settings = super.settings ++
    inConfig(Compile)(basePluginSettings) ++
    inConfig(Test)(basePluginSettings ++ Seq(
      sourceExample += "testing" // we are already "in Test" here
    ))

  lazy val basePluginSettings: Seq[Setting[_]] = Seq(
    sourceExample := Seq("base"),
    targetExample := "extended" +: sourceExample.value
  )
}
PS. Since you're talking about writing your own plugin, you may also want to look at the new way of writing sbt plugins, namely AutoPlugin, as the old mechanism is now deprecated.
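For reference, a minimal AutoPlugin skeleton (the plugin name and trigger choice are illustrative):

import sbt._
import Keys._

object MyPlugin extends AutoPlugin {
  override def trigger = allRequirements // enable on every project

  object autoImport {
    val sourceExample = settingKey[Seq[String]]("example source for setting dependency")
  }
  import autoImport._

  override def projectSettings: Seq[Setting[_]] = Seq(
    sourceExample := Seq("base")
  )
}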
I have a simple multi project, whereby the root aggregates projects a and b. The root project loads this plugin I'm writing that is supposed to allow easy integration with the build system in our company.
lazy val a = project in file("a")
lazy val b = project in file("b")
Now, I'd like to define some settings in the plugin that don't make sense in the root project, and can have different values for each sub-project. However, if I just define them like this:
object P extends Plugin {
  val kind = settingKey[Kind]("App or Lib?")
  val ourResolver = settingKey[Resolver]("...")

  override def projectSettings = Seq(
    // I want this to only be defined in a and b, where `kind` is defined
    ourResolver <<= kind { k => new OurInternalResolver(k) }
  )
}
then sbt will complain that ourResolver cannot be defined because kind is not defined in the root project.
Is there a way to specify a scope for this setting (ourResolver), such that the setting would be defined in every aggregated project except the root project?
Or do I have to make it a SettingKey[Option[_]] and set it to None by default?
Edit: I have quite a large set of settings which progressively depend on kind and then ourResolver and so on, and these settings should only be defined where (read: "in the project scopes where") kind is defined. I've added ourResolver to the example code to reflect this.
However, if I just define them like....
There's nothing really magical about setting keys. A key is just a String entry tied to a type, so you can define your key the way you're doing now just fine.
Is there a way to specify a scope for this setting, such that the setting would be defined in every aggregated project except the root project?
A setting consists of the following four things:
scope (project, config, task)
a key
dependencies
an ability to evaluate itself (def evaluate(Settings[Scope]): T)
If you did not specify otherwise, your settings are already scoped to a particular project in the build definition.
lazy val a = (project in file("a")).
  settings(commonSettings: _*).
  settings(
    name := "a",
    kind := Kind.App,
    libraryDependencies ++= aDeps(scalaVersion.value)
  )

lazy val b = (project in file("b")).
  settings(commonSettings: _*).
  settings(
    name := "b",
    kind := Kind.Lib,
    libraryDependencies ++= bDeps(scalaVersion.value)
  )

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "foo",
    publishArtifact := false
  ).
  aggregate(a, b)
In the above, the kind := Kind.App setting is scoped to project a. So that would answer your question literally.
sbt will complain that kind is not defined in the root project.
This is the part I'm not clear about. Does this happen when you load the build, or when you type kind into the sbt shell? If you're seeing it at startup, that likely means you have a task or setting that's trying to depend on the kind key. Don't load that setting, or reimplement it as something that doesn't use kind.
Another way to avoid this issue may be to give up on root aggregation: switch into a subproject and run tasks there, or construct explicit aggregation using ScopeFilter.
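A sketch of the ScopeFilter route, assuming you want a root-level task that compiles only a and b (the compileAll key is made up):

val compileAll = taskKey[Unit]("compile a and b without root aggregation")

lazy val subFilter = ScopeFilter(inProjects(a, b), inConfigurations(Compile))

// added to root's settings:
compileAll := {
  compile.all(subFilter).value // runs compile across a and b only
  ()
}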
I ultimately managed to do this by leveraging Def.derive, a method that creates DerivedSettings. These settings are then automatically expanded by sbt in every scope where their dependencies are defined.
Therefore I can now write my plugin as:
object P extends Plugin {
  val kind = settingKey[Kind]("App or Lib?")
  val ourResolver = settingKey[Resolver]("...")

  def derivedSettings = Seq(
    // I want this to only be defined in a and b, where `kind` is defined
    ourResolver <<= kind { k => new OurInternalResolver(k) }
  ) map (s => Def.derive(s, trigger = _ != streams.key))

  override def projectSettings = derivedSettings ++ Seq( /* other stuff */ )
}
And if I define kind in a and b's build.sbt, then ourResolver will also end up being defined in those scopes.
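For example (a sketch; Kind.App and Kind.Lib are the values from the earlier example):

// a/build.sbt
kind := Kind.App

// b/build.sbt
kind := Kind.Lib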