Per-project tasks in SBT (Scala)

My .sbt file looks something like this:
lazy val common = (project in file("./common"))
.settings(
// other settings
)
lazy val one = (project in file("./one"))
.dependsOn(common)
.settings(
// other settings
addCommandAlias("build", ";clean;assembly;foo")
)
lazy val two = (project in file("./two"))
.dependsOn(common)
.settings(
// other settings
addCommandAlias("build", ";clean;compile;bar")
)
Additionally, I have two tasks, foo and bar, which are only valid in their respective projects.
My tests show that when I call build, both aliases are run, no matter which project I am in.
As for the tasks, the keys can only be defined at the top level of the .sbt file anyway (e.g. val foo = taskKey[Unit]("Does foo")).
I want to know how to correctly implement tasks and command aliases at the project level.
Is that possible?

The problem you are having is with how aliases work in sbt. When an alias is defined, it is attached to the GlobalScope in the form of a command and is therefore available to all sub-projects. When you define the same alias multiple times with addCommandAlias, the last definition wins, because each call removes any previously created alias with the same name.
You can see the defined aliases by running sbt alias; it will show that there is only one build alias.
You can achieve a per-project build by introducing it as a task key:
lazy val build = taskKey[Unit]("Builds")
lazy val root = (project in file("."))
.aggregate(one, two) // THIS IS NEED TO MAKE build TASK AVAILABLE IN ROOT
lazy val common = (project in file("./common"))
.settings(
//SOME KEYS
)
lazy val one = (project in file("./one"))
.dependsOn(common)
.settings(
build := {
Def.sequential(clean, Compile / compile).value
}
)
lazy val two = (project in file("./two"))
.dependsOn(common)
.settings(
build := {
Def.sequential(clean, assembly).value
}
)
EDIT: Added Def.sequential as suggested by @laughedelic in the comments.
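If you also want to fold the question's foo and bar tasks into the per-project build, the same pattern extends naturally. A minimal sketch building on the snippet above (the task bodies are placeholders, and assembly assumes sbt-assembly is on the build, as in the question):

lazy val foo = taskKey[Unit]("Does foo")
lazy val bar = taskKey[Unit]("Does bar")

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    foo := { streams.value.log.info("foo for project one") },
    build := Def.sequential(clean, assembly, foo).value
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    bar := { streams.value.log.info("bar for project two") },
    build := Def.sequential(clean, Compile / compile, bar).value
  )

Because build, foo and bar are ordinary tasks scoped to each project, sbt one/build and sbt two/build run only the steps defined for that project.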

Related

build.sbt - iteration over sub projects for common settings in monorepo

I'm implementing a monorepo using SBT. I would like to iterate over my subprojects in order to initialize them (as they have the same configuration) and prevent code duplication.
In my build.sbt:
lazy val root = (project in file("."))
.aggregate(projects: _*)
.settings(
crossScalaVersions := Nil,
publish / skip := true
)
lazy val projects = Seq("projectA", "projectB", "projectC")
.map((projectName: String) => (project in file(projectName))
.settings(
name := projectName,
commonSettings,
libraryDependencies ++= ModulesDependencies.get(projectName))
.project
)
I'm getting the error:
error: project must be directly assigned to a val, such as `val x = project`. Alternatively, you can use `sbt.Project.apply`
Based on the error message, I also tried to use sbt.Project.apply(projectName, file(projectName)).settings(...) instead, but I'm also facing some funny errors.
From what I understand, it seems that SBT expects me to declare as lazy val projectA = (project in file("projectA")).settings(...), which works fine but I would have to duplicate this code for all my sub projects.
Is the iteration I'm trying to implement even possible?
A utility method might help with some of the duplication, for example:
def createProject(projectName: String) = {
  Project(projectName, file(projectName))
    .settings(
      name := projectName,
      commonSettings,
      libraryDependencies ++= ModulesDependencies.get(projectName)
    )
}

lazy val projectA = createProject("projectA")
lazy val projectB = createProject("projectB")
lazy val projectC = createProject("projectC")

lazy val root = (project in file("."))
  .aggregate(projectA, projectB, projectC)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )

Different projects on the same base directory (or sharing a single file between projects)

My initial setup had two separate projects (sbt 1.2.6):
a web app (huge codebase, lots of dependencies, slow compile)
a command-line app (basically one file with 3-4 separate dependencies)
Then a feature request came in: we should show the "valid" values in the command-line app. The valid values are in an enum in the web app. So I fired up the sbt documentation and came up with an idea that looked like this:
// main webapp
lazy val core = project
  .in(file("."))
  .withId("core") // I tried this just in case, it didn't help...
  // ... here come all the plugins and deps

// my hack to get a single-file compile
lazy val `feature-signer-helper` = project
  .in(file("."))
  .withId("feature-signer-helper")
  .settings(
    target := { baseDirectory.value / "target" / "features" },
    sources in Compile := {
      ((scalaSource in Compile).value ** "Features.scala").get
    }
  )

// the command line app
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .dependsOn(`feature-signer-helper`)
  .settings(
    libraryDependencies ++= signerDeps
  )
The problem is that, seemingly, whichever lazy val xxx = project.in(file(y)) comes last becomes the only project for the y directory.
Also, I don't want to move that one file into a separate directory structure... And logically the command-line app and the web app do not depend on each other; they have different dependencies (and very different build times).
My questions are:
is there any quick win in this situation? (Worst case, I will copy the file...)
why do we have these rich project and source settings if I can't bind them to the same directory?
EDIT:
The code below can copy the needed file (if you have the same directory structure). I'm not super happy with it, but it works. I'm still interested in other approaches.
import sbt._
import Keys._

object FeaturesCopyTask {
  val featuresCopyTask = {
    sourceGenerators in Compile += Def.task {
      val outFile = (sourceManaged in Compile).value / "Features.scala"
      val rootDirSrc = (Compile / baseDirectory).value / ".." / "src"
      val inFile: File = (rootDirSrc ** "Features.scala").get().head
      IO.copyFile(inFile, outFile, preserveLastModified = true)
      Seq(outFile)
    }.taskValue
  }
}
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .settings(
    libraryDependencies ++= signerDeps,
    FeaturesCopyTask.featuresCopyTask
  )
I would have the tree be something more like
+- core/
+- webapp/
+- cli/
core is the small amount of code (mostly model-type things) that webapp and cli have in common
webapp depends on core (among many other things)
cli depends on core (and not much else)
So the build.sbt would be something like
lazy val core = (project in file("core"))
// yadda yadda yadda
lazy val webapp = (project in file("webapp"))
.dependsOn(core)
// yadda yadda yadda
lazy val cli = (project in file("cli"))
.dependsOn(core)
// yadda yadda yadda
lazy val root = (project in file("."))
.aggregate(
core,
webapp,
cli
)
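With that layout, the shared enum simply lives in core and both webapp and cli pick it up through their normal dependsOn classpath, so no file copying is needed. A minimal sketch (package, object and value names are made up for illustration):

// core/src/main/scala/com/example/core/Features.scala
package com.example.core

// Hypothetical stand-in for the enum mentioned in the question.
object Features {
  val validValues: Seq[String] = Seq("featureA", "featureB", "featureC")
}

// cli/src/main/scala/com/example/cli/Main.scala
package com.example.cli

import com.example.core.Features

object Main extends App {
  // The command-line app can list the valid values directly.
  println(s"Valid values: ${Features.validValues.mkString(", ")}")
}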

How can I specify a mainClass in build.sbt that resides in another module?

For some reason, our project got reorganized with the main class thrown into another module.
I've specified the mainClass as below in the build.sbt but I still get a class not found error:
mainClass in Compile := Some("com.so.questions.sbt.Main")
However, this is bound to fail since it's going to look for the Main class in the src folder, while this module lives outside of (as a sibling of) src:
MyScalaProject
+-MyModule
|+-src
| +-com.so.questions.sbt
| +-Main
|+-build.sbt <-- build.sbt specific to this module, currently blank
+-src
| +-<other folders>
+-build.sbt <-- build.sbt currently housing all config
How can I change the project scope in build.sbt to find and correctly load the main class?
That is, is it possible to do sbt run at the top level and have the main class be found with this structure?
It should work: the FQCN specified for mainClass should be location-independent, to my understanding.
The real question is how you are loading your sub-module.
Here are some sbt definitions that should help point you in the right direction (replace the <> tags with your own project IDs):
// Define a submodule ref to be able to include it as a dependency
lazy val subModuleRef = ProjectRef(file("MyModule"), <MyModule SBT NAME>)

// Define a submodule project to be able to orchestrate its build
lazy val subModule = Project(
  id = <MyModule SBT NAME>,
  base = file("MyModule")
).addSbtFiles(file("build.sbt"))

// Define the top-level project, depending on subModuleRef for code
// inclusion and aggregating subModule for build orchestration
lazy val scalaProject = Project(
  id = <MyScalaProject NAME>,
  base = file("."),
  aggregate = Seq(subModule),
  settings = commonSettings
).dependsOn(subModuleRef)
Let's say that you have the MyModule module/folder containing the main class and some other module called MyCoreModule (just to illustrate the whole build.sbt):
// any stuff that you want to share between modules
lazy val commonSettings = Seq(
  scalaVersion := "2.12.8",
  version := "1.0-SNAPSHOT"
)

lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(
    name := "parent-module"
  )
  .aggregate(core, app)
  .dependsOn(app) // <-- here is the config that will allow you to run "sbt run" from the root project

lazy val core = project.in(file("MyCoreModule"))
  .settings(commonSettings: _*)
  .settings(
    name := "core"
  )

lazy val app = project.in(file("MyModule"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(
    name := "app"
  )

// define your mainClass from the "app" module
mainClass in Compile := (mainClass in Compile in app).value
Btw, sbt.version=1.2.7
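On sbt 1.x the same delegation can also be written with the newer slash syntax; this is just a change of notation for the last line above, not a different mechanism:

// root's Compile/mainClass delegates to the one defined in the app module
Compile / mainClass := (app / Compile / mainClass).value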

sbt-assembly does not pick up configuration specific settings

I am updating an old 0.7.x build file from the tool sbt that thankfully removed the reference to "simple" from its name in the meantime.
Something that once worked no longer does. I had different config entries for platform-specific assembly tasks. These include specific filters, which for some reason are now called assemblyExcludedJars instead of excludedJars, and specific jar names, which for some reason are now called assemblyJarName instead of jarName.
Basically:
val Foo = config("foo") extend Compile
lazy val assemblyFoo = TaskKey[File]("assembly-foo")
lazy val root = Project(id = "root", base = file("."))
// .configs(Foo) // needed? doesn't change anything
.settings(
inConfig(Foo)(inTask(assembly) {
assemblyJarName := "wtf.jar"
}),
scalaVersion := "2.11.7",
assemblyFoo <<= assembly in Foo
)
Now I would expect that running sbt assembly-foo or sbt foo:assembly would produce a file wtf.jar, but I am getting the default root-assembly-0.1-snapshot.jar. The same problem happens when I try to specify assemblyExcludedJars: they are simply ignored and the jars are still included.
If I remove the inConfig it works:
lazy val root = Project(id = "root", base = file("."))
.settings(
inTask(assembly) {
assemblyJarName := "wtf.jar"
},
scalaVersion := "2.11.7",
assemblyFoo <<= assembly in Foo
)
But now I cannot use different jar names for different configurations (which is the whole point).
As described in a blog post by one of sbt's authors and the author of sbt-assembly, this should work. It was also described in this Stack Overflow question. But the example requires an antique version of sbt-assembly (0.9.0 from 2013, before auto plugins etc.) and doesn't seem to apply to current versions.
If one defines a new configuration, one has to redefine (?) all the tasks one is about to use. Apparently, for sbt-assembly this means adding baseAssemblySettings:
val Foo = config("foo") extend Compile
lazy val assemblyFoo = TaskKey[File]("assembly-foo")
lazy val root = Project(id = "root", base = file("."))
.settings(
inConfig(Foo)(baseAssemblySettings /* !!! */ ++ inTask(assembly) {
jarName := "wtf.jar"
}),
scalaVersion := "2.11.7",
assemblyFoo := (assembly in Foo).value
)
Tested with sbt 0.13.9 and sbt-assembly 0.14.1.
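The configuration-specific excluded jars from the question should follow the same pattern; an untested sketch along the same lines (the name filter is made up for illustration):

lazy val root = Project(id = "root", base = file("."))
  .settings(
    inConfig(Foo)(baseAssemblySettings ++ inTask(assembly)(Seq(
      assemblyJarName := "wtf.jar",
      assemblyExcludedJars := {
        // hypothetical filter: leave out any jar whose file name starts with "unwanted"
        fullClasspath.value.filter(_.data.getName.startsWith("unwanted"))
      }
    ))),
    scalaVersion := "2.11.7",
    assemblyFoo := (assembly in Foo).value
  )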

How to change many settings in scope at once in build.sbt?

I want to group settings by the context in which they apply in my build.sbt. Instead of
foo in ThisBuild := bar
baz in ThisBuild := bar
bar in Compile := 5
quux in Compile := 7
I'd like something that roughly looks like (don't care much about the specific syntax/API, just the conceptual grouping and lack of repetition)
in(ThisBuild) {
  foo := bar
  baz := bar
}

in(Compile) {
  bar := 5
  quux := 7
}
Is such a thing possible/clean/idiomatic?
For Compile configuration use inConfig as follows:
inConfig(Compile) {
  Seq(
    bar := 5,
    quux := 7
  )
}
For ThisBuild scope use inScope as follows:
inScope(ThisScope.copy(project = Select(ThisBuild))) {
  Seq(
    foo := bar,
    baz := bar
  )
}
Note that all the in* methods take a Seq of settings.
Note also that since they appear in a build.sbt file, they're automatically scoped to the project the build definition belongs to. The in ThisBuild settings, obviously, belong to the build.
As to how idiomatic using the in* methods in *.sbt files is, I have never seen them used there. Since they're pretty new (introduced in sbt 0.13), people may not have caught up with them yet.
The trait ProjectExtra offers the inConfig, inTask and inScope methods. As @copumpkin pointed out in the comments, there's no inProject method to complement the others. I have no idea why, but with inScope it is not necessary, and strictly speaking neither are inConfig and inTask, since they pretty much map a scope to a config or a task with the other axes unchanged.
For reference, this is the definition of Scope:
final case class Scope(project: ScopeAxis[Reference], config: ScopeAxis[ConfigKey], task: ScopeAxis[AttributeKey[_]], extra: ScopeAxis[AttributeMap])
As a supplement to Jacek's answer, I'd encourage you to see sbt 0.13.5's JvmPlugin.scala and Defaults.scala. JvmPlugin is an auto plugin that's enabled by default and adds basic JVM tasks like compile:compile and test:test.
Defaults.defaultConfigs
For instance, Defaults.defaultConfigs adds both compile:compile and test:test as follows:
lazy val defaultConfigs: Seq[Setting[_]] =
  inConfig(Compile)(compileSettings) ++
  inConfig(Test)(testSettings) ++
  inConfig(Runtime)(Classpaths.configSettings)
Defaults.testSettings
As Jacek noted, the expected type of the passed-in settings is Seq[Def.Setting[_]], like:
lazy val testSettings: Seq[Setting[_]] = configSettings ++ testTasks
Defaults.testTasks
These sequences can be constructed by appending reused sequences that are scoped in different ways, as in testTasks:
lazy val testTasks: Seq[Setting[_]] =
  testTaskOptions(test) ++
  testTaskOptions(testOnly) ++
  testTaskOptions(testQuick) ++
  testDefaults ++ Seq(
    ....
    test := {
      val trl = (testResultLogger in (Test, test)).value
      val taskName = Project.showContextKey(state.value)(resolvedScoped.value)
      trl.run(streams.value.log, executeTests.value, taskName)
    },
    testOnly <<= inputTests(testOnly),
    testQuick <<= inputTests(testQuick)
  )
Project scoping and auto plugin
If you're using a multi-project build.sbt, project scoping is done by passing the settings to the project's settings method.
lazy val root = (project in file("."))
lazy val foo = project.
settings(yourSettings: _*).
settings(someOtherSettings: _*)
lazy val bar = project.
settings(yourSettings: _*).
settings(someOtherSettings: _*)
You can define def yourSettings in build.sbt (or in project/foo.scala). When passing the settings explicitly starts to feel tedious, you can define an auto plugin similar to sbt's JvmPlugin. This would let you write:
lazy val root = (project in file("."))
lazy val foo = project.
enablePlugins(FooPlugin, BarPlugin)
lazy val bar = project.
enablePlugins(FooPlugin, BarPlugin)
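For reference, a minimal sketch of what such an auto plugin could look like; it would live under project/ (e.g. project/FooPlugin.scala), and the settings it injects here are made up for illustration:

// project/FooPlugin.scala
import sbt._
import Keys._

object FooPlugin extends AutoPlugin {
  // Not triggered automatically: projects opt in with enablePlugins(FooPlugin).
  // Switching this to allRequirements gives the triggered variant mentioned in the edit below.
  override def trigger = noTrigger
  override def requires = plugins.JvmPlugin

  // Settings injected into every project that enables this plugin.
  override def projectSettings: Seq[Setting[_]] = Seq(
    organization := "com.example",         // hypothetical shared setting
    scalacOptions ++= Seq("-deprecation")  // hypothetical shared setting
  )
}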
Edit:
If you make FooPlugin and BarPlugin triggered plugins (enabled automatically), you could write:
lazy val root = (project in file("."))
lazy val foo, bar = project
See Plugins for the details.
I want to clarify that you can choose the degree of magic/automation when it comes to auto plugins. In the above, both FooPlugin and BarPlugin are triggered automatically, but you could just as easily make BarPlugin triggered by FooPlugin, or not triggered at all. This is meant to automate things like "use the CoffeeScript plugin for all web-related projects". I hope I'm still on-topic with your question "I want to group settings by the context."