SBT: Custom commands aggregation - scala

I'm looking for a way to create a custom command in sbt for a multi-project build such that:
The command shouldn't do anything for the root project.
The command should perform some action in each subproject.
Running the command on the root project should trigger its execution in all subprojects.
I have this in my build.sbt:
lazy val root = (project in file("."))
  .settings(
    name := "Custom command",
    commands += Command.command("customCommand") { state =>
      state
    }
  )
  .aggregate(a, b)

lazy val a = project
  .settings(
    commands += Command.command("customCommand") { state =>
      println("Hello")
      state
    }
  )

lazy val b = project
  .settings(
    commands += Command.command("customCommand") { state =>
      println("Hey there")
      state
    }
  )
Unfortunately, I get nothing in the sbt shell:
sbt:Custom command> customCommand
sbt:Custom command>
The expected output is:
sbt:Custom command> customCommand
Hello
Hey there
sbt:Custom command>
The order doesn't matter.
So it looks like my command is executed for the root project only. Is there any way to tell sbt to execute customCommand for the subprojects first?

Most of the time you don't really need a command and can do everything you want with a task. You can define a task key, override it in your subprojects and the aggregating project will do what you expect (aggregate):
lazy val customTask = taskKey[Unit]("My custom task")

lazy val root = (project in file("."))
  .aggregate(a, b)

lazy val a = project
  .settings(
    customTask := { println("Hello") }
  )

lazy val b = project
  .settings(
    customTask := { println("Hey there") }
  )
Here's how it works:
> a/customTask
Hello
[success] Total time: 0 s, completed Apr 5, 2018 9:31:43 PM
> b/customTask
Hey there
[success] Total time: 0 s, completed Apr 5, 2018 9:31:47 PM
> customTask
Hey there
Hello
[success] Total time: 0 s, completed Apr 5, 2018 9:31:50 PM
Check the sbt documentation on Commands: you need a command only if you have to manipulate the build state explicitly (which is hard to do carefully), call other existing commands, or modify settings (which you normally shouldn't):
Typically, you would resort to a command when you need to do something that’s impossible in a regular task.
If you still need a command, I think you cannot override it in subprojects and aggregate it like a task, but you can probably define it once and, inside it, dispatch on the project it is called from, as sketched below. I really doubt you need it, though.
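If you do go down that road, here is a minimal (untested) sketch of that dispatching idea, assuming the project ids a and b from the question; it would sit in the root project's settings:

commands += Command.command("customCommand") { state =>
  // Commands are global, so dispatch on the project the shell is currently in
  Project.extract(state).currentRef.project match {
    case "a" => println("Hello")
    case "b" => println("Hey there")
    case _   => () // do nothing for the root project
  }
  state
}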

Related

sbt: generating shared sources in cross-platform project

Building my Scala project with sbt, I want to have a task that will run prior to actual Scala compilation and will generate a Version.scala file with project version information. Here's a task I've come up with:
lazy val generateVersionTask = Def.task {
  // Generate contents of Version.scala
  val contents = s"""package io.kaitai.struct
                    |
                    |object Version {
                    |  val name = "${name.value}"
                    |  val version = "${version.value}"
                    |}
                    |""".stripMargin

  // Update Version.scala file, if needed
  val file = (sourceManaged in Compile).value / "version" / "Version.scala"
  println(s"Version file generated: $file")
  IO.write(file, contents)
  Seq(file)
}
This task seems to work, but the problem is how to plug it in, given that it's a cross project, targeting Scala/JVM, Scala/JS, etc.
This is how build.sbt looked before I started touching it:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
and, on the filesystem, I have:
shared/ — cross-platform code shared between JS/JVM builds
jvm/ — JVM-specific code
js/ — JS-specific code
The best I've come up with so far is adding this task to the foo crossProject:
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    sourceGenerators in Compile += generateVersionTask.taskValue, // <== !
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)
This works, but in a very awkward way that is not really compatible with the "shared" codebase. It generates two distinct Version.scala files, one for JS and one for JVM:
sbt:root> compile
Version file generated: /foo/js/target/scala-2.12/src_managed/main/version/Version.scala
Version file generated: /foo/jvm/target/scala-2.12/src_managed/main/version/Version.scala
Naturally, it's impossible to access the contents of these files from shared, and that is exactly where I want to access them.
So far, I've come up with a very sloppy workaround:
There is a var declared in a singleton object in shared.
In both the JVM and JS main entry points, the very first thing I do is assign that variable to match the constants defined in Version.scala.
Also, I've tried the same trick with the sbt-buildinfo plugin; the result is exactly the same: it generates a per-platform BuildInfo.scala, which I can't use directly from the shared sources.
Are there any better solutions available?
Consider pointing sourceManaged to the shared/src/main/scala/src_managed directory and scoping generateVersionTask to the root project, like so:
val sharedSourceManaged = Def.setting(
  baseDirectory.value / "shared" / "src" / "main" / "scala" / "src_managed"
)

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {},
    sourceManaged := sharedSourceManaged.value,
    sourceGenerators in Compile += generateVersionTask.taskValue,
    cleanFiles += sharedSourceManaged.value
  )
Now sbt compile should output something like
Version file generated: /Users/mario/IdeaProjects/scalajs-cross-compile-example/shared/src/main/scala/src_managed/version/Version.scala
...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/js/target/scala-2.12/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/scalajs-cross-compile-example/target/scala-2.12/classes ...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/jvm/target/scala-2.12/classes ...
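Once the file is generated under shared/, the shared sources can reference it like any other code. A minimal, hypothetical use site (assuming the io.kaitai.struct package from the generator above):

// shared/src/main/scala/io/kaitai/struct/Banner.scala (hypothetical example)
package io.kaitai.struct

object Banner {
  // Version is the object generated into shared/src/main/scala/src_managed
  def banner: String = s"${Version.name} ${Version.version}"
}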

Per-project tasks in SBT

My .sbt file looks something like this:
lazy val common = (project in file("./common"))
  .settings(
    // other settings
  )

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    // other settings
    addCommandAlias("build", ";clean;assembly;foo")
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    // other settings
    addCommandAlias("build", ";clean;compile;bar")
  )
Additionally, I have two tasks, foo and bar, which are only valid in their respective projects.
My tests show that upon calling build - no matter which project I am in - both aliases are being called.
And for tasks, the keys can only be defined at the top level of the .sbt file anyway (e.g. val foo = taskKey[Unit]("Does foo")).
I want to know how to correctly implement tasks and command aliases at the project level.
Is that possible?
The problem you are having is with how aliases work in sbt. When an alias is defined, it is attached to the GlobalScope in the form of a command and is therefore available for all subprojects. When you define the same alias multiple times with addCommandAlias, the last definition wins, as each one removes any previously created alias with the same name.
You can see the defined aliases by running sbt alias; it will show that there is only one build alias.
You could achieve the separation you want by introducing build as a taskKey:
lazy val build = taskKey[Unit]("Builds")

lazy val root = (project in file("."))
  .aggregate(one, two) // THIS IS NEEDED TO MAKE THE build TASK AVAILABLE IN ROOT

lazy val common = (project in file("./common"))
  .settings(
    // SOME KEYS
  )

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    build := {
      Def.sequential(clean, Compile / compile).value
    }
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    build := {
      Def.sequential(clean, assembly).value
    }
  )
EDIT: Added Def.sequential as suggested by @laughedelic in the comments.
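The question's own foo and bar tasks slot into the same scheme: declare their keys once at the top level (as you noted), implement each one only in the project where it makes sense, and append it to that project's Def.sequential chain. A rough sketch for project one (assuming foo simply prints something and assembly comes from sbt-assembly, as in the question); two would get bar analogously:

lazy val foo = taskKey[Unit]("Does foo") // declared once, at the top level of build.sbt

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    foo := { println("running foo for project one") }, // only implemented here
    build := Def.sequential(clean, assembly, foo).value
  )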

Sbt: How to define task for all projects?

I would like to be able to define a task for all projects in my build.sbt:
lazy val project1 = project.in(file("project1"))
...
lazy val project2 = ...
...
lazy val upload = taskKey[Unit]("upload a config file from project to server")

upload := {
  val file = baseDirectory.value / "config.json"
  ...
}
The problem is that this definition works only when I call sbt upload, but I would like to be able to call it for each subproject: sbt project1/upload and sbt project2/upload.
Is there a way to do it, without using inputKey?
See Organizing the build:
For more advanced users, another way of organizing your build is to define one-off auto plugins in project/*.scala. By defining triggered plugins, auto plugins can be used as a convenient way to inject custom tasks and commands across all subprojects.
project/UploadPlugin.scala
package something

import sbt._
import Keys._

object UploadPlugin extends AutoPlugin {
  override def requires = sbt.plugins.JvmPlugin
  override def trigger = allRequirements

  object autoImport {
    val upload = taskKey[Unit]("upload a config file from project to server")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    upload := {
      val n = name.value
      println(s"uploading $n..")
    }
  )
}
Here's how you can use it:
build.sbt
ThisBuild / organization := "com.example"
ThisBuild / scalaVersion := "2.12.5"
ThisBuild / version      := "0.1.0-SNAPSHOT"

lazy val root = (project in file("."))
  .aggregate(project1, project2)
  .settings(
    name := "Hello"
  )

lazy val project1 = (project in file("project1"))
lazy val project2 = (project in file("project2"))
build.sbt does not have to mention anything about UploadPlugin, since it's a triggered plugin. From the shell you can call:
sbt:Hello> project1/upload
uploading project1..
[success] Total time: 0 s, completed Jul 20, 2018
sbt:Hello> project2/upload
uploading project2..
[success] Total time: 0 s, completed Jul 20, 2018
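As a side note, if you'd rather not have upload defined on the aggregating root project, a triggered plugin can still be opted out per project with disablePlugins. A small sketch reusing the root definition above:

lazy val root = (project in file("."))
  .aggregate(project1, project2)
  .disablePlugins(something.UploadPlugin) // root opts out; project1 and project2 keep the task
  .settings(
    name := "Hello"
  )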
You can add the task as a setting of the project you want:
lazy val uploadTask = {
  lazy val upload = taskKey[Unit]("upload a config file from project to server")
  upload := {
    val file = baseDirectory.value / "config.json"
    ...
  }
}

project.in(file("project1")).settings(uploadTask)

SBT plugin -- execute custom task before compilation

I've just written my first SBT AutoPlugin, which has a custom task that generates a settings file (if the file is not already present). Everything works as expected when the task is explicitly invoked, but I'd like to have it automatically invoked prior to compilation of the project using the plugin (without having the project modify its build.sbt file). Is there a way of accomplishing this, or do I somehow need to override the compile command? If so, could anyone point me to examples of doing so? Any help would be extremely appreciated! (My apologies if I'm missing something simple!) Thanks!
You can define dependencies between tasks with dependsOn and override the behavior of a scoped task (like compile in Compile) by reassigning it.
The following lines added to a build.sbt file could serve as an example:
lazy val hello = taskKey[Unit]("says hello to everybody :)")
hello := { println("hello, world") }
(compile in Compile) := ((compile in Compile) dependsOn hello).value
Now, every time you run compile, hello, world will be printed:
[IJ]sbt:foo> compile
hello, world
[success] Total time: 0 s, completed May 18, 2018 6:53:05 PM
This example has been tested with SBT 1.1.5 and Scala 2.12.6.
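On sbt 1.x the same wiring can also be written with the slash syntax; an equivalent sketch, assuming the same hello task:

Compile / compile := ((Compile / compile) dependsOn hello).value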
// Another variant: run a shell script before compile (or test) via a custom task.
// Note: on sbt 1.x you may need `import scala.sys.process._` for Process.
lazy val runSomeShTask = TaskKey[Unit]("runSomeSh", "run some sh")

runSomeShTask := {
  val s: TaskStreams = streams.value
  val shell: Seq[String] =
    if (sys.props("os.name").contains("Windows")) Seq("cmd", "/c") else Seq("bash", "-c")
  // watch out for STDOUT/STDERR redirection, otherwise this one will hang after sbt test ...
  val startMinioSh: Seq[String] = shell :+ " ./src/sh/some-script.sh"
  s.log.info("set up run some sh...")
  if (Process(startMinioSh.mkString(" ")).! == 0) {
    s.log.success("run some sh setup successful!")
  } else {
    throw new IllegalStateException("run some sh setup failed!")
  }
}

// or only in sbt test
// test := (test in Test dependsOn runSomeShTask).value
(compile in Compile) := ((compile in Compile) dependsOn runSomeShTask).value

SBT: How to make one task depend on another in multi-project builds, and not run in the root project?

For my multi-project build, I'm trying to create a verify task that just results in scct:test and then scalastyle being executed in order. I would like scct:test to execute for all the subprojects, but not the top-level project. (If it executes for the top-level project, I get "timed out waiting for coverage report" from scct, since there's no source and no tests in that project.) What I had thought to do was to create verify as a task with dependencies on scct:test and scalastyle. This has turned out to be fairly baroque. Here is my Build.scala from my top-level project/ directory:
object MyBuild extends Build {
  val verifyTask = TaskKey[Unit]("verify", "Compiles, runs tests via scct:test and then runs scalastyle")

  val scctTestTask = (test in ScctPlugin.Scct).scopedKey
  val scalastyleTask = PluginKeys.scalastyleTarget.scopedKey

  lazy val root = Project("rootProject",
    file("."),
    settings = Defaults.defaultSettings ++
      ScalastylePlugin.Settings ++
      ScctPlugin.instrumentSettings ++
      ScctPlugin.mergeReportSettings ++
      Seq(
        verifyTask in Global := {},
        verifyTask <<= verifyTask.dependsOn(scctTestTask, scalastyleTask)
      )
  ) aggregate(subproject_1, subproject_2)

  lazy val subproject_1 = Project(id = "subproject_1", base = file("subproject_1"))
  lazy val subproject_2 = Project(id = "subproject_2", base = file("subproject_2"))
}
However, the verify task only seems to exist for the root project; when I run it I don't see the same task being run in the subprojects. This is exactly the opposite of what I want; I'd like to issue sbt verify and have scct:test and scalastyle run in each of the subprojects but not in the top-level project. How might I go about doing that?
solution 1: define verifyTask in subprojects
The first thing to note is that if you want some task (verify, test, etc.) to run in certain projects, you need to define it scoped to those subprojects. So in your case, the most straightforward thing is to define verifyTask in subproject_1 and subproject_2.
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.4"

lazy val verify = taskKey[Unit]("verify")

def verifySettings = Seq(
  skip in verify := false,
  verify := (Def.taskDyn {
    val sk = (skip in verify).value
    if (sk) Def.task { println("skipping verify...") }
    else (test in Test)
  }).value
)

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    verifySettings,
    scalaVersion in ThisBuild := "2.12.4",
    skip in verify := true
  )

lazy val sub1 = (project in file("sub1"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )

lazy val sub2 = (project in file("sub2"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )
solution 2: ScopeFilter
There was a recent Reddit thread that mentioned this question, so I'll post what I've done there.
If you want to manually aggregate on some subprojects, there's also a technique called ScopeFilter.
Note that I am using sbt 1.x here, but it should work with sbt 0.13 with some minor changes.
lazy val packageAll = taskKey[Unit]("package all the projects")
lazy val myTask = inputKey[Unit]("foo")

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    scalaVersion in ThisBuild := "2.12.4",
    packageAll := {
      (packageBin in Compile).all(nonRootsFilter).value
      ()
    },
    myTask := {
      packageAll.value
    }
  )

lazy val sub1 = (project in file("sub1"))
lazy val sub2 = (project in file("sub2"))

def nonRootsFilter = {
  import sbt.internal.inc.ReflectUtilities
  def nonRoots: List[ProjectReference] =
    allProjects filter {
      case LocalProject(p) => p != "root"
      case _               => false
    }
  def allProjects: List[ProjectReference] =
    ReflectUtilities.allVals[Project](this).values.toList map { p =>
      p: ProjectReference
    }
  ScopeFilter(inProjects(nonRoots: _*), inAnyConfiguration)
}
In the above, myTask depends on packageAll, which aggregates (packageBin in Compile) for all non-root subprojects.
sbt:root> myTask
[info] Packaging /Users/xxx/packageall/sub1/target/scala-2.12/sub1_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /Users/xxx/packageall/sub2/target/scala-2.12/sub2_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 0 s, completed Feb 2, 2018 7:23:23 PM
I may be wrong, but you are defining the verify task dependency only for the current project.
Maybe you can try:
Seq(
  verifyTask in Global := {},
  verifyTask <<= (verifyTask in Global).dependsOn(scctTestTask, scalastyleTask)
)
Or you can add the verifyTask settings to all your modules.