I want to define a compound task in sbt so that all tasks that are run in my CI job can be executed in a single command. For example, at the moment I am running:
clean coverage test scalastyle coverageReport package
However I'd like to just run
ci
Which would effectively be an alias for all of the above tasks. Furthermore, I'd like to define this in a Scala file (as opposed to build.sbt) so I can include it in an already existing common Scala plugin and thus make it available to all my projects.
So far (after much reading of the docs) I've managed to get a task that depends just on scalastyle by doing:
lazy val ci = inputKey[Unit]("Prints 'Runs All tasks for CI")
ci := {
  val scalastyleResult = (scalastyle in Compile).evaluated
  println("In the CI task")
}
However, if I attempt to add another task (say the publish task), e.g.:
ci := {
  val scalastyleResult = (scalastyle in Compile).evaluated
  val publishResult = (publish in Compile).evaluated
  println("In the CI task")
}
this fails with:
[error] [build.sbt]:52: illegal start of simple expression
[error] [build.sbt]:55: ')' expected but '}' found.
My first question is whether this approach is indeed the correct way to define a compound task.
If this is the case, then how can I make the ci task depend on all the tasks mentioned?
Put a blank line between the statements. Instead of

lazy val ci = inputKey[Unit]("Prints 'Runs All tasks for CI")
ci := {

write

lazy val ci = inputKey[Unit]("Prints 'Runs All tasks for CI")

ci := {
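For the second part of the question, here is a sketch of a ci task that pulls in several other tasks. It assumes scalastyle comes from the scalastyle plugin as an input key, while publish is an ordinary task key, so it is referenced with .value rather than .evaluated:

lazy val ci = inputKey[Unit]("Runs all tasks for CI")

ci := {
  // input tasks are wired in with .evaluated
  val scalastyleResult = (scalastyle in Compile).evaluated
  // plain tasks are wired in with .value
  val publishResult = publish.value
  println("In the CI task")
}

Keep in mind that each .value/.evaluated reference only declares a dependency; sbt decides the execution order, which is what the next point is about.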
Also, be aware that sbt will run the tasks that ci depends on in parallel. Sometimes this is what you want, but not always, for example with clean.
There are several ways to run tasks in sequence.
One way:
commands += Command.command("ci") {
  "clean" ::
  "coverage" ::
  "test" ::
  "scalastyle" ::
  "coverageReport" ::
  _
}
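If a plain alias is enough, sbt's built-in addCommandAlias achieves the same with less ceremony; a minimal sketch using the task names from the question:

addCommandAlias("ci", ";clean;coverage;test;scalastyle;coverageReport;package")

Since addCommandAlias just returns a sequence of settings, a shared plugin can return it from its settings so that every project picks up the ci alias.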
Related
I know, I saw Run custom task automatically before/after standard task but it seems outdated. I also found SBT before/after hooks for a task but it does not have any code example.
I am on SBT 0.13.17.
So I want to run my tasks MyBeforeTask and MyAfterTask automatically before and after another task, say compile.
So when I run sbt compile I would like to see:
...log...
This is my before test text
...compile log...
This is my after test text
So I would need to have:
object MyPlugin extends AutoPlugin {

  object autoImport {
    val MyBeforeTask = taskKey[Unit]("desc...")
    val MyAfterTask = taskKey[Unit]("desc...")
  }

  import autoImport._

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    MyBeforeTask := {
      println("This is my before test text")
    },
    MyAfterTask := {
      println("This is my after test text")
    }
  )
}
So I think I need things like dependsOn and in but I am not sure how to set them up.
It is not possible to configure a particular task to run after a given task, because that's not how the task dependency model works: when you run a task, its dependencies are executed and then the task itself, but there is no way to declare an "after" dependency. However, you can simulate that with dynamic tasks.
To run some task before another, you can use dependsOn:
compile in Compile := (compile in Compile).dependsOn(myBeforeTask).value
This establishes a dependency between two tasks, which ensures that myBeforeTask will be run before compile in Compile.
Note that there is a more generic way to make multiple tasks run one after another:
aggregateTask := Def.sequential(task1, task2, task3, task4).value
Def.sequential relies on the dynamic tasks machinery, which sets up dependencies between tasks at runtime. However, there are some limitations to this mechanism, in particular, you cannot reference the task being defined in the list of tasks to execute, so you can't use Def.sequential to augment existing tasks:
compile in Compile := Def.sequential(myBeforeTask, compile in Compile).value
This definition will fail at runtime with a strange error message which basically means that there is a loop in your task dependency graph. For other use cases, though, Def.sequential is extremely useful.
To run some task after another, however, you have to resort to defining a dynamic task dependency using Def.taskDyn:
compile in Compile := Def.taskDyn {
  val result = (compile in Compile).value
  Def.task {
    val _ = myAfterTask.value
    result
  }
}.value
Def.taskDyn accepts a block which must return a Def.Initialize[Task[T]], which will be used to instantiate a task to be run later, after the main body of Def.taskDyn completes. This allows one to compute tasks dynamically, and establish dependencies between tasks at runtime. As I said above, however, this can result in very strange errors happening at runtime, which are usually caused by loops in the dependency graph.
Therefore, the full example, with both "before" and "after" tasks, would look like this:
compile in Compile := Def.taskDyn {
  val result = (compile in Compile).value
  Def.task {
    val _ = myAfterTask.value
    result
  }
}.dependsOn(myBeforeTask).value
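Put together inside the plugin from the question, the wiring could look roughly like this (a sketch; the key names and messages are the ones from the question):

import sbt._
import sbt.Keys._

object MyPlugin extends AutoPlugin {

  object autoImport {
    val MyBeforeTask = taskKey[Unit]("Runs before compile")
    val MyAfterTask = taskKey[Unit]("Runs after compile")
  }

  import autoImport._

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    MyBeforeTask := println("This is my before test text"),
    MyAfterTask := println("This is my after test text"),
    // MyBeforeTask runs before compile via dependsOn,
    // MyAfterTask runs after it via the dynamic task
    compile in Compile := Def.taskDyn {
      val result = (compile in Compile).value
      Def.task {
        val _ = MyAfterTask.value
        result
      }
    }.dependsOn(MyBeforeTask).value
  )
}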
I found the sbt recipe for parameters and Build Environment.
I would now like to be able to change buildEnv while running sbt. Basically,
I can't manage to find a programmatic solution for:
> set every buildEnv := BuildEnvPlugin.autoImport.BuildEnv.Development
or for running BuiltinCommands.set from a wrapping command.
My basic solution doesn't scale to sub-/aggregated projects:
val devCmd = Command.command("dev") { state =>
  Project extract state appendWithSession (Seq(buildEnv := BuildEnv.Development), state)
}
How can I change all aggregated settings as well?
I just didn't find this simple solution initially:
override def projectSettings: Seq[Setting[_]] = Seq(commands += devCmd)

lazy val devCmd = BasicCommands
  .newAlias("dev", "set every buildEnv := BuildEnvPlugin.autoImport.BuildEnv.Development")
I start some docker containers before executing run to start my play-framework project:
run in Compile := (run in Compile dependsOn(dockerComposeUp)).evaluated
Now I'd like to tear down all docker containers using dockerComposeDown when Play stops. Any ideas on how to accomplish this?
I've already gone through Doing something after an input task, but that starts the containers and immediately stops them again. (In fact it even stops the containers before starting them.) Here is what I tried:
run in Compile := {
  (run in Compile dependsOn(dockerComposeUp)).evaluated
  dockerComposeDown.value
}
A different approach is to run your docker task sequentially with the run task. You could achieve this as described below:
lazy val testPrint = taskKey[Unit]("showTime")

testPrint := {
  println("Test print.")
}

lazy val testRun = taskKey[Unit]("test build")

testRun := {
  Def.sequential((runMain in Compile).toTask(" com.mycompany.MainClass "), testPrint).value
}
First define the testPrint task, which in your case could be your docker task, and then define testRun, which runs both tasks sequentially. To run this just do sbt testRun. After execution it should print "Test print."
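Adapted to the question, a sketch could chain the compose tasks around run. This assumes dockerComposeUp and dockerComposeDown are the task keys already used above and that run needs no arguments:

lazy val runWithDocker = taskKey[Unit]("Starts containers, runs the app, then tears the containers down")

runWithDocker := Def.sequential(
  dockerComposeUp,              // start the containers first
  (run in Compile).toTask(""),  // run the Play app; this returns when the app stops
  dockerComposeDown             // finally tear the containers down
).value

With this, sbt runWithDocker should only call dockerComposeDown once run has returned.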
I have a custom plugin that generates some resource artifacts in a task genWro. My plugin also sets:
resourceGenerators in Compile <+= genWro in Compile
that takes care of creating the resources during packaging.
I then want to take the resulting Seq[java.io.File] that resourceGenerators returns and upload the files to S3 during sbt-native-packager's stage task.
I'd actually like to only generate these resources and upload them during stage, and not during packaging - but one thing at a time. Basically, I only want to generate and upload the files during my production build which calls sbt clean stage and skip it during local development which I only need to call sbt run.
Given the comment where you said "My main concern is how to get the Seq[File] result from the task into a custom task that uploads those files to S3, and to have that custom task called only if stage is invoked.", the answer could be as follows.
You've got the stage task that comes from the sbt-native-packager plugin.
> help stage
Create a local directory with all the files laid out as they would be in the final distribution.
You also have another task, say s3deploy, that transfers the generated resources to S3.
lazy val s3deploy = taskKey[Unit]("Deploys files to S3")

s3deploy := {
  // resourceGenerators is just a setting holding the generator tasks;
  // managedResources runs them and yields the generated Seq[File]
  println(s"Files to transfer: ${(managedResources in Compile).value}")
}
And here comes a solution - wire s3deploy to stage:
(stage in Universal) := {
  val _ = s3deploy.value
  (stage in Universal).value
}
The entire build.sbt follows:
import com.typesafe.sbt.packager.Keys._

packageArchetype.java_application

lazy val s3deploy = taskKey[Unit]("Deploys files to S3")

s3deploy := {
  println(s"Files to transfer: ${(managedResources in Compile).value}")
}

(stage in Universal) := {
  val _ = s3deploy.value
  (stage in Universal).value
}
I have a subproject named oppenheimer in my project. It's very simple to run this project from the sbt console.
[myproject] $ oppenheimer/run
I can also pass in a command line argument as such:
[myproject] $ oppenheimer/run migrate
[myproject] $ oppenheimer/run clean
How can I do this from build.sbt? Is it possible to define a task that does this? It would suffice to have something like this:
val customMigrate = ...
val customClean = ...
And this is so that I could use it elsewhere in the project, like such:
(test in Test) <<= (test in Test).dependsOn(customMigrate)
The answer is given in the sbt FAQ section "How can I create a custom run task, in addition to run?". Basically:
lazy val customMigrate = taskKey[Unit]("custom run task")
fullRunTask(customMigrate, Test, "foo.Main", "migrate")
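A sketch of how this could be wired up for the oppenheimer subproject ("foo.Main" is the placeholder from the FAQ; substitute oppenheimer's actual main class). The Compile configuration is used so the tasks run against oppenheimer's main classpath, just like oppenheimer/run, and the dependsOn wiring from the question is shown in the non-deprecated := form:

lazy val customMigrate = taskKey[Unit]("Runs oppenheimer with the 'migrate' argument")
lazy val customClean = taskKey[Unit]("Runs oppenheimer with the 'clean' argument")

lazy val oppenheimer = (project in file("oppenheimer"))
  .settings(
    fullRunTask(customMigrate, Compile, "foo.Main", "migrate"),
    fullRunTask(customClean, Compile, "foo.Main", "clean")
  )

// in the root project's settings:
// test in Test := ((test in Test) dependsOn (customMigrate in oppenheimer)).value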