How to override SettingKey for computing another SettingKey in sbt?

I want to override the value of a SettingKey b only when computing SettingKey a1.
import sbt._
import sbt.Keys._
object Build extends Build {
  val a1Key = SettingKey[String]("a1", "")
  val a2Key = SettingKey[String]("a2", "")
  val bKey = SettingKey[String]("b", "")

  lazy val rootProject = Project("P", file(".")).settings(
    bKey := "XXX",
    a1Key <<= bKey(x => ">>>" + x + "<<<"),
    a2Key <<= bKey(x => ">>>" + x + "<<<")
  ).settings(
    bKey in a1Key := "YYY" // providing a custom value in setting scope
  )
}
The current result is:
> a1
[info] >>>XXX<<<
> a2
[info] >>>XXX<<<
> b
[info] XXX
...but I'm aiming at seeing YYY as the value of a1:
> a1
[info] >>>YYY<<<
> a2
[info] >>>XXX<<<
> b
[info] XXX
A better real-world example than the above: sometimes you want to add certain resources to your build only in the runtime configuration, and different resources when the application is packaged. For example, when building a GWT app, the public resources served by the server during development mode differ from those used in production. It would be nice to be able to customize the resource-directories setting separately for the run and package tasks.

You need to set a1Key and a2Key to allow for bKey to be overridden in the first place:
lazy val rootProject = Project("P", file(".")).settings(
  bKey := "Fnord",
  a1Key <<= (bKey in a1Key)(x => ">>>" + x + "<<<"),
  a2Key <<= (bKey in a2Key)(x => ">>>" + x + "<<<")
).settings(
  bKey in a1Key := "Meep"
)
That way, computing a1Key will use the more specific value Meep, while when computing a2Key, sbt "looks for" a definition of bKey in a2Key, doesn't "find" one, and falls back to the more general bKey (in the default scope), which is defined and therefore used.
Edit: unfortunately this means that unless whoever provides the definitions of the a1Key and a2Key settings also explicitly provides the required extension points (in the form of setting-scoped dependencies), you cannot override the dependencies. That is at least how I understand it.
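Applied to the resource-directories example from the question, the same extension-point pattern would look roughly like this (a sketch only; the keys myResources, devServe, productionPackage and the directory names are made up for illustration):
// hypothetical keys, for illustration only
val myResources = settingKey[File]("resource directory to serve or package")
val devServe = taskKey[Unit]("serve resources in development mode")
val productionPackage = taskKey[Unit]("package resources for production")

lazy val gwtApp = Project("gwt-app", file(".")).settings(
  // the general value, used wherever no task-scoped override exists
  myResources := baseDirectory.value / "resources",
  // each consumer reads the key scoped to itself (these are the extension points)
  devServe := println("serving " + (myResources in devServe).value),
  productionPackage := println("packaging " + (myResources in productionPackage).value),
  // override only for development mode; productionPackage falls back to the general value
  myResources in devServe := baseDirectory.value / "dev-resources"
)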

Related

sbt: generating shared sources in cross-platform project

Building my project in Scala with sbt, I want a task that will run prior to the actual Scala compilation and generate a Version.scala file with project version information. Here's the task I've come up with:
lazy val generateVersionTask = Def.task {
  // Generate contents of Version.scala
  val contents = s"""package io.kaitai.struct
                    |
                    |object Version {
                    | val name = "${name.value}"
                    | val version = "${version.value}"
                    |}
                    |""".stripMargin

  // Update Version.scala file, if needed
  val file = (sourceManaged in Compile).value / "version" / "Version.scala"
  println(s"Version file generated: $file")
  IO.write(file, contents)
  Seq(file)
}
This task seems to work, but the problem is how to plug it in, given that it's a cross project, targeting Scala/JVM, Scala/JS, etc.
This is how build.sbt looked before I started touching it:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
and, on the filesystem, I have:
shared/ — cross-platform code shared between JS/JVM builds
jvm/ — JVM-specific code
js/ — JS-specific code
The best I've come up with so far is adding this task to the foo crossProject:
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    sourceGenerators in Compile += generateVersionTask.taskValue, // <== !
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)
This works, but in a very awkward way that is not really compatible with the shared codebase: it generates two distinct Version.scala files, one for JS and one for JVM:
sbt:root> compile
Version file generated: /foo/js/target/scala-2.12/src_managed/main/version/Version.scala
Version file generated: /foo/jvm/target/scala-2.12/src_managed/main/version/Version.scala
Naturally, it's impossible to access the contents of these files from shared, and that is exactly where I want to access them.
So far, I've come up with a very sloppy workaround (roughly as sketched below):
a var is declared in a singleton object in shared;
in both the JVM and JS main entry points, the very first thing I do is assign that variable the constants defined in Version.scala.
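A minimal sketch of that workaround (names hypothetical):
// in shared: a mutable holder, because shared code cannot see the generated Version.scala
object VersionInfo {
  var version: String = "unknown"
}

// in the JVM entry point (and analogously in the JS one):
object Main {
  def main(args: Array[String]): Unit = {
    // copy the generated constant into the shared holder before anything else runs
    VersionInfo.version = io.kaitai.struct.Version.version
    // ... the rest of the program, including shared code, reads VersionInfo.version
  }
}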
Also, I've tried the same trick with the sbt-buildinfo plugin; the result is exactly the same: it generates a per-platform BuildInfo.scala, which I can't use directly from shared sources.
Are there any better solutions available?
Consider pointing sourceManaged to the shared/src/main/scala/src_managed directory and scoping generateVersionTask to the root project, like so:
val sharedSourceManaged = Def.setting(
  baseDirectory.value / "shared" / "src" / "main" / "scala" / "src_managed"
)

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {},
    sourceManaged := sharedSourceManaged.value,
    sourceGenerators in Compile += generateVersionTask.taskValue,
    cleanFiles += sharedSourceManaged.value
  )
Now sbt compile should output something like
Version file generated: /Users/mario/IdeaProjects/scalajs-cross-compile-example/shared/src/main/scala/src_managed/version/Version.scala
...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/js/target/scala-2.12/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/scalajs-cross-compile-example/target/scala-2.12/classes ...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/jvm/target/scala-2.12/classes ...

get SBT settings from ModuleID

How can I use a ModuleID for a "sibling" project to access its setting keys?
I'm writing an SBT plugin for multi-module builds.
I have project A (which dependsOn B) and project B.
Both projects define my custom generate and mybuild tasks as setting keys.
The mybuild task consumes the value from generate - this works fine.
B doesn't depend upon anything, so B's mybuild only needs the key for B:generate and all is well.
I want A's mybuild to consume both A:generate and B:generate based on the fact that A dependsOn B in the build.sbt file.
The only promising keys I've found return the projects as ModuleID instances, so is there some way to get a list of setting keys from a ModuleID?
... or should I be doing this another way?
Solution (Kind of)
With @himos' help, this ...
(myTaskKey in myConfig) := {
  loadedBuild.value.allProjectRefs.find(_._1 == thisProjectRef.value).map(_._2) match {
    case Some(myCurrentProject) =>
      if (myCurrentProject.dependencies.nonEmpty)
        sys.error {
          myCurrentProject.dependencies
            .map { myDependsOnProject: ClasspathDep[ProjectRef] =>
              (myDependsOnProject.project / myConfig / myTaskKey).value
              // https://www.scala-sbt.org/0.13/docs/Tasks.html#Dynamic+Computations+with
            }
            .foldLeft("mine.dependencies:")(_ + "\n\t" + _)
        }
  }
}
... sort of works.
It causes an error that implies I've accessed the correct object, even if the SBT macros don't like it.
I think the ModuleID you mention relates to dependency management, not sub-projects.
To access a sub-project's setting/task keys, the project scope can be used:
(generate in A).value
(generate in B).value
A more comprehensive example:
name := "A"
version := "1.0"
scalaVersion := "2.12.5"
val generate = TaskKey[String]("generate")
val myBuild = TaskKey[String]("myBuild")
val a = (project in file(".")).settings(Seq(
generate := "A_generate"
))
val b = (project in file("proj_b")).settings(Seq(
generate := "B_generate",
myBuild := (generate in a).value + "_" + generate.value
)).dependsOn(a)
Sbt console output:
sbt:A> show b/myBuild
[info] A_generate_B_generate
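The question also asked to derive the list from dependsOn rather than naming a explicitly. One way to do that is a ScopeFilter over inDependencies (a sketch building on the example above; ScopeFilter and inDependencies are standard sbt API):
// in b's settings: collect generate from every project this one dependsOn,
// without naming the dependencies explicitly
myBuild := {
  val fromDeps: Seq[String] = generate.all(
    ScopeFilter(inDependencies(ThisProject, includeRoot = false))
  ).value
  (fromDeps :+ generate.value).mkString("_")
}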

SBT 0.13 Build.scala References to undefined settings

I am new to SBT and I have been trying to build a custom task for this build.
I have a simple build project:
import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val barTask = taskKey[Unit]("some simple task")

  val afterTestTask1 = barTask := { println("tests ran!") }
  val afterTestTask2 = barTask <<= barTask.dependsOn(test in Test)

  lazy val myBarTask = taskKey[Unit]("some simple task")
  //val afterMyBarTask1 = myBarTask := { println("tests ran!") }
  lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }

  //settings ++ Seq(afterMyBarTask2)
  override lazy val settings = super.settings ++ Seq(afterMyBarTask2)
}
I keep getting the error:
References to undefined settings:
{.}/*:myBarTask from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
Did you mean test:test ?
I have googled around and I cannot find a solution.
Can you explain why it is not working?
lazy val myBarTask = taskKey[Unit]("some simple task")
override lazy val settings = super.settings ++ Seq(myBarTask := { (test in Test).value; println("tests ran!") } )
myBarTask is undefined at the point where you call dependsOn; you should define it before using dependsOn. Also, calling .value on a key (task or setting) is now the preferred way to depend on other keys. You can still use your version, but define myBarTask first.
This had been bothering me, so I did a bit more reading, and I think I know why the above code does not work.
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
When I write (myBarTask).dependsOn(test in Test), the project scope for test is chosen by SBT as ThisBuild.
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
The ThisBuild project scope does not have the setting test in the configuration Test; only projects have the setting test. I think that setting is added to each project's settings by one of the default SBT plugins. You can check which scopes a setting exists in by using the inspect command.
If you type the following in the SBT REPL:
inspect {.}/test:test
the output is:
[info] No entry for key.
SBT correctly suggests test:test, which is:
{file:/C:/Users/haques/Documents/workspace/SBT/jsonParser/}jsonparser/test:test
If the project is not specified in the project scope axis, SBT chooses the current project by default. Every SBT project, unless specified otherwise, has its own project settings.
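The upshot: define the task in a concrete project's settings, where test:test actually exists, rather than in the build-level settings. A sketch:
import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val myBarTask = taskKey[Unit]("some simple task")

  // project-level settings: here test in Test exists and can be depended on
  lazy val jsonParser = Project("jsonParser", file(".")).settings(
    myBarTask := { (test in Test).value; println("tests ran!") }
  )
}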

sbt - basic local plugin setup?

I have a particular task I'd like to automate as part of a build process, but I'm stuck at the grammar stage with sbt. I'm trying to do a hello-world-ish task using two local projects, one the plugin and one a test using that plugin, but I can't get the plugin's new task (sampleIntTask) to be available when running sbt on the test project.
I have the following in the filesystem:
/plugin/
  Plugin.scala
  build.sbt
/test-using-plugin/
  build.sbt
  project/plugins.sbt
For my helloworld-ish plugin, in Plugin.scala:
import sbt._
import Keys._

object MyPlugin extends Plugin {
  val sampleIntTask = taskKey[Int]("sum 1 and 2")

  sampleIntTask := {
    val sum = 1 + 2
    println("sum: " + sum)
    sum
  }
}
in plugin/build.sbt:
sbtPlugin := true
name := "myPlugin"
version := "0.1"
scalaVersion := "2.10.3"
and for testing it: in test-using-plugin/build.sbt:
name := "test-test-test"
version := "0.1"
scalaVersion := "2.10.3"
and in test-using-plugin/project/plugins.sbt:
lazy val root = project.in(file(".")).dependsOn(testPlugin)

lazy val testPlugin = file("/Users/cap10/gitprojects/processing")
When I run sbt sampleIntTask in test-using-plugin/, I get:
[info] Set current project to test-using-plugin (in build ...)
> sampleIntTask
[error] Not a valid command: sampleIntTask
[error] Not a valid project ID: sampleIntTask
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: sampleIntTask (similar: compileInputs)
[error] sampleIntTask
[error] ^
I feel like this is about the right level of complexity for this test (define the plugin project config, define the plugin project behavior, define the test project config, add a dependency on the plugin project), but I'd be unsurprised if I'm totally off base on the grammar, as I can't make heads or tails of the sbt intro.
build.sbt
If you do not need to share the settings across multiple builds, you can just add your settings to test-using-plugin/custom.sbt:
val sampleIntTask = taskKey[Int]("sum 1 and 2")

sampleIntTask := {
  val sum = 1 + 2
  println("sum: " + sum)
  sum
}
and forget about the plugin.
Local plugin way
I haven't tested the other parts, but your Plugin.scala is wrong: the setting expression needs to be in a settings sequence:
import sbt._
import Keys._

object MyPlugin extends Plugin {
  val sampleIntTask = taskKey[Int]("sum 1 and 2")

  lazy val baseMyPluginSettings: Seq[sbt.Def.Setting[_]] = Seq(
    sampleIntTask := {
      val sum = 1 + 2
      println("sum: " + sum)
      sum
    }
  )

  lazy val myPluginSettings: Seq[sbt.Def.Setting[_]] = baseMyPluginSettings
}
And in your test-using-plugin/build.sbt add:
myPluginSettings
If you have to share settings across builds, you can make a plugin like this or put them in the global sbt file. The use of the global sbt file should be limited to user-specific settings and commands, so that's out. Personally, I would publish the plugin locally using publishLocal so that the build doesn't depend on a specific file path. You can then use the locally published plugin like any other plugin:
addSbtPlugin("com.example" % "myPlugin" % "0.1" changing())
By using "-SNAPSHOT" version or by calling changing(), sbt will check for the latest.

SBT artifact in custom task and scope ignored?

In SBT, if I have a task that is supposed to generate a zip/jar/war containing a bunch of files, I'd use the Defaults.packageTaskSettings method to set up that task. It'd look as follows:
object BuildDef extends Build {
  val makeThings = TaskKey[File]("make-things")

  val defaultMakeSettings = (baseDirectory) map { base => Seq(
    (base / "thingA") -> "thingy",
    (base / "thingB") -> "thingz"
  )}

  val project = Project("stuff", file("."))
    .settings(Defaults.packageTaskSettings(makeThings, defaultMakeSettings): _*)
    .settings(
      artifact in makeThings <<= moduleName { Artifact(_, "zip", "zip") }
    )
}
That works just fine, and generates stuff_2.9.2-0.1-SNAPSHOT.zip in the target folder.
Now I want to make an alternate version of the make-things task that runs in a different scope, e.g. runs proguard and then packages things slightly differently. I've added the following settings to the BuildDef object:
val Scope = config("scope")
val project = ...
.settings(...)
.settings(
Defaults.packageTaskSettings(makeThings in Scope, defaultMakeSettings): _*
)
.settings(
artifact in (Scope, makeThings) <<=
moduleName{ n => Artifact(n+".scoped", "zip", "zip") }
)
When I run scope:make-things, it seems to ignore that setting and use the old one:
> show scope:make-things
[info] ...\target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT.zip
Why is it ignoring my settings? I had hoped it would generate stuff.scoped_2.9.2-0.1-SNAPSHOT.zip instead.
For more info...
> show scope:make-things::artifact
[info] Artifact(stuff.scoped,zip,zip,None,List(),None,Map())
> show scope:make-things::artifact-path
[info] ...\target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT.zip
I realize that I could probably directly change artifactPath, but I am going off of what the xsbt-web-plugin does for its package-war task, and it doesn't touch the artifactPath. I'd like to do this the "right" way.
I ended up figuring this out almost as soon as I posted the question. The key was using the inConfig method to wrap the package settings, like this:
.settings(
  artifact in (Scope, makeThings) <<= moduleName { Artifact(_, "zip", "zip") }
)
.settings(
  inConfig(Scope) {
    Defaults.packageTaskSettings(makeThings, defaultMakeSettings)
  }: _*
)
I also discovered that the packageTaskSettings will modify my artifact by appending the name of the config, as long as I specify my artifact setting before the packageTaskSettings. Now I get an artifact path of
...target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT-scope.zip
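For reference, overriding the path directly instead would look something like this (a sketch only; it hard-codes the file name rather than going through the artifact machinery):
artifactPath in (Scope, makeThings) <<=
  (crossTarget, moduleName, scalaBinaryVersion, version) {
    (dir, name, sv, v) => dir / (name + ".scoped_" + sv + "-" + v + ".zip")
  }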