I am new to SBT and I have been trying to build a custom task for this build.
I have a simple build project:
import sbt._
import Keys._
object JsonBuild extends Build {
  lazy val barTask = taskKey[Unit]("some simple task")
  val afterTestTask1 = barTask := { println("tests ran!") }
  val afterTestTask2 = barTask <<= barTask.dependsOn(test in Test)

  lazy val myBarTask = taskKey[Unit]("some simple task")
  //val afterMyBarTask1 = myBarTask := { println("tests ran!") }
  lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }

  //settings ++ Seq(afterMyBarTask2)
  override lazy val settings = super.settings ++ Seq(afterMyBarTask2)
}
I keep getting the error:
References to undefined settings:
{.}/*:myBarTask from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
Did you mean test:test ?
I have googled around and I cannot find a solution.
Can you explain why it is not working?
lazy val myBarTask = taskKey[Unit]("some simple task")
override lazy val settings = super.settings ++ Seq(myBarTask := { (test in Test).value; println("tests ran!") } )
myBarTask is undefined when you call dependsOn: you should define it before using dependsOn. Also, calling .value on a key (task or setting) is now the preferred way to depend on other keys. You can still use your version, but define myBarTask first.
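For clarity, here is a minimal sketch of what the answer suggests (where exactly this definition should live is discussed below):

lazy val myBarTask = taskKey[Unit]("some simple task")

// in a project's settings: define the task with := and declare its
// dependency by calling .value on the other key
myBarTask := {
  (test in Test).value   // makes test:test run before the body
  println("tests ran!")
}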
This has been bothering me, so I did a bit more reading.
I think I now know why the above code does not work.
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
When I write (myBarTask).dependsOn(test in Test), the project scope for test is chosen by SBT as ThisBuild.
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
ThisBuild project scope does not have the setting test in configuration Test.
Only projects have the setting test.
I think that setting is added to each project's settings by one of SBT's default plugins.
You can check which scopes a setting exists in by using the inspect command.
If you type the following in the SBT REPL:
inspect {.}/test:test
the output is:
[info] No entry for key.
SBT correctly suggests:
test:test which is:
{file:/C:/Users/haques/Documents/workspace/SBT/jsonParser/}jsonparser/test:test
If the project is not specified in the project scope axis, SBT chooses the current project by default.
Every SBT project, even if you do not specify any settings yourself, gets its own project-level settings (such as test) by default.
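Putting the two parts together, here is a sketch (assuming the project id is jsonparser) that avoids both errors: the key is declared before it is used, and the task is attached to a project's settings so that test in Test resolves against that project instead of ThisBuild:

import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val myBarTask = taskKey[Unit]("some simple task")

  lazy val jsonParser = Project("jsonparser", file(".")).settings(
    myBarTask := { println("tests ran!") },
    // test in Test now resolves to jsonparser/test:test, which exists
    myBarTask <<= myBarTask.dependsOn(test in Test)
  )
}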
Related
My project has multiple subprojects, and I use sbt-tpolecat in this project.
I use a Java framework in my code. This framework uses fluent interfaces heavily, so I need to suppress many "discarded non-Unit value" warnings in my code.
sbt-tpolecat provides a lot of useful scalac options out of the box, and I just want to exclude the -Wvalue-discard scalac option for my use case.
The problem is that I have 4-5 subprojects in this project, and right now I need to add the following to every subproject's settings:
sub_project_name.settings(
  scalacOptions ~= (_.filterNot(Set("-Wvalue-discard")))
)

// or

sub_project_name.settings(valueDiscardSetting)

lazy val valueDiscardSetting =
  Seq(scalacOptions ~= (_.filterNot(Set("-Wvalue-discard"))))
Is there a way to exclude this option in all subprojects in a DRY way?
My current subproject hierarchy is similar to this:
App -> Frontend -> Common
    -> Backend  -> Common
Common settings val
There is a common practice of factoring out common settings in multi-project builds: define a sequence of common settings in a val and add them to each project. Fewer concepts to learn that way.
For example:
lazy val commonSettings = Seq(
  scalacOptions ~= (_.filterNot(Set("-Wvalue-discard"))),
  ...
)

lazy val util = (project in file("util")).settings(commonSettings)
lazy val core = (project in file("core")).settings(commonSettings)
Common settings auto plugin
Auto plugins can set settings for every project. Create the following small plugin under project/CommonSettingsPlugin.scala
object CommonSettingsPlugin extends AutoPlugin {
  override def requires = plugins.JvmPlugin
  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    scalacOptions ~= (_.filterNot(Set("-Wvalue-discard")))
  )
}
The override
override def requires = plugins.JvmPlugin
together with trigger = allRequirements should effectively enable the plugin for every project without having to call enablePlugins explicitly in build.sbt.
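For illustration, with that plugin in project/CommonSettingsPlugin.scala, a build.sbt along these lines (the subproject names are just examples) needs no per-project wiring at all:

// every subproject automatically picks up the filtered scalacOptions
lazy val common   = project in file("common")
lazy val frontend = (project in file("frontend")).dependsOn(common)
lazy val backend  = (project in file("backend")).dependsOn(common)
lazy val app      = (project in file(".")).aggregate(frontend, backend)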
Override settings with onLoad
onLoad happens at the end after all projects are built and loaded.
lazy val settingsAlreadyOverridden = SettingKey[Boolean]("settingsAlreadyOverridden", "Has overrideSettings command already run?")

settingsAlreadyOverridden := false

commands += Command.command("removeScalacOptions") { state =>
  // inside a command we cannot call .value, so read the setting via Extracted
  val extracted = Project.extract(state)
  if (extracted.get(settingsAlreadyOverridden)) {
    state
  } else {
    extracted.appendWithSession(
      Seq(
        settingsAlreadyOverridden := true,
        scalacOptions ~= (_.filterNot(Set("-Wvalue-discard")))
      ),
      state
    )
  }
}

onLoad in Global := (onLoad in Global).value andThen ("removeScalacOptions" :: _)
Also consider how they addressed the problem in community-build via removeScalacOptions.
I have the following project definition (simplified):
object B extends Build {
  lazy val root = (project in file("."))
    .aggregate(commons, processor)

  lazy val commons = (project in file("commons"))

  lazy val processor = (project in file("processor"))
    .enablePlugins(BuildInfoPlugin, BuildTag)
}
and the BuildTag plugin (also simplified to the issue at hand):
object BuildTag extends AutoPlugin {
  override def requires = BuildInfoPlugin

  override lazy val buildSettings = Seq(
    packageOptions in (Compile, packageBin) += {
      Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
    }
  )
}
when I load the project, I get an error like:
Reference to undefined setting:
{.}/compile:buildInfoPackage from {.}/compile:packageBin::packageOptions
It looks like sbt is trying to reference the setting outside of the scope where the plugin is using it. Why might that be and how can I fix it?
The problem here is not the multi-module nature of the build; it is also reproducible in a single-module project.
However, instead of
override lazy val buildSettings = ...
you need to use projectSettings so that the buildInfoPackage setting can be resolved.
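A sketch of the corrected plugin, with the same setting simply moved from buildSettings to projectSettings:

object BuildTag extends AutoPlugin {
  override def requires = BuildInfoPlugin

  // projectSettings are added to each project that enables the plugin, so
  // buildInfoPackage is resolved in the project scope where BuildInfoPlugin defines it
  override lazy val projectSettings = Seq(
    packageOptions in (Compile, packageBin) += {
      Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
    }
  )
}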
I'm trying to create a task in sbt that will output the full classpath of a custom Configuration, but I get an undefined setting error when sbt tries to load the project definition. I can't figure out which setting has to be defined:
import sbt.Keys._
import sbt._

object FoobarBuild extends Build {
  lazy val ZK = config("zk")

  lazy val fcp = TaskKey[String]("fcp", "create formatted classpath")

  lazy val fcpTask = fcp <<= (fullClasspath in ZK) map { cp =>
    println(cp.files.absString)
    cp.files.absString
  }

  lazy val project = Project("foobar", file(".")).
    configs(ZK).
    settings(
      name := "foobar",
      version := "1.0",
      scalaVersion := "2.11.7"
    ).
    settings(fcpTask)
}
Error:
[info] Loading project definition from foobar/project
Reference to undefined setting:
zk:fullClasspath from *:fcp (/Users/gaston/mesosphere/foobar/project/Build.scala:7)
zk:fullClasspath on the 7th line of this file is, of course, fullClasspath in ZK. It's undefined because it isn't set or inherited from any other config, I believe.
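One way to get zk:fullClasspath defined (a sketch, not verified against the original build) is to add sbt's standard per-configuration defaults to the ZK config with inConfig, optionally extending an existing configuration so it inherits its dependencies:

lazy val ZK = config("zk") extend Compile

lazy val project = Project("foobar", file(".")).
  configs(ZK).
  settings(
    name := "foobar",
    version := "1.0",
    scalaVersion := "2.11.7"
  ).
  // Defaults.configSettings wires up the classpath keys (including fullClasspath) for ZK
  settings(inConfig(ZK)(Defaults.configSettings): _*).
  settings(fcpTask)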
I hit a MissingRequirementError when I try to invoke scaladoc from within an sbt task.
Using any version of sbt 0.13.x, start with this build.sbt:
val scaladoc = taskKey[Unit]("run scaladoc")

scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  settings.usejavacp.value = true
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}
Then run sbt scaladoc, and behold (during makeUniverse):
[info] Set current project to test (in build file:...)
scala.reflect.internal.MissingRequirementError: object scala.annotation.Annotation in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
What is wrong here? I've already tried fork := true and different combinations of sbt/scala versions to no avail.
It seems you need to provide scala-library (and indeed, any other dependencies) directly to the DocFactory.
scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  val dependencyPaths = (update in Compile).value
    .select().map(_.absolutePath).mkString(java.io.File.pathSeparator)
  settings.classpath.append(dependencyPaths)
  settings.bootclasspath.append(dependencyPaths)
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}
I'm writing an SBT Plugin that adds a Command and would like users to be able to configure this Command by setting variables in their build.sbt. What is the simplest way to achieve this?
Here is a simplified example of what the plugin looks like:
import sbt.Keys._
import sbt._
object MyPlugin extends Plugin {
  override lazy val settings = Seq(commands += Command.args("mycommand", "myarg")(myCommand))

  def myCommand = (state: State, args: Seq[String]) => {
    // Logic for command...
    state
  }
}
I would like someone to be able to add the following to their build.sbt file:
newSetting := "light"
How do I make this available as a String variable from inside the myCommand Command above?
Take a look at the example here: http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
In this example, a task and setting are defined:
val newTask = TaskKey[Unit]("new-task")
val newSetting = SettingKey[String]("new-setting")
val newSettings = Seq(
  newSetting := "test",
  newTask <<= newSetting map { str => println(str) }
)
A user of your plugin could then provide their own value for the newSetting setting in their build.sbt:
newSetting := "light"
EDIT
Here's another example, closer to what you're going for:
Build.scala:
import sbt._
import Keys._

object HelloBuild extends Build {
  val newSetting = SettingKey[String]("new-setting", "a new setting!")
  val myTask = TaskKey[State]("my-task")

  val mySettings = Seq(
    newSetting := "default",
    myTask <<= (state, newSetting) map { (state, newSetting) =>
      println("newSetting: " + newSetting)
      state
    }
  )

  lazy val root =
    Project(id = "hello",
            base = file("."),
            settings = Project.defaultSettings ++ mySettings)
}
With this configuration, you can run my-task at the sbt prompt, and you'll see newSetting: default printed to the console.
You can override this setting in build.sbt:
newSetting := "modified"
Now, when you run my-task at the sbt prompt, you'll see newSetting: modified printed to the console.
EDIT 2
Here's a stand-alone version of the example above: https://earldouglas.com/ext/stackoverflow.com/questions/17038663/
I've accepted @James's answer as it really helped me out. I moved away from using a Command in favour of a Task (see this mailing list thread). In the end my plugin looked something like this:
package packge.to.my.plugin
import sbt.Keys._
import sbt._
object MyPlugin extends Plugin {
  import MyKeys._

  object MyKeys {
    val myTask = TaskKey[Unit]("runme", "This means you can run 'runme' in the SBT console")
    val newSetting = SettingKey[String]("newSetting")
  }

  override lazy val settings = Seq(
    newSetting := "light",
    myTask <<= (state, newSetting) map myCommand
  )

  def myCommand(state: State, newSetting: String) {
    // This code runs when the user types the "runme" command in the SBT console
    // newSetting is "light" here unless the user overrides in their build.sbt (see below)
    state.log.info(newSetting)
  }
}
To override the newSetting in the build.sbt of a project that uses this plugin:
import packge.to.my.plugin.MyKeys._
newSetting := "Something else"
The missing import statement had me stuck for a while!