Set task settings from build.sbt - scala

I am writing a small sbt plugin to generate some files which should be configurable by a target path parameter. Therefore I wrote this plugin code:
import sbt._

object GeneratorPlugin extends AutoPlugin {
  object autoImport {
    val targetPath    = settingKey[String]("target directory")
    val generateFiles = taskKey[Unit]("generate files")
  }
  import autoImport._

  override def trigger = allRequirements

  override lazy val buildSettings = Seq(
    targetPath := ".",
    generateFiles := generateTask.value
  )

  lazy val generateTask = Def.task {
    System.out.println(targetPath.value)
  }
}
When I import this using addSbtPlugin in project/plugins.sbt and run sbt generateFiles, it correctly prints ".". However, when I change the value of targetPath in my build.sbt, the result does not change:
targetPath := "/my/new/path"
The result of sbt generateFiles is still ".".
Is there a way to change the value of targetPath within my build.sbt when importing the plugin?

You can change it like so:
targetPath in ThisBuild := "/my/new/path"
or, with sbt 1.1's new slash syntax:
ThisBuild / targetPath := "/my/new/path"
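The build-level scope is what matters here: the plugin declares targetPath and generateFiles in buildSettings, so they live in the ThisBuild scope, and a bare targetPath := "/my/new/path" in build.sbt only sets the project-scoped key, which the build-scoped task never reads. If you control the plugin, a minimal sketch of an alternative is to declare the defaults in projectSettings instead, so a plain targetPath := ... in each project's build.sbt takes effect:

import sbt._

object GeneratorPlugin extends AutoPlugin {
  object autoImport {
    val targetPath    = settingKey[String]("target directory")
    val generateFiles = taskKey[Unit]("generate files")
  }
  import autoImport._

  override def trigger = allRequirements

  // Project scope instead of build scope, so per-project overrides apply directly.
  override lazy val projectSettings = Seq(
    targetPath    := ".",
    generateFiles := println(targetPath.value)
  )
}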

Related

Need to provide a SettingKey from a plugin I use in my sbt plugin

I am using the s3 resolver plugin and would like to override its s3CredentialsProvider setting in my AutoPlugin.
I have tried adding the value to projectSettings and globalSettings.
Error
not found: value s3CredentialsProvider
[error] s3CredentialsProvider := s3CredentialsProviderChain
Code
lazy val s3CredentialsProviderChain = { bucket: String =>
  new AWSCredentialsProviderChain(
    new EnvironmentVariableCredentialsProvider(),
    CustomProvider.create(bucket)
  )
}
override lazy val projectSettings = Seq(
  publishTo := {
    if (Keys.isSnapshot.value) {
      Some("my-snapshots" at "s3://rest-of-stuff")
    } else {
      Some("my-releases" at "s3://rest-of-stuff")
    }
  },
  s3CredentialsProvider := s3CredentialsProviderChain
)
The plugin code I'm working on does not define any custom settings of its own and thus has no autoImport of its own.
Update
I have been unable to resolve fm.sbt.S3ResolverPlugin in MyPlugin, and the code won't compile.
I have tried adding it to enablePlugins in MyPlugin's build.sbt, as well as adding it to the dependencies like this:
libraryDependencies ++= Seq(
  "com.amazonaws"      % "aws-java-sdk-sts"   % amazonSDKVersion,
  "com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.17.0"
)
I get an error from sbt, which I have asked about in a separate question:
sbt fails to resolve a plugin as dependency
If you create an AutoPlugin in the project directory, you need to add this to project/plugins.sbt:
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
If you create an independent plugin, add this to the plugin's build.sbt:
sbtPlugin := true
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
autoImport does not work in Scala files that are compiled for sbt, i.e. in plugins. You have to write import statements as in a plain Scala program, something like this:
import fm.sbt.S3ResolverPlugin
import sbt._

object TestPlugin extends AutoPlugin {
  override def requires = S3ResolverPlugin
  override def trigger  = allRequirements

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    S3ResolverPlugin.autoImport.s3CredentialsProvider := ???
  )
}
Note that to enable TestPlugin, you have to call enablePlugins(S3ResolverPlugin) in build.sbt
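A consuming project's build.sbt might then look like this (a sketch; the project name is made up, and the plugin jars still have to be on the classpath via addSbtPlugin in project/plugins.sbt):

lazy val root = (project in file("."))
  .enablePlugins(fm.sbt.S3ResolverPlugin)
// TestPlugin itself needs no explicit enablePlugins here:
// trigger = allRequirements activates it once S3ResolverPlugin is enabled.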

Sbt: How to define task for all projects?

I would like to be able to define a task for all projects in my build.sbt:
lazy val project1 = project.in(file("project1"))
...
lazy val project2 = ...

lazy val upload = taskKey[Unit]("upload a config file from project to server")

upload := {
  val file = baseDirectory.value / "config.json"
  ...
}
The problem is this definition works only when I call sbt upload, but I would like to be able to call it for each subproject: sbt project1/upload and sbt project2/upload.
Is there a way to do it, without using inputKey?
See Organizing the build:
For more advanced users, another way of organizing your build is to define one-off auto plugins in project/*.scala. By defining triggered plugins, auto plugins can be used as a convenient way to inject custom tasks and commands across all subprojects.
project/UploadPlugin.scala
package something

import sbt._
import Keys._

object UploadPlugin extends AutoPlugin {
  override def requires = sbt.plugins.JvmPlugin
  override def trigger  = allRequirements

  object autoImport {
    val upload = taskKey[Unit]("upload a config file from project to server")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    upload := {
      val n = name.value
      println(s"uploading $n..")
    }
  )
}
build.sbt
Here's how you can use it:
ThisBuild / organization := "com.example"
ThisBuild / scalaVersion := "2.12.5"
ThisBuild / version := "0.1.0-SNAPSHOT"
lazy val root = (project in file("."))
  .aggregate(project1, project2)
  .settings(
    name := "Hello"
  )
lazy val project1 = (project in file("project1"))
lazy val project2 = (project in file("project2"))
build.sbt does not have to mention anything about UploadPlugin, since it's a triggered plugin. From the shell you can call:
sbt:Hello> project1/upload
uploading project1..
[success] Total time: 0 s, completed Jul 20, 2018
sbt:Hello> project2/upload
uploading project2..
[success] Total time: 0 s, completed Jul 20, 2018
You can add the task as a setting of the project you want:
lazy val uploadTask = {
  lazy val upload = taskKey[Unit]("upload a config file from project to server")
  upload := {
    val file = baseDirectory.value / "config.json"
    ...
  }
}

project.in(file("project1")).settings(uploadTask)
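To be able to call the task on every subproject, the same value can be attached to each project definition (a sketch reusing the uploadTask value above):

lazy val project1 = project.in(file("project1")).settings(uploadTask)
lazy val project2 = project.in(file("project2")).settings(uploadTask)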

How to write a plugin which depends on a task from another plugin?

There is a great sbt plugin sbt-dependency-graph, which provides a dependencyTree task to show the dependencies.
I want to write an sbt plugin which depends on it, but it always fails.
build.sbt
sbtPlugin := true
name := "my-sbt-plugin-depends-on-another"
version := "0.1.2.1"
organization := "test20140913"
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")
src/main/scala/MySbtPlugin.scala
import sbt._

object MySbtPlugin extends AutoPlugin {
  object autoImport {
    lazy val hello  = taskKey[Unit]("hello task from my plugin")
    lazy val hello2 = taskKey[Unit]("hello task from my plugin2")
  }
  import autoImport._

  override def trigger  = allRequirements
  override def requires = plugins.JvmPlugin

  val helloSetting = hello := println("Hello from my plugin")

  val helloSetting2 = hello2 := {
    println("hello2, task result from another plugins:")
    println(net.virtualvoid.sbt.graph.Plugin.dependencyTree.value)
    println("=========================================")
  }

  override def projectSettings = Seq(
    helloSetting, helloSetting2
  )
}
Then I published it locally, and used it in another project:
build.sbt
name := "sbt--plugin-test"
version := "1.0"
scalaVersion := "2.11.6"
net.virtualvoid.sbt.graph.Plugin.graphSettings
project/plugins.sbt
logLevel := Level.Info
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")
addSbtPlugin("test20140913" % "my-sbt-plugin-depends-on-another" % "0.1.2.1")
When I run sbt in the latter project, it reports:
Reference to undefined setting:
*:dependencyTree from *:hello2 (/Users/twer/workspace/my-sbt-plugin-depends-on-another/src/main/scala/test20140913/MySbtPlugin.scala:38)
Did you mean provided:dependencyTree ?
at sbt.Init$class.Uninitialized(Settings.scala:262)
at sbt.Def$.Uninitialized(Def.scala:10)
at sbt.Init$class.delegate(Settings.scala:188)
at sbt.Def$.delegate(Def.scala:10)
What is wrong?
PS: The plugin code is here: https://github.com/freewind/my-sbt-plugin-depends-on-another
dependencyTree is only defined for specific configurations (well, for all of them), but in the shell it automatically delegates to Compile.
Try defining hello2 like so:
val helloSetting2 = hello2 := {
  println("hello2, task result from another plugins:")
  import net.virtualvoid.sbt.graph.Plugin.dependencyTree
  println((dependencyTree in Compile).value)
  println("=========================================")
}
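Another option, untested, is to let your plugin contribute the old plugin's settings itself, so a downstream build does not have to add net.virtualvoid.sbt.graph.Plugin.graphSettings manually (sbt-dependency-graph 0.7.x is not an AutoPlugin):

// Sketch: bundle the graph settings with the plugin's own settings.
override def projectSettings =
  net.virtualvoid.sbt.graph.Plugin.graphSettings ++ Seq(helloSetting, helloSetting2)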

SBT plugin - User defined configuration for Command via their build.sbt

I'm writing an SBT Plugin that adds a Command and would like users to be able to configure this Command by setting variables in their build.sbt. What is the simplest way to achieve this?
Here is a simplified example of what the plugin looks like:
import sbt.Keys._
import sbt._

object MyPlugin extends Plugin {
  override lazy val settings = Seq(commands += Command.args("mycommand", "myarg")(myCommand))

  def myCommand = (state: State, args: Seq[String]) => {
    // Logic for command...
    state
  }
}
I would like someone to be able to add the following to their build.sbt file:
newSetting := "light"
How do I make this available as a String variable from inside the myCommand Command above?
Take a look at the example here: http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
In this example, a task and setting are defined:
val newTask    = TaskKey[Unit]("new-task")
val newSetting = SettingKey[String]("new-setting")

val newSettings = Seq(
  newSetting := "test",
  newTask <<= newSetting map { str => println(str) }
)
A user of your plugin could then provide their own value for the newSetting setting in their build.sbt:
newSetting := "light"
EDIT
Here's another example, closer to what you're going for:
Build.scala:
import sbt._
import Keys._

object HelloBuild extends Build {

  val newSetting = SettingKey[String]("new-setting", "a new setting!")
  val myTask     = TaskKey[State]("my-task")

  val mySettings = Seq(
    newSetting := "default",
    myTask <<= (state, newSetting) map { (state, newSetting) =>
      println("newSetting: " + newSetting)
      state
    }
  )

  lazy val root =
    Project(id = "hello",
            base = file("."),
            settings = Project.defaultSettings ++ mySettings)
}
With this configuration, you can run my-task at the sbt prompt, and you'll see newSetting: default printed to the console.
You can override this setting in build.sbt:
newSetting := "modified"
Now, when you run my-task at the sbt prompt, you'll see newSetting: modified printed to the console.
EDIT 2
Here's a stand-alone version of the example above: https://earldouglas.com/ext/stackoverflow.com/questions/17038663/
I've accepted @James's answer as it really helped me out. I moved away from using a Command in favour of a Task (see this mailing list thread). In the end my plugin looked something like this:
package packge.to.my.plugin

import sbt.Keys._
import sbt._

object MyPlugin extends Plugin {
  import MyKeys._

  object MyKeys {
    val myTask     = TaskKey[Unit]("runme", "This means you can run 'runme' in the SBT console")
    val newSetting = SettingKey[String]("newSetting")
  }

  override lazy val settings = Seq(
    newSetting := "light",
    myTask <<= (state, newSetting) map myCommand
  )

  def myCommand(state: State, newSetting: String) {
    // This code runs when the user types the "runme" command in the SBT console
    // newSetting is "light" here unless the user overrides in their build.sbt (see below)
    state.log.info(newSetting)
  }
}
To override the newSetting in the build.sbt of a project that uses this plugin:
import packge.to.my.plugin.MyKeys._
newSetting := "Something else"
The missing import statement had me stuck for a while!
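The <<= operator is deprecated in later sbt releases; the same settings sequence can also be written with := and .value (an untested sketch, keeping the rest of the plugin unchanged):

override lazy val settings = Seq(
  newSetting := "light",
  // .value pulls the state task and the setting into the task body
  myTask := myCommand(state.value, newSetting.value)
)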

How to write a Build.scala to change a SettingKey previously assigned in build.sbt

I have a build.sbt:
name := "name"
And a project/Build.scala:
import sbt._

object MyBuild extends Build {
  val root = Project(id = "root", base = file("."))

  override def settings = super.settings :+ (
    Keys.name in root ~= { oldName => oldName + "-in-scala" }
  )
}
I want a transformer in project/Build.scala which changes name to name-in-scala, but it does not work.
How can I write a transformer in Build.scala?
I don't think that's possible.
The page http://www.scala-sbt.org/release/docs/Getting-Started/Full-Def.html#relating-build-sbt-to-build-scala states about sbt 0.12.1:
The setting in build.sbt should "win" over the one in Build.scala.
and
The settings in .sbt files are appended to the settings in .scala files.
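If the transformation does not strictly have to live in Build.scala, it should work when placed in build.sbt itself, since settings in a .sbt file are applied in order and a later line sees the value assigned earlier (a small sketch):

name := "name"

// Evaluated after the assignment above, so the final value is "name-in-scala".
name ~= { oldName => oldName + "-in-scala" }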