combining input task with dynamic task in sbt - scala

I'd like to make an input task that processes user input and generates a bunch of subtasks to run. Here is an example:
import sbt._
import Keys._
import Def.Initialize
import complete.DefaultParsers._

object TestBuild extends Build {
  val sampleInput = inputKey[Seq[String]]("sample dynamic input task")
  val sampleDynamic = taskKey[Seq[String]]("sample dynamic task")

  override lazy val settings = super.settings ++ Seq(
    sampleDynamic := Def.taskDyn {
      val sources = Seq("ab", "csd", "efda")
      sources.map(sampleTaskFor _).joinWith(_.join)
    }.value,
    sampleInput := Def.inputTaskDyn {
      val sources = spaceDelimited("<arg>").parsed
      sources.map(sampleTaskFor _).joinWith(_.join)
    }.value
  )

  private def sampleTaskFor(source: String): Initialize[Task[String]] = Def.task {
    source + " : " + source
  }
}
There are two samples. The first works and shows a simple taskDyn with predefined input. The second is the intended dynamic task driven by user input, but it refuses to compile with an error I cannot interpret:
[error] home/project/build.scala:15: Illegal dynamic reference: Def
[error] sampleInput := Def.inputTaskDyn {
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
How can I avoid it?
Trial and error log
Here I will append my question with the different proposed changes that still do not solve the issue.
replace InputTask.value with InputTask.evaluated
sources.map(sampleTaskFor _).joinWith(_.join)
- }.value
+ }.evaluated
)
If I were able to define a correct InputTask, it should be accessed via the evaluated method, as I've found both in the documentation and in practice while playing with different InputTasks that do compile.
Still, that would not fix the issue that the sbt macro engine refuses the provided inputTaskDyn.
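For comparison, a minimal non-dynamic input task of this shape does compile when wired in through evaluated (sampleEcho here is just an illustrative key, not part of the build above):
val sampleEcho = inputKey[Seq[String]]("echo the space-delimited arguments back")

sampleEcho := Def.inputTask {
  // parsed may only be used inside Def.inputTask / Def.inputTaskDyn
  val args = spaceDelimited("<arg>").parsed
  args.map(a => a + " : " + a)
}.evaluated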
waiting for other suggestions

Trial and error gave me the answer, but not a single bit of understanding. Here it is:
import sbt._
import Keys._
import Def.Initialize
import complete.DefaultParsers._

object TestBuild extends Build {
  val sampleInput = inputKey[Seq[String]]("sample dynamic input task")
  val sampleDynamic = taskKey[Seq[String]]("sample dynamic task")

  override lazy val settings = super.settings ++ Seq(
    sampleDynamic := Def.taskDyn {
      val sources = Seq("ab", "csd", "efda")
      sources.map(sampleTaskFor _).joinWith(_.join)
    }.value,
    sampleInput := Def.inputTaskDyn {
      val sources = spaceDelimited("<arg>").parsed
      sampleTaskAll(sources)
    }.evaluated
  )

  private def sampleTaskFor(source: String): Initialize[Task[String]] = Def.task {
    source + " : " + source
  }

  private def sampleTaskAll(sources: Seq[String]): Initialize[Task[Seq[String]]] = Def.taskDyn {
    sources.map(sampleTaskFor _).joinWith(_.join)
  }
}
For a reason I cannot comprehend, you have to isolate the creation of the combined task from the sequence of single tasks into a separate method. Presumably the validation performed by the := and inputTaskDyn macros flags references it cannot prove to be static (here even Def itself), and moving the Def.taskDyn construction into sampleTaskAll keeps it out of the expansion being checked.

Related

Setting cannot depend on a task when adding logging to setting key

I am updating an sbt plugin which has a SettingKey from the fm-sbt-s3-resolver plugin. I have made some progress on explicitly adding the needed setting as a side effect of this question:
Logging from an sbt plugin
object MyPlugin extends AutoPlugin {
  override def requires = S3ResolverPlugin
  override def trigger = allRequirements

  override lazy val globalSettings = Seq(
    resolvers ++= repos,
    publishMavenStyle := true,
    S3ResolverPlugin.autoImport.s3CredentialsProvider := { bucket: String =>
      new AWSCredentialsProviderChain(
        new EnvironmentVariableCredentialsProvider(),
        PropertyFilesCredentialProvider.create(bucket, streams.value.log)
      )
    }
  )
}
When I try to add logging using streams.value.log, sbt throws an error:
[error] A setting cannot depend on a task
[error] PropertyFilesCredentialProvider.create(bucket, streams.value.log)
In this case, you need to depend on the setting sLog instead of the task streams.
import sbt._
import sbt.Keys._

object Logs extends AutoPlugin {
  object autoImport {
    val settingLog = settingKey[Unit]("Uses setting sLog")
    val taskLog = taskKey[Unit]("Uses task streams log")
  }
  import autoImport._

  override def trigger = allRequirements

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    settingLog := { sLog.value.info("Logging on settings execution") },
    taskLog := { streams.value.log.info("Logging on task execution") }
  )
}
The difference with settingLog is that it will log when sbt boots.
~/w/t/stackoverflow $ sbt
[info] Logging on settings execution
Repeated calls to settingLog will not produce any more logging, but taskLog will print a log line every time you call it.
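Applied to the snippet from the question, that means passing sLog.value where streams.value.log was used. A sketch (reusing the provider classes from the question; I have not compiled this against fm-sbt-s3-resolver):
S3ResolverPlugin.autoImport.s3CredentialsProvider := { bucket: String =>
  new AWSCredentialsProviderChain(
    new EnvironmentVariableCredentialsProvider(),
    // sLog is a setting, so a SettingKey may depend on it
    PropertyFilesCredentialProvider.create(bucket, sLog.value)
  )
}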

How to get the dependencies of all sub-projects from a SBT project, in a SBT task?

I'm writing an SBT task which can output the dependency information, grouped by project (say the SBT build has multiple sub-projects).
I know there is the sbt-dependency-graph plugin, but I can't use it directly, because I want to generate a JSON file; that plugin just outputs the dependency tree to the console without returning a data object, so I can't easily get the data I want.
I found that the update task returns an UpdateReport which contains a lot of the information I want, but it only belongs to the current project. On the command line, if I want the information for all projects, I can list them with the projects command and view them one by one with someproject/update.
But how do I do the same in an SBT task? I tried:
val reports = projects.toList.map(prj => (update in prj).value)
It reports:
[error] /Users/me/workspace/sbt-test/project/Build.scala:51: Illegal dynamic reference: prj
[error] val reports = projects.toList.map(prj => (update in prj).value)
[error] ^
[error] one error found
How to fix it?
More code:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val allUpdate = taskKey[Unit]("show update reports of all projects")

  lazy val core = project
  lazy val web = project

  lazy val allUpdateDef = allUpdate := {
    val reports = projects.toList.map(prj => (update in prj).value)
    println(reports)
  }

  lazy val root = (project in file("."))
    .settings(
      allUpdateDef
    )
}
After checking the documentation at http://www.scala-sbt.org/0.13/docs/Tasks.html, I found the solution:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val groupByProject: Def.Initialize[Task[(String, UpdateReport)]] =
    Def.task {
      (thisProject.value.id, (update in thisProject).value)
    }

  lazy val filter = ScopeFilter(inAnyProject, inAnyConfiguration)

  updateByProject := {
    val subProjects = groupByProject.all(filter).value.map { case (projectName, updateReport) =>
      ...
    }
  }
}
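For completeness, here is a self-contained sketch of that approach (the allUpdateByProject key and the printed summary are just illustrative, not part of the original build):
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val allUpdateByProject = taskKey[Unit]("show update reports of all projects")

  // Evaluated once per project selected by the ScopeFilter below
  lazy val groupByProject: Def.Initialize[Task[(String, UpdateReport)]] = Def.task {
    (thisProject.value.id, update.value)
  }

  lazy val filter = ScopeFilter(inAnyProject)

  lazy val core = project
  lazy val web = project

  lazy val root = (project in file(".")).settings(
    allUpdateByProject := {
      // .all(filter) runs groupByProject in every project the filter selects
      val reports: Seq[(String, UpdateReport)] = groupByProject.all(filter).value
      reports.foreach { case (name, report) =>
        println(name + ": " + report.allModules.size + " resolved modules")
      }
    }
  )
}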

sbt runs IMain and play makes errors

I've built a little object which can interpret Scala code on the fly and fetch a value out of it.
object Interpreter {
  import scala.tools.nsc._
  import scala.tools.nsc.interpreter._

  class Dummy

  val settings = new Settings
  settings.usejavacp.value = false
  settings.embeddedDefaults[Dummy] // to make imain useable with sbt.

  val imain = new IMain(settings)

  def run(code: String, returnId: String) = {
    this.imain.beQuietDuring {
      this.imain.interpret(code)
    }
    val ret = this.imain.valueOfTerm(returnId)
    this.imain.reset()
    ret
  }
}

object Main {
  def main(args: Array[String]) {
    println(Interpreter.run("val x = 1", "x"))
  }
}
In a pure sbt environment, or when called from the Scala interpreter, this code works fine. But if I run it in a simple Play (version 2.2.2) application, it gets a null pointer at val ret = this.imain.valueOfTerm(returnId).
Play also uses a modified sbt, therefore it should probably work. What does Play do that makes this code stop working? Any ideas how to get it to work in Play?
Note
This is the build.sbt used:
name := "Test"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
Alternatively, I tried this implementation, but it doesn't solve the problem either:
object Interpreter2 {
  import scala.tools.nsc._
  import scala.tools.nsc.interpreter._
  import play.api._
  import play.api.Play.current

  val settings: Settings = {
    lazy val urls = java.lang.Thread.currentThread.getContextClassLoader match {
      case cl: java.net.URLClassLoader => cl.getURLs.toList
      case _ => sys.error("classloader is not a URLClassLoader")
    }
    lazy val classpath = urls map { _.toString }
    val tmp = new Settings()
    tmp.bootclasspath.value = classpath.distinct mkString java.io.File.pathSeparator
    tmp
  }

  val imain = new IMain(settings)

  def run(code: String, returnId: String) = {
    this.imain.beQuietDuring {
      this.imain.interpret(code)
    }
    val ret = this.imain.valueOfTerm(returnId)
    this.imain.reset()
    ret
  }
}
Useful links I found while writing this second implementation:
scala.tools.nsc.IMain within Play 2.1
How to set up classpath for the Scala interpreter in a managed environment?
https://groups.google.com/forum/#!topic/scala-user/wV86VwnKaVk
https://github.com/gourlaysama/play-repl-example/blob/master/app/REPL.scala#L18
https://gist.github.com/mslinn/7205854
After spending a few hours on this issue myself, here is a solution that I came up with. It works both inside SBT and outside. It is also expected to work in a variety of managed environments (like OSGi):
import java.net.URL

// LOGGER is whatever logging facility your project uses
private def getClasspathUrls(classLoader: ClassLoader, acc: List[URL]): List[URL] = {
  classLoader match {
    case null => acc
    case cl: java.net.URLClassLoader => getClasspathUrls(classLoader.getParent, acc ++ cl.getURLs.toList)
    case c =>
      LOGGER.error("classloader is not a URLClassLoader and will be skipped. ClassLoader type that was skipped is " + c.getClass)
      getClasspathUrls(classLoader.getParent, acc)
  }
}

val classpathUrls = getClasspathUrls(this.getClass.getClassLoader, List())
val classpathElements = classpathUrls map { url => url.toURI.getPath }
val classpath = classpathElements mkString java.io.File.pathSeparator

val settings = new Settings
settings.bootclasspath.value = classpath

val imain = new IMain(settings)
// use imain to interpret code. It should be able to access all your application classes as well as dependent libraries.
It's because Play uses the "fork in run" feature of sbt. This feature starts a new JVM, which causes this failure to appear:
[info] Failed to initialize compiler: object scala.runtime in compiler mirror not found.
[info] ** Note that as of 2.8 scala does not assume use of the java classpath.
[info] ** For the old behavior pass -usejavacp to scala, or if using a Settings
[info] ** object programatically, settings.usejavacp.value = true.
See: http://www.scala-sbt.org/release/docs/Detailed-Topics/Forking
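The note in that output also hints at a possible fix: the forked JVM is started with the full classpath on the java class path, so letting the interpreter use it may be enough. A sketch (untested under Play; the classloader-walking approach above remains the more general solution):
val settings = new Settings
// let IMain pick up the java class path of the (forked) JVM
settings.usejavacp.value = true
val imain = new IMain(settings)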

SBT plugin - User defined configuration for Command via their build.sbt

I'm writing an SBT Plugin that adds a Command and would like users to be able to configure this Command by setting variables in their build.sbt. What is the simplest way to achieve this?
Here is a simplified example of what the Plugin looks like:
import sbt.Keys._
import sbt._

object MyPlugin extends Plugin {
  override lazy val settings = Seq(commands += Command.args("mycommand", "myarg")(myCommand))

  def myCommand = (state: State, args: Seq[String]) => {
    // Logic for command...
    state
  }
}
I would like someone to be able to add the following to their build.sbt file:
newSetting := "light"
How do I make this available as a String variable from inside the myCommand Command above?
Take a look at the example here: http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
In this example, a task and setting are defined:
val newTask = TaskKey[Unit]("new-task")
val newSetting = SettingKey[String]("new-setting")

val newSettings = Seq(
  newSetting := "test",
  newTask <<= newSetting map { str => println(str) }
)
A user of your plugin could then provide their own value for the newSetting setting in their build.sbt:
newSetting := "light"
EDIT
Here's another example, closer to what you're going for:
Build.scala:
import sbt._
import Keys._

object HelloBuild extends Build {
  val newSetting = SettingKey[String]("new-setting", "a new setting!")
  val myTask = TaskKey[State]("my-task")

  val mySettings = Seq(
    newSetting := "default",
    myTask <<= (state, newSetting) map { (state, newSetting) =>
      println("newSetting: " + newSetting)
      state
    }
  )

  lazy val root =
    Project(id = "hello",
            base = file("."),
            settings = Project.defaultSettings ++ mySettings)
}
With this configuration, you can run my-task at the sbt prompt, and you'll see newSetting: default printed to the console.
You can override this setting in build.sbt:
newSetting := "modified"
Now, when you run my-task at the sbt prompt, you'll see newSetting: modified printed to the console.
EDIT 2
Here's a stand-alone version of the example above: https://earldouglas.com/ext/stackoverflow.com/questions/17038663/
I've accepted James's answer as it really helped me out. I moved away from using a Command in favour of a Task (see this mailing list thread). In the end my plugin looked something like this:
package packge.to.my.plugin

import sbt.Keys._
import sbt._

object MyPlugin extends Plugin {
  import MyKeys._

  object MyKeys {
    val myTask = TaskKey[Unit]("runme", "This means you can run 'runme' in the SBT console")
    val newSetting = SettingKey[String]("newSetting")
  }

  override lazy val settings = Seq(
    newSetting := "light",
    myTask <<= (state, newSetting) map myCommand
  )

  def myCommand(state: State, newSetting: String) {
    // This code runs when the user types the "runme" command in the SBT console
    // newSetting is "light" here unless the user overrides it in their build.sbt (see below)
    state.log.info(newSetting)
  }
}
To override the newSetting in the build.sbt of a project that uses this plugin:
import packge.to.my.plugin.MyKeys._
newSetting := "Something else"
The missing import statement had me stuck for a while!
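For completeness, if you do want to stay with a Command instead of a Task, the value a user sets in their build.sbt can be read from inside the command through Project.extract. A sketch (not part of the answers above):
def myCommand = Command.args("mycommand", "myarg") { (state, args) =>
  val extracted = Project.extract(state)
  // reads the user's value from build.sbt, or the plugin default ("light")
  val configured: String = extracted.get(MyKeys.newSetting)
  state.log.info("newSetting is " + configured)
  state
}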

Getting an SBT task to depend on the OneJar task

I'm having trouble getting my new SBT task 'install' to depend on the OneJar task. Here's my Build.scala file:
import sbt._
import Keys._
import com.github.retronym.SbtOneJar._

object BuildBroBuild extends Build {
  val install = TaskKey[Unit]("install", "Installs the JAR and a launcher script into your homedir")

  private def installTask = task {
    println("Hello world!")
  }

  override lazy val settings = super.settings ++
    Seq(install <<= (oneJar in Global)(installTask dependsOn (_)))

  lazy val root = Project(id = "buildbro",
                          base = file("."),
                          settings = Project.defaultSettings)
}
And here's the error I'm getting:
[error] Reference to undefined setting:
[error]
[error] */*:one-jar from {.}/*:install
[error] Did you mean *:one-jar ?
[error]
Does anybody know what this means? I believe I have to scope the oneJar TaskKey in a different way. Thanks for any help you can offer.
The error means the install task is looking for one-jar in the Global project scope (*/*:one-jar), where it isn't defined; as the Did you mean *:one-jar? hint suggests, the task is only defined at the project level, so it shouldn't be scoped to Global. I think something like this should work:
object BuildBroBuild extends Build {
  val install = TaskKey[Unit]("install", "Installs the JAR and a launcher script into your homedir")

  private lazy val installTask = install <<= (oneJar, streams) map { case (a, s) =>
    // 'a' is the output from the onejar task (ie, the artifact)
    println("Hello world!")
  }

  override lazy val settings = super.settings ++
    Seq(installTask)

  lazy val root = Project(id = "buildbro",
                          base = file("."),
                          settings = Project.defaultSettings)
}
Here, we are taking the output of the oneJar task (as well as streams, which allows for logging, etc) as input for our new task.
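As a rough sketch of what the body might then do with those inputs, assuming oneJar yields the packaged jar as a single File (check the key's type in the plugin), you could copy it into the home directory and log through streams:
private lazy val installTask = install <<= (oneJar, streams) map { case (jar, s) =>
  // assumes oneJar produces a single File; adjust the pattern if it is a Seq[File]
  val home = file(sys.props("user.home"))
  IO.copyFile(jar, home / jar.getName)
  s.log.info("Installed " + jar.getName + " into " + home)
}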