Write tasks in a dedicated file when writing SBT AutoPlugins (Scala)

I have written an SBT auto plugin MyPlugin.scala:
package com.abhi

import sbt._
import sbt.Keys._

object MyPlugin extends AutoPlugin {
  object autoImport {
    val helloTask = taskKey[Unit]("says hello")
    val byeTask = taskKey[Unit]("bye task")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    helloTask := {
      val dir = sourceManaged.value
      val cp = (dependencyClasspath in Compile).value
      val r = (runner in Compile).value
      val s = streams.value
      val rd = (resourceDirectory in Compile).value
      val sd = (sourceDirectory in Compile).value
      println(s"Here to say hello $dir $cp $r $s $rd $sd")
    },
    byeTask := {
      val dir = sourceManaged.value
      val cp = (dependencyClasspath in Compile).value
      val r = (runner in Compile).value
      val s = streams.value
      val rd = (resourceDirectory in Compile).value
      val sd = (sourceDirectory in Compile).value
      println(s"Here to say bye $dir $cp $r $s $rd $sd")
    }
  )
}
This works and I am able to use the plugin. However, the implementations of helloTask and byeTask will be fairly long, so I don't want to write them inside MyPlugin.scala.
Instead, I want to create two separate files, HelloTask.scala and ByeTask.scala, and write the respective implementations there.
I looked at the SBT documentation for Custom Settings, but the examples always implement the tasks inside the plugin itself.
How can I write the implementations of helloTask and byeTask outside of the MyPlugin.scala file? Also, how can I share some logic between HelloTask and ByeTask?
The following lines are common between the two tasks and I want to write them only once:
val dir = sourceManaged.value
val cp = (dependencyClasspath in Compile).value
val r = (runner in Compile).value
val s = streams.value
val rd = (resourceDirectory in Compile).value
val sd = (sourceDirectory in Compile).value

Separating task implementations from the plugin is common good practice; it is mentioned in the Tasks documentation. You can write an implementation using the Def.task macro:
def taskImpl(args: ...): Def.Initialize[Task[...]] = Def.task {
  ...
}
And then use it with different arguments to set different task keys:
override def projectSettings = Seq(
  taskA := taskImpl("A").value,
  taskB := taskImpl("B").value,
)
In your case you could do something like this:
def saySmthImpl(msg: String): Def.Initialize[Task[Unit]] = Def.task {
  val dir = sourceManaged.value
  val cp = (dependencyClasspath in Compile).value
  val r = (runner in Compile).value
  val s = streams.value
  val rd = (resourceDirectory in Compile).value
  val sd = (sourceDirectory in Compile).value
  println(s"$msg $dir $cp $r $s $rd $sd")
}
You can keep this implementation in a separate file if you want. Then, in the plugin definition, you can use it like this:
override def projectSettings = Seq(
  helloTask := saySmthImpl("Here to say hello").value,
  byeTask := saySmthImpl("Here to say bye").value,
)
Keep in mind, though, that accessing other settings or tasks with .value can only be done in certain contexts, such as inside Def.task or Def.setting, or when setting keys with :=. This limits (or rather directs) the ways you can share logic between different task implementations.
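To make the file split concrete, here is a minimal sketch under the file names you proposed; nothing about the object name HelloTask is special to sbt, it just has to be visible to the plugin:

// HelloTask.scala
package com.abhi

import sbt._
import sbt.Keys._

object HelloTask {
  // .value calls are legal here because we are inside the Def.task macro.
  def impl(msg: String): Def.Initialize[Task[Unit]] = Def.task {
    val dir = sourceManaged.value
    val cp = (dependencyClasspath in Compile).value
    println(s"$msg $dir $cp")
  }
}

MyPlugin.scala then only wires the key to the implementation, e.g. helloTask := HelloTask.impl("Here to say hello").value.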

Related

SBT: generate code for submodule before compilation

I have the following issue with build.sbt configuration. I need to generate some code before compilation. This is how it works now:
lazy val rootProject = project.in(file("."))

lazy val rootSourceGenerator = Def.task {
  val f: File = (sourceManaged in Compile).value / "com" / "myproject" / "Version.scala"
  IO.write(
    f,
    s"""package com.myproject
       |
       |object Version {
       |  some code ...
       |}
       |""".stripMargin
  )
  Seq(f)
}

inConfig(Compile)(
  Seq(
    sourceGenerators += rootSourceGenerator
  ))
Now I need to do the same thing for a new submodule.
lazy val rootProject = project.in(file(".")).dependsOn(submodule)

lazy val submodule = project.in(file("submodule"))

lazy val submoduleSourceGenerator = Def.task {
  val f: File = (sourceManaged in (submodule, Compile)).value / "com" / "myproject" / "SubmoduleVersion.scala"
  IO.write(
    f,
    s"""package com.myproject
       |
       |object SubmoduleVersion {
       |  some code ...
       |}
       |""".stripMargin
  )
  Seq(f)
}

inConfig(submodule / Compile)(
  Seq(
    sourceGenerators += submoduleSourceGenerator
  ))
And inConfig(submodule / Compile) doesn't work; the error is about unknown syntax for /.
Any suggestions how to fix this?
There are multiple solutions, but the cleanest, in my opinion, is the following.
Create an AutoPlugin in project/GenerateVersion.scala with the following content:
import sbt.Keys._
import sbt._

object GenerateVersion extends AutoPlugin {
  override def trigger = noTrigger

  override def projectSettings: Seq[Def.Setting[_]] = {
    Seq(
      sourceGenerators in Compile += Def.task {
        val f: File =
          (sourceManaged in Compile).value / "com" / "myproject" / "Version.scala"
        IO.write(
          f,
          s"""package com.myproject
             |
             |object Version {
             |}
             |""".stripMargin
        )
        Seq(f)
      }.taskValue
    )
  }
}
Enable the newly created GenerateVersion plugin for all projects/submodules that need Version.scala generated. This can be done as follows in build.sbt:
lazy val sub = project
  .in(file("sub"))
  .enablePlugins(GenerateVersion)

lazy val root = project
  .in(file("."))
  .enablePlugins(GenerateVersion)
  .aggregate(sub)
aggregate(sub) is added so that tasks triggered on root also run in the sub module. For example, sbt compile will run both root/compile and sub/compile.
This solution is also easier to share across multiple SBT projects, in the form of an SBT plugin.
Also, you might be interested in the sbt-buildinfo plugin.
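For reference, a minimal sbt-buildinfo setup looks roughly like this (the version number is illustrative; check the plugin's README for a current one):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.11.0")

// build.sbt
lazy val root = (project in file("."))
  .enablePlugins(BuildInfoPlugin)
  .settings(
    buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
    buildInfoPackage := "com.myproject"
  )

This generates a BuildInfo object at compile time, which covers the Version.scala use case without a hand-rolled source generator.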
Thanks, Ivan Stanislavciuc! But I've found another solution.
Just add all of the following content to submodule/build.sbt:
lazy val submoduleSourceGenerator = Def.task {
  val f: File = (sourceManaged in Compile).value / "com" / "myproject" / "SubmoduleVersion.scala"
  IO.write(
    f,
    s"""package com.myproject
       |
       |object SubmoduleVersion {
       |  some code ...
       |}
       |""".stripMargin
  )
  Seq(f)
}

inConfig(Compile)(
  Seq(
    sourceGenerators += submoduleSourceGenerator
  ))

Value lookup resolves to a wrong scope in the sbt plugin

I'm trying to write an sbt plugin for my project that will process resources; in a nutshell, it is Maven profiles done in sbt. When I inspect prod:dictionary I get the expected state of this Map; however, when I try prod:expandParameters I get an empty Map. How can I get the value of dictionary from the scope of the exact configuration the command is run with?
project/ResourceFiltering.scala
import sbt._
import sbt.Keys._
import sbt.internal.util.ManagedLogger

import scala.util.matching.Regex

object ResourceFiltering extends AutoPlugin {
  override def trigger = AllRequirements

  sealed trait Keys {
    lazy val expandParameters = taskKey[Unit]("")
    lazy val extensions = settingKey[Seq[String]]("")
    lazy val pattern = settingKey[Regex]("")
    lazy val dictionary = settingKey[Map[String, String]]("")
  }

  object autoImport extends Keys
  import autoImport._

  override val projectSettings: Seq[Def.Setting[_]] = Seq(
    Zero / extensions := Seq("conf", "properties", "xml"),
    Zero / pattern := """(\$\{()\})""".r,
    Zero / dictionary := Map.empty,
    expandParameters := {
      val log: ManagedLogger = streams.value.log
      log.info(s"""|Parameter expansion
                   |Configuration: $configuration
                   |Extensions: ${extensions.value}
                   |Pattern: ${pattern.value}
                   |Dictionary: ${dictionary.value}
                   """.stripMargin)
    }
  )
}
build.sbt
enablePlugins(ResourceFiltering)

lazy val Prod = config("prod") extend Compile describedAs "Scope to build production packages."
lazy val Stage = config("stage") extend Compile describedAs "Scope to build stage packages."
lazy val Local = config("local") extend Compile describedAs "Scope to build local packages."

lazy val root = (project in file("."))
  .configs(Prod, Stage, Local)
  .settings(sharedSettings)

lazy val sharedSettings =
  prodSettings ++ stageSettings ++ localSettings

lazy val defaults = Defaults.configSettings ++ Defaults.configTasks ++ Defaults.resourceConfigPaths

lazy val prodSettings = inConfig(Prod)(defaults ++ Seq(
  dictionary ++= Profiles.prod
))

lazy val stageSettings = inConfig(Stage)(defaults ++ Seq(
  dictionary ++= Profiles.stage
))

lazy val localSettings = inConfig(Local)(defaults ++ Seq(
  dictionary ++= Profiles.local
))
project/Profiles.scala
object Profiles {
  lazy val default: Map[String, String] = local
  lazy val local: Map[String, String] = Map("example" -> "local")
  lazy val stage: Map[String, String] = Map("example" -> "stage")
  lazy val prod: Map[String, String] = Map("example" -> "prod")
}
Analysing the Plugins Best Practices docs, I would make the following recommendations regarding configuration and scoping.
Provide default values in globalSettings instead of projectSettings, like so:
override lazy val globalSettings = Seq(
  dictionary := Map.empty
)
Next, collect the base configuration of expandParameters into its own sequence, like so:
lazy val baseResourceFilteringSettings: Seq[Def.Setting[_]] = Seq(
  extensions := Seq("conf", "properties", "xml"),
  pattern := """(\$\{()\})""".r,
  expandParameters := {
    val log: ManagedLogger = streams.value.log
    log.info(
      s"""|Parameter expansion
          |Configuration: $configuration
          |Extensions: ${extensions.value}
          |Pattern: ${pattern.value}
          |Dictionary: ${dictionary.value}
          """.stripMargin
    )
  }
)
Note how dictionary is not initialised in baseResourceFilteringSettings; instead, by default, it will come from globalSettings.
Now that we have taken care of our defaults and have our base configuration, we can proceed to "specialise" it per configuration scope using inConfig, like so:
lazy val localSettings = inConfig(Local)(defaults ++ Seq(
  dictionary ++= Profiles.local
) ++ baseResourceFilteringSettings)
Note how we have scoped baseResourceFilteringSettings to the Local config, as well as dictionary ++= Profiles.local.
Now executing ;reload;local:expandParameters should output:
[info] Parameter expansion
[info] Configuration: SettingKey(This / This / This / configuration)
[info] Extensions: List(conf, properties, xml)
[info] Pattern: (\$\{()\})
[info] Dictionary: Map(example -> local)
where we see Dictionary: Map(example -> local) as required.
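As an aside (my addition, not part of the original answer): when a key still resolves to an unexpected scope, sbt's built-in inspect command is the quickest debugging tool; its Delegates section lists, in order, the scopes a lookup will search:

> inspect local:dictionary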
Here is the complete code of ResourceFiltering:
object ResourceFiltering extends AutoPlugin {
  override def trigger = AllRequirements

  sealed trait Keys {
    lazy val expandParameters = taskKey[Unit]("")
    lazy val extensions = settingKey[Seq[String]]("")
    lazy val pattern = settingKey[Regex]("")
    lazy val dictionary = settingKey[Map[String, String]]("")

    lazy val baseResourceFilteringSettings: Seq[Def.Setting[_]] = Seq(
      extensions := Seq("conf", "properties", "xml"),
      pattern := """(\$\{()\})""".r,
      expandParameters := {
        val log: ManagedLogger = streams.value.log
        log.info(
          s"""|Parameter expansion
              |Configuration: $configuration
              |Extensions: ${extensions.value}
              |Pattern: ${pattern.value}
              |Dictionary: ${dictionary.value}
              """.stripMargin
        )
      }
    )
  }

  object autoImport extends Keys
  import autoImport._

  override lazy val globalSettings = Seq(
    dictionary := Map.empty
  )
}
Also consider moving the configuration definitions into the plugin, like so:
object ResourceFiltering extends AutoPlugin {
  override def trigger = AllRequirements

  sealed trait Keys {
    lazy val Prod = config("prod") extend Compile describedAs "Scope to build production packages."
    lazy val Stage = config("stage") extend Compile describedAs "Scope to build stage packages."
    lazy val Local = config("local") extend Compile describedAs "Scope to build local packages."

    lazy val expandParameters = taskKey[Unit]("")
    lazy val extensions = settingKey[Seq[String]]("")
    lazy val pattern = settingKey[Regex]("")
    lazy val dictionary = settingKey[Map[String, String]]("")

    lazy val baseResourceFilteringSettings: Seq[Def.Setting[_]] = Seq(
      extensions := Seq("conf", "properties", "xml"),
      pattern := """(\$\{()\})""".r,
      expandParameters := {
        val log: ManagedLogger = streams.value.log
        log.info(
          s"""|Parameter expansion
              |Configuration: $configuration
              |Extensions: ${extensions.value}
              |Pattern: ${pattern.value}
              |Dictionary: ${dictionary.value}
              """.stripMargin
        )
      }
    )
  }

  object autoImport extends Keys
  import autoImport._

  override lazy val globalSettings = Seq(
    dictionary := Map.empty
  )

  override val projectSettings: Seq[Def.Setting[_]] =
    inConfig(Stage)(baseResourceFilteringSettings) ++
    inConfig(Prod)(baseResourceFilteringSettings) ++
    inConfig(Local)(baseResourceFilteringSettings)
}
This way we do not have to remember to add baseResourceFilteringSettings to the config scope, and can simply write:
lazy val localSettings = inConfig(Local)(defaults ++ Seq(
  dictionary ++= Profiles.local
))

How can I override the run and runMain tasks in SBT to use my own ForkOptions?

Problem
In a multimodule build, each module has its own baseDirectory, but I would like to launch applications defined in modules using the baseDirectory of the root project instead of the baseDirectory of the module involved.
This way, applications would always resolve relative file names against the root folder, which is a very common pattern.
The problem is that ForkOptions enforces the baseDirectory of the module, and apparently there's no easy way to change that because forkOptions is private. I would like to pass a forkOptions populated with the baseDirectory of the root project instead.
Besides, some modules contain two or more applications, so I'd like to have a separate configuration for each application in such a module.
An example tells more than 1000 words:
build.sbt
import sbt._
import Keys._

lazy val buildSettings: Seq[Setting[_]] = Defaults.defaultSettings
lazy val forkRunOptions: Seq[Setting[_]] = Seq(fork := true)

addCommandAlias("r1", "ModuleA/RunnerR1:run")
addCommandAlias("r2", "ModuleA/RunnerR2:run")

lazy val RunnerR1 = sbt.config("RunnerR1").extend(Compile)
lazy val RunnerR2 = sbt.config("RunnerR2").extend(Compile)

lazy val root =
  project
    .in(file("."))
    .settings(buildSettings:_*)
    .aggregate(ModuleA)

lazy val ModuleA =
  project
    .in(file("ModuleA"))
    .settings(buildSettings:_*)
    .configs(RunnerR1, RunnerR2)
    .settings(inConfig(RunnerR1)(
      forkRunOptions ++
        Seq(
          mainClass in Compile := Option("sbt.tests.issueX.Application1"))):_*)
    .settings(inConfig(RunnerR2)(
      forkRunOptions ++
        Seq(
          mainClass in Compile := Option("sbt.tests.issueX.Application2"))):_*)
In the SBT console, I would expect this:
> r1
This is Application1
> r2
This is Application2
But I see this:
> r1
This is Application2
> r2
This is Application2
What is the catch?
Not only that: SBT is running the applications in process, not forking them. Why is fork := true not taking effect?
Explanation
see: https://github.com/frgomes/sbt-issue-2247
It turns out that configurations do not work the way one might think. The problem is that, in the snippet below, configuration RunnerR1 does not inherit tasks from module ModuleA the way you might expect. So, when you type r1 or r2 (i.e. ModuleA/RunnerR1:run or ModuleA/RunnerR2:run), SBT employs its delegation algorithm to find tasks and settings and, depending on how those tasks and settings were defined, it may end up running tasks from scopes you do not expect, or finding settings in scopes you do not expect.
lazy val ModuleA =
  project
    .in(file("ModuleA"))
    .settings(buildSettings:_*)
    .configs(RunnerR1, RunnerR2)
    .settings(inConfig(RunnerR1)(
      forkRunOptions ++
        Seq(
          mainClass in Compile := Option("sbt.tests.issueX.Application1"))):_*)
This issue is related to usability, since the API provided by SBT is misleading. Eventually this pattern can be improved or better documented, but it's more a usability problem than anything else.
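To make the catch concrete, here is my reading of it (a sketch, not part of the original answer): inConfig(conf)(...) only rewrites the configuration axis of keys that leave it unset; an explicitly written axis survives the rewrite.

inConfig(RunnerR1)(Seq(
  mainClass in Compile := Option("sbt.tests.issueX.Application1"), // axis already set: stays ModuleA/Compile/mainClass
  mainClass := Option("sbt.tests.issueX.Application1")             // axis unset: becomes ModuleA/RunnerR1/mainClass
))

With mainClass in Compile, both the RunnerR1 and RunnerR2 blocks write to the same key, ModuleA/Compile/mainClass; the RunnerR2 setting is applied last, so r1 and r2 both resolve to Application2. Likewise, RunnerR1:run is not defined in RunnerR1 itself, so it delegates to Compile:run, which never sees the RunnerR1-scoped fork := true.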
Circumventing the difficulty
Please find below how this issue can be circumvented.
Since forkOptions is private, we have to provide our own way of running applications, based on SBT's own code as much as possible.
In a nutshell, we have to guarantee that we redefine run, runMain and runner in all configurations we have.
import sbt._
import Keys._

//-------------------------------------------------------------
// This file contains a solution for the problem presented by
// https://github.com/sbt/sbt/issues/2247
//-------------------------------------------------------------

lazy val buildSettings: Seq[Setting[_]] = Defaults.defaultSettings ++ runSettings

lazy val runSettings: Seq[Setting[_]] =
  Seq(
    fork in (Compile, run) := true)

def forkRunOptions(s: Scope): Seq[Setting[_]] =
  Seq(
    // see: https://github.com/sbt/sbt/issues/2247
    // see: https://github.com/sbt/sbt/issues/2244
    runner in run in s := {
      val forkOptions: ForkOptions =
        ForkOptions(
          workingDirectory = Some((baseDirectory in ThisBuild).value),
          bootJars = Nil,
          javaHome = (javaHome in s).value,
          connectInput = (connectInput in s).value,
          outputStrategy = (outputStrategy in s).value,
          runJVMOptions = (javaOptions in s).value,
          envVars = (envVars in s).value)
      new {
        val fork_ = (fork in run).value
        val config: ForkOptions = forkOptions
      } with ScalaRun {
        override def run(mainClass: String, classpath: Seq[File], options: Seq[String], log: Logger): Option[String] =
          javaRunner(
            Option(mainClass), Option(classpath), options,
            Some("java"), Option(log), fork_,
            config.runJVMOptions, config.javaHome, config.workingDirectory, config.envVars, config.connectInput, config.outputStrategy)
      }
    },
    runner in runMain in s := (runner in run in s).value,
    run in s <<= Defaults.runTask(fullClasspath in s, mainClass in run in s, runner in run in s),
    runMain in s <<= Defaults.runMainTask(fullClasspath in s, runner in runMain in s)
  )
def javaRunner(mainClass: Option[String] = None,
               classpath: Option[Seq[File]] = None,
               options: Seq[String],
               javaTool: Option[String] = None,
               log: Option[Logger] = None,
               fork: Boolean = false,
               jvmOptions: Seq[String] = Nil,
               javaHome: Option[File] = None,
               cwd: Option[File] = None,
               envVars: Map[String, String] = Map.empty,
               connectInput: Boolean = false,
               outputStrategy: Option[OutputStrategy] = Some(StdoutOutput)): Option[String] = {

  def runner(app: String,
             args: Seq[String],
             cwd: Option[File] = None,
             env: Map[String, String] = Map.empty): Int = {
    import scala.collection.JavaConverters._
    val cmd: Seq[String] = app +: args
    val pb = new java.lang.ProcessBuilder(cmd.asJava)
    if (cwd.isDefined) pb.directory(cwd.get)
    pb.inheritIO
    //FIXME: set environment
    val process = pb.start()
    if (fork) 0
    else {
      def cancel() = {
        if (log.isDefined) log.get.warn("Background process cancelled.")
        process.destroy()
        15
      }
      try process.waitFor catch {
        case e: InterruptedException => cancel()
      }
    }
  }

  val app: String = javaHome.fold("") { p => p.absolutePath + "/bin/" } + javaTool.getOrElse("java")
  val jvm: Seq[String] = jvmOptions.map(p => p.toString)
  val cp: Seq[String] =
    classpath
      .fold(Seq.empty[String]) { paths =>
        Seq(
          "-cp",
          paths
            .map(p => p.absolutePath)
            .mkString(java.io.File.pathSeparator))
      }
  val klass = mainClass.fold(Seq.empty[String]) { name => Seq(name) }
  val xargs: Seq[String] = jvm ++ cp ++ klass ++ options

  if (log.isDefined)
    if (fork) {
      log.get.info(s"Forking: ${app} " + xargs.mkString(" "))
    } else {
      log.get.info(s"Running: ${app} " + xargs.mkString(" "))
    }

  if (cwd.isDefined) IO.createDirectory(cwd.get)
  val exitCode = runner(app, xargs, cwd, envVars)
  if (exitCode == 0)
    None
  else
    Some("Nonzero exit code returned from " + app + ": " + exitCode)
}
addCommandAlias("r1", "ModuleA/RunnerR1:run")
addCommandAlias("r2", "ModuleA/RunnerR2:run")
lazy val RunnerR1 = sbt.config("RunnerR1").extend(Compile)
lazy val RunnerR2 = sbt.config("RunnerR2").extend(Compile)
lazy val root =
project
.in(file("."))
.settings(buildSettings:_*)
.aggregate(ModuleA)
lazy val ModuleA =
project
.in(file("ModuleA"))
.settings(buildSettings:_*)
.configs(RunnerR1,RunnerR2)
.settings(inConfig(RunnerR1)(
forkRunOptions(ThisScope) ++
Seq(
mainClass := Option("sbt.tests.issueX.Application1"))):_*)
.settings(inConfig(RunnerR2)(
forkRunOptions(ThisScope) ++
Seq(
mainClass := Option("sbt.tests.issueX.Application2"))):_*)

SBT: input task to generate a test source by name

I want to create an sbt task to generate a test source, e.g. sbt "genSpec Foo" should generate FooSpec.scala in src_managed/test.
I tried this:
val genSpec = inputKey[File]("Generate spec file")

genSpec := {
  import sbt.complete.DefaultParsers._
  val log = streams.value.log
  val arg: String = spaceDelimited("<arg>").parsed.head //TODO: Single string parser!
  val fileName = s"${arg}Spec"
  log.info(s"Generating $fileName")
  val file = (sourceManaged in Test).value / s"$fileName.scala"
  IO.write(file, s"""class $fileName extends AbstractSpec""")
  //sourceGenerators in Test += file
  file
}
But, even though it is created in the sourceManaged directory, sbt test does not pick it up.
But, this works:
sourceGenerators in Test += Def.task {
  val file = (sourceManaged in Test).value / "FooSpec.scala"
  IO.write(file, s"""class FooSpec extends AbstractSpec""")
  Seq(file)
}.taskValue
But the above is not exactly what I want - I want to specify Foo as an argument.
So, is there any way to pass arguments to a sourceGenerator task? Or to create a task that adds something to managed sources, such that it is picked up by sbt test?
Also, is there a way to iterate over all compiled sources' filenames? If I can do that, I will simply generate all the Spec.scala files from the source filenames themselves...
As suggested by this question, I tried this:
val genSpec = taskKey[Seq[File]]("Generate spec file")

genSpec := {
  import sbt.complete.DefaultParsers._
  val log = streams.value.log
  val args = spaceDelimited("<arg>").parsed
  args map { arg =>
    val fileName = s"${arg}Suite"
    log.info(s"Generating $fileName")
    val file = (sourceManaged in Test).value / s"$fileName.scala"
    IO.write(file, s"""class $fileName extends AbstractSuite""")
    file
  }
}

genSpec <<= (sourceGenerators in Test) { _.join.map(_.flatten.toList) }
But, I got this error:
error: `parsed` can only be used within an input task macro, such as := or Def.inputTask.
       val args = spaceDelimited("<arg>").parsed
                                          ^
Try this:
val genSpec = inputKey[File]("Generate spec file")

genSpec := {
  import sbt.complete.DefaultParsers._
  val log = streams.value.log
  val arg: String = spaceDelimited("<arg>").parsed.head //TODO: Single string parser!
  val className = s"${arg}Spec"
  val file = (sourceManaged in Test).value / s"$className.scala"
  log info s"Generating $file"
  IO.write(file, s"""class $className extends AbstractSpec""")
  file
}

managedSources in Test ++= ((sourceManaged in Test).value ** "*Spec.scala").get
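Usage would then look roughly like this (session sketch; the exact src_managed path depends on your Scala version):

> genSpec Foo
[info] Generating /path/to/project/target/scala-2.11/src_managed/test/FooSpec.scala
> test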

SBT triggering or detecting in a task if any sources have been recompiled

This snippet is wrong:
def bundleTo(dir: String) = Seq(
  mkBundles <<= (bundle, compile in Compile) map { (fl, anal) =>
    val flTarget = baseDirectory / s"app/$dir/${fl.getName}"
    if (!flTarget.exists()) {
      println("target did not exist copying over")
      IO.copyFile(fl, flTarget)
    } else if (anal.compilations.allCompilations.nonEmpty) {
      println("something was recompiled, copying over")
      IO.copyFile(fl, flTarget)
    }
  },
  mkBundles <<= mkBundles.triggeredBy(compile in Compile)
)
Specifically, anal.compilations.allCompilations.nonEmpty is the wrong test. I'd like to move a plugin into a directory only if something has changed, since the move triggers a bundle reload.
This snippet, for SBT 0.13.7, will trigger the inner closure upon source change. There is probably pre-rolled logic for this in the SBT code base. You will probably also need invalidation logic for SBT setting-key changes and dependency updates.
myTask := {
  val us = (unmanagedSources in Compile).value
  val cd = streams.value.cacheDirectory / "osgi-recompile-cache"
  println("bam")
  val func = FileFunction.cached(cd, FilesInfo.lastModified) { par: Set[File] =>
    println("boom")
    par
  }
  func(us.toSet)
}

myTask <<= myTask.triggeredBy(compile in Compile)
I fleshed out a script to do what I need. Here it is:
import sbt._
import sbt.Keys._
import com.typesafe.sbt.osgi.OsgiKeys._

object OsgiDistUtils {
  lazy val rootDirectory = SettingKey[File]("rootDirectory", "the root of the entire build")
  lazy val distDirectoryName = SettingKey[String]("distDirectoryName", "name for the dist directory")
  lazy val distdirectory = SettingKey[File]("distdirectory", "derived location where the OSGI dist will be constructed")
  lazy val bundleDirectory = SettingKey[File]("bundleDirectory", "location for the bundles")
  lazy val compileBundleAndMove = TaskKey[Unit]("compileBundleAndMove", "make bundles if needed")

  val osgiDistUtildefaults = Seq(
    distDirectoryName := "app",
    distdirectory := rootDirectory.value / distDirectoryName.value,
    compileBundleAndMove := {
      val targetDirectory = bundleDirectory.value
      val moduleName = name.value
      val bundleFile = bundle.value
      val s = streams.value
      val targetFile = targetDirectory / bundleFile.getName

      if (!targetDirectory.exists()) {
        IO.createDirectory(targetDirectory)
      } else if (!targetFile.exists()) {
        s.log.info(s"module $moduleName did not exist in dist, copying over.")
        IO.copyFile(bundleFile, targetFile)
      } else {
        val sources = (unmanagedSources in Compile).value
        val cp = (managedClasspath in Compile).value
        val cd = s.cacheDirectory / "osgi-recompile-cache"
        FileFunction.cached(cd, FilesInfo.lastModified) { sources: Set[File] =>
          s.log.info(s"Recompiling $moduleName as sources or classpath have changed.")
          IO.copyFile(bundleFile, targetFile)
          sources
        } (sources.toSet ++ cp.seq.map(_.data).toSet)
      }
    },
    compileBundleAndMove <<= compileBundleAndMove.triggeredBy(compile in Compile)
  )

  def createModuleGroup(base: File,
                        name: String,
                        aggregatorSettings: Seq[Def.Setting[_]],
                        moduleSettings: Seq[Def.Setting[_]],
                        projectDeps: Array[Project] = Array()) = {
    val moduleRoot = base / name
    val modules = for (x <- moduleRoot.listFiles if x.isDirectory && x.getName != "target") yield {
      Project(
        id = name + "-%s".format(x.getName).replace(".", "-"),
        base = x,
        settings = moduleSettings ++ osgiDistUtildefaults ++ Seq(
          bundleDirectory := distdirectory.value / name
        )
      ).dependsOn(projectDeps.map(x => ClasspathDependency(x, Some("compile"))): _*)
    }
    val moduleRefs = modules.map { x =>
      x: ProjectReference
    }
    val aggregationNode = Project(
      id = name,
      base = moduleRoot,
      settings = aggregatorSettings
    ).aggregate(moduleRefs: _*)
    (aggregationNode, modules)
  }
}