I've been searching for a while, with little success, to find out whether this is possible.
Using SBT, can you create sub-projects programmatically, without explicitly assigning each project to its own val?
My current project structure looks something like this:
root/
  common/          <--- another sub-project that the others depend on
  project/
    build.scala
  src/main/scala
  apps/            <--- sub-projects live here
    Sub1/
    Sub2/
Sub1 and Sub2 are both their own SBT projects.
My first attempt to link these projects together looked like this:
// root/project/build.scala
import sbt._
import Keys._

object build extends Build {
  lazy val common = project /* Pseudo-code */

  val names = List("Sub1", "Sub2")

  lazy val deps = names map { name =>
    Project(id = name, base = file(s"apps/$name")).dependsOn(common)
  }

  lazy val finalDeps = common :: deps

  lazy val root = project.in(file("."))
    .aggregate(finalDeps.map(sbt.Project.projectToRef): _*)
    .dependsOn(finalDeps.map(ClasspathDependency(_, None)): _*)
}
However, because SBT uses reflection to discover its projects and sub-projects, this doesn't work.
It only works if each sub-project is declared explicitly:
lazy val Sub1 = project.in(file("apps/Sub1"))
So the question:
Is there a way to programmatically build sub-project dependencies in SBT?
SBT allows for making a build definition for the build itself:
http://www.scala-sbt.org/release/docs/Getting-Started/Full-Def.html
You can try creating a project/project/build.scala file that contains a source generator, something like this:
// project/project/build.scala
sourceGenerators in Compile <+= sourceManaged in Compile map { out =>
  Generator.generate(out / "generated")
}
EDIT: You should implement the Generator object yourself (a sketch follows the example of the generated file below).
This source generator will in turn scan the topmost apps folder and create a source for an object that contains all the subprojects.
// project/subprojects.scala
// This is autogenerated by the source generator
object Subprojects {
  lazy val Sub1 = project.in(file("apps/Sub1"))
  lazy val Sub2 = project.in(file("apps/Sub2"))
  lazy val all = Seq(Sub1, Sub2)
}
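For illustration, here is a minimal, untested sketch of such a Generator (the path resolution is an assumption; adjust it to where your apps folder actually sits relative to where sbt is launched):

// project/project/Generator.scala -- a sketch, not a definitive implementation
import sbt._

object Generator {
  // Scans the apps/ folder for sub-project directories and writes a
  // Subprojects.scala source; returns the generated file so it can be
  // wired into sourceGenerators.
  def generate(outDir: File): Seq[File] = {
    // Assumption: sbt is launched from the root directory, so the
    // relative path "apps" resolves against it.
    val names = (file("apps") * DirectoryFilter).get.map(_.getName).sorted
    val defs = names
      .map(n => s"""  lazy val $n = project.in(file("apps/$n"))""")
      .mkString("\n")
    val source =
      s"""import sbt._
         |
         |object Subprojects {
         |$defs
         |  lazy val all = Seq(${names.mkString(", ")})
         |}""".stripMargin
    val outFile = outDir / "Subprojects.scala"
    IO.write(outFile, source)
    Seq(outFile)
  }
}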
Now in your main build.scala just write:
// project/build.scala
lazy val root = project.in(file("."))
  .aggregate(Subprojects.all.map(sbt.Project.projectToRef): _*)
  .dependsOn(Subprojects.all.map(ClasspathDependency(_, None)): _*)
I didn't run all this through a compiler, so some errors are possible, but the principle should work.
EDIT: I created a repo on GitHub where I implemented the solution. Go there to see how it is done.
https://github.com/darkocerdic/sbt-auto-subprojects
Related
My initial setup had two separate projects (sbt 1.2.6):
a web app (huge codebase, lots of dependencies, slow compile)
a command-line app (basically one file with 3-4 separate dependencies)
Then a feature request came in: we should show the "valid" values in the command-line app. The valid values live in an enum in the web app. So I fired up the sbt documentation and came up with an idea that looked like this:
//main webapp
lazy val core = project
  .in(file("."))
  .withId("core") // I tried this just in case; it didn't help...
  //... here come all the plugins and deps

//my hack to get a single-file compile
lazy val `feature-signer-helper` = project
  .in(file("."))
  .withId("feature-signer-helper")
  .settings(
    target := { baseDirectory.value / "target" / "features" },
    sources in Compile := {
      ((scalaSource in Compile).value ** "Features.scala").get
    }
  )

//the command line app
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .dependsOn(`feature-signer-helper`)
  .settings(
    libraryDependencies ++= signerDeps
  )
The problem seems to be that whichever lazy val xxx = project.in(file(y)) comes last becomes the only project associated with the y dir.
Also, I don't want to move that one file into a separate directory structure... And logically the command-line app and the web app don't depend on each other; they have different dependencies (and really different build times).
My questions are:
Is there any quick win in this situation? (Worst case, I'll copy the file...)
Why do we have such rich project and source settings if I can't bind two projects to the same dir?
EDIT:
The code below copies the needed file (if you have the same dir structure). I'm not super happy with it, but it works. I'm still interested in other approaches.
import sbt._
import Keys._

object FeaturesCopyTask {
  val featuresCopyTask = {
    sourceGenerators in Compile += Def.task {
      val outFile = (sourceManaged in Compile).value / "Features.scala"
      val rootDirSrc = (Compile / baseDirectory).value / ".." / "src"
      val inFile: File = (rootDirSrc ** "Features.scala").get().head
      IO.copyFile(inFile, outFile, preserveLastModified = true)
      Seq(outFile)
    }.taskValue
  }
}
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .settings(
    libraryDependencies ++= signerDeps,
    FeaturesCopyTask.featuresCopyTask
  )
I would have the tree be something more like
+- core/
+- webapp/
+- cli/
core is the small amount of code (mostly model-type things) that webapp and cli have in common
webapp depends on core (among many other things)
cli depends on core (and not much else)
So the build.sbt would be something like
lazy val core = (project in file("core"))
  // yadda yadda yadda

lazy val webapp = (project in file("webapp"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val cli = (project in file("cli"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val root = (project in file("."))
  .aggregate(
    core,
    webapp,
    cli
  )
I have the following project definition (simplified):
object B extends Build {
  lazy val root = (project in file("."))
    .aggregate(commons, processor)

  lazy val commons = (project in file("commons"))

  lazy val processor = (project in file("processor"))
    .enablePlugins(BuildInfoPlugin, BuildTag)
}
and the BuildTag plugin (also simplified to the issue at hand):
object BuildTag extends AutoPlugin {
  override def requires = BuildInfoPlugin

  override lazy val buildSettings = Seq(
    packageOptions in (Compile, packageBin) += {
      Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
    }
  )
}
When I load the project, I get an error like:
Reference to undefined setting:
{.}/compile:buildInfoPackage from {.}/compile:packageBin::packageOptions
It looks like sbt is trying to reference the setting outside of the scope where the plugin is using it. Why might that be and how can I fix it?
The problem here was not the multi-module nature of the build; it is reproducible in a single-module project as well.
However, instead of
override lazy val buildSettings = ...
you need to use projectSettings to make the buildInfoPackage setting resolvable: buildSettings land in the build-wide scope, where the project-scoped buildInfoPackage that BuildInfoPlugin defines does not exist, while projectSettings are added to each project that enables the plugin.
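For illustration, a sketch of the fixed plugin (the imports assume sbt-buildinfo's standard package layout):

import sbt._
import sbt.Keys._
import sbtbuildinfo.BuildInfoPlugin
import sbtbuildinfo.BuildInfoPlugin.autoImport._

object BuildTag extends AutoPlugin {
  override def requires = BuildInfoPlugin

  // projectSettings are applied per enabling project, where the
  // project-scoped buildInfoPackage setting actually exists.
  override lazy val projectSettings = Seq(
    packageOptions in (Compile, packageBin) += {
      Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
    }
  )
}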
I'm writing an SBT task that outputs dependency information, grouped by project (say an SBT build contains multiple projects).
I know there is the sbt-dependency-graph plugin, but I can't use it directly, because I want to generate a JSON file, and that plugin just prints the dependency tree to the console without returning a data object, so I can't easily get the data I want.
I found that the update task returns an UpdateReport containing much of the information I want, but it only covers the current project. On the command line, if I want this information for all projects, I can list them with the projects command and then view them one by one with someproject/update.
But how do I do the same in an SBT task? I tried:
val reports = projects.toList.map(prj => (update in prj).value)
It reports:
[error] /Users/me/workspace/sbt-test/project/Build.scala:51: Illegal dynamic reference: prj
[error] val reports = projects.toList.map(prj => (update in prj).value)
[error] ^
[error] one error found
How can I fix it?
More code:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val allUpdate = taskKey[Unit]("show update reports of all projects")

  lazy val core = project
  lazy val web = project

  lazy val allUpdateDef = allUpdate := {
    val reports = projects.toList.map(prj => (update in prj).value)
    println(reports)
  }

  lazy val root = (project in file("."))
    .settings(
      allUpdateDef
    )
}
After checking the documentation (http://www.scala-sbt.org/0.13/docs/Tasks.html), I found the solution:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val groupByProject: Def.Initialize[Task[(String, UpdateReport)]] =
    Def.task {
      (thisProject.value.id, (update in thisProject).value)
    }

  lazy val filter = ScopeFilter(inAnyProject, inAnyConfiguration)

  updateByProject := {
    val subProjects = groupByProject.all(filter).value.map { case (projectName, updateReport) =>
      ...
    }
  }
}
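For reference, here is a more complete, untested sketch of how the pieces wire together (the task key name and the printed summary are assumptions):

import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val updateByProject = taskKey[Unit]("show update reports of all projects")

  // Evaluated once per project selected by the ScopeFilter;
  // thisProject identifies the project each instance runs in.
  lazy val groupByProject: Def.Initialize[Task[(String, UpdateReport)]] =
    Def.task {
      (thisProject.value.id, update.value)
    }

  lazy val filter = ScopeFilter(inAnyProject)

  lazy val core = project
  lazy val web = project

  lazy val root = (project in file("."))
    .settings(
      updateByProject := {
        groupByProject.all(filter).value.foreach { case (name, report) =>
          println(s"$name: ${report.allModules.size} resolved modules")
        }
      }
    )
}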
I am creating a multi-module sbt project with the following structure:
<root>
  ----build.sbt
  ----project
        ----Build.scala
        ----plugins.sbt
  ----common
  ----LoggingModule
LoggingModule is a Play Framework project, while common is a simple Scala project.
In plugins.sbt:
resolvers += "Typesafe repo" at "http://repo.typesafe.com/typesafe/releases/"
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.3")
While I have this in build.sbt, all works fine and it recognises PlayScala:
name := "Multi-Build"

lazy val root = project.in(file(".")).aggregate(common, LoggingModule).dependsOn(common, LoggingModule)

lazy val common = project in file("common")

lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)
However, as soon as I put this in project/Build.scala instead of build.sbt, as follows:
object RootBuild extends Build {
  lazy val root = project.in(file("."))
    .aggregate(common, LoggingModule)
    .dependsOn(common, LoggingModule)

  lazy val common = project in file("common")

  lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)

  ...//other settings
}
it fails with the error:
not found: value PlayScala
lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)
^
How can I solve this issue?
It's just a missing import.
In .sbt files, some things are automatically imported by default: the contents of objects extending Plugin, and (in sbt >= 0.13.5) the autoImport fields of AutoPlugins. This is the case for PlayScala.
In a Build.scala file, normal Scala import rules apply, so you have to import things a bit more explicitly. In this case, you need to import play.PlayScala (or use .enablePlugins(play.PlayScala) directly).
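For illustration, the corrected Build.scala would look something like this (Play 2.3.x exposes the plugin object as play.PlayScala; later versions moved it to play.sbt.PlayScala):

import sbt._
import sbt.Keys._
import play.PlayScala // explicit import required in .scala build files

object RootBuild extends Build {
  lazy val root = project.in(file("."))
    .aggregate(common, LoggingModule)
    .dependsOn(common, LoggingModule)

  lazy val common = project in file("common")

  lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)
}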
I have a project/build.scala file that defines a root project and a bunch of sub-projects:
lazy val root = Project(
  id = "root",
  base = file(".")).aggregate(subA, subB).enablePlugins(MyPlugin)

lazy val subA = Project(
  id = "subA",
  base = file("a"))

lazy val subB = Project(
  id = "subB",
  base = file("b"))
How do I make MyPlugin available in subA and subB without specifying it on each of them? I just want them to inherit the plugins from the root project.
Someone on IRC suggested overriding projects in my Build object in build.scala:
override def projects = super.projects map { _.enablePlugins(...) }
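For illustration, a sketch of that suggestion applied to the build above (MyPlugin stands in for whichever plugin you want enabled everywhere):

import sbt._

object MyBuild extends Build {
  lazy val root = Project(
    id = "root",
    base = file(".")).aggregate(subA, subB)

  lazy val subA = Project(
    id = "subA",
    base = file("a"))

  lazy val subB = Project(
    id = "subB",
    base = file("b"))

  // super.projects reflectively collects the Project vals above, so
  // mapping over it enables the plugin on every project uniformly.
  override def projects = super.projects.map(_.enablePlugins(MyPlugin))
}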