I'm working on the Scala track in Exercism which means I have a lot of SBT projects in a root folder. I'd like to create a root SBT project which will automatically add new sub-projects as I download new exercises. Currently I have to add them manually, so my root build.sbt looks like this:
lazy val root = (project in file("."))
  .aggregate(
    hello_world,
    sum_of_multiples,
    robot_name)

lazy val hello_world = project in file("hello-world")
lazy val sum_of_multiples = project in file("sum-of-multiples")
lazy val robot_name = project in file("robot-name")
...but I'd like to avoid having to add every project manually. Is there a way to add new projects automatically?
Sure. It's a somewhat advanced use of sbt, but you can create an ad-hoc plugin that generates subprojects programmatically.
build.sbt
ThisBuild / scalaVersion := "2.12.8"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
project/build.properties
sbt.version=1.2.8
project/SubprojectPlugin.scala
import sbt._

object SubprojectPlugin extends AutoPlugin {
  override val requires = sbt.plugins.JvmPlugin
  override val trigger = allRequirements
  override lazy val extraProjects: Seq[Project] = {
    val dirs = (file(".") * ("*" -- "project" -- "target")) filter { _.isDirectory }
    dirs.get.toList map { dir =>
      Project(dir.getName.replaceAll("""\W""", "_"), dir)
    }
  }
}
Now if you start sbt, any directory that is not named target or project will result in a subproject.
sbt:generic-root> projects
[info] In file:/private/tmp/generic-root/
[info] * generic-root
[info] hello_world
[info] robot_name
[info] sum_of_multiple
To add further settings, you can create a build.sbt file under the exercise directory, for example:
hello-world/build.sbt
libraryDependencies += "commons-io" % "commons-io" % "2.6"
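Since each generated project id is just the directory name with non-word characters replaced by underscores, you can also run tasks against a single exercise from the root shell, for example:
sbt:generic-root> hello_world/test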
Related
I would like to be able to define a task for all projects in my build.sbt:
lazy val project1 = project.in(file("project1"))
...
lazy val project2 = ...

lazy val upload = taskKey[Unit]("upload a config file from project to server")

upload := {
  val file = baseDirectory.value / "config.json"
  ...
}
The problem is this definition works only when I call sbt upload, but I would like to be able to call it for each subproject: sbt project1/upload and sbt project2/upload.
Is there a way to do it, without using inputKey?
See Organizing the build:
For more advanced users, another way of organizing your build is to define one-off auto plugins in project/*.scala. By defining triggered plugins, auto plugins can be used as a convenient way to inject custom tasks and commands across all subprojects.
project/UploadPlugin.scala
package something

import sbt._
import Keys._

object UploadPlugin extends AutoPlugin {
  override def requires = sbt.plugins.JvmPlugin
  override def trigger = allRequirements
  object autoImport {
    val upload = taskKey[Unit]("upload a config file from project to server")
  }
  import autoImport._
  override lazy val projectSettings = Seq(
    upload := {
      val n = name.value
      println(s"uploading $n..")
    }
  )
}
Here's how you can use it:
build.sbt
ThisBuild / organization := "com.example"
ThisBuild / scalaVersion := "2.12.5"
ThisBuild / version := "0.1.0-SNAPSHOT"

lazy val root = (project in file("."))
  .aggregate(project1, project2)
  .settings(
    name := "Hello"
  )

lazy val project1 = (project in file("project1"))
lazy val project2 = (project in file("project2"))
build.sbt does not have to mention anything about UploadPlugin, since it's a triggered plugin. From the shell you can call:
sbt:Hello> project1/upload
uploading project1..
[success] Total time: 0 s, completed Jul 20, 2018
sbt:Hello> project2/upload
uploading project2..
[success] Total time: 0 s, completed Jul 20, 2018
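Because root aggregates project1 and project2, running upload at the root also runs it in both aggregated projects (a hypothetical session; the ordering of the lines may differ):
sbt:Hello> upload
uploading project1..
uploading project2..
uploading Hello..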
You can add the task as a setting of the projects you want:
lazy val uploadTask = {
  lazy val upload = taskKey[Unit]("upload a config file from project to server")
  upload := {
    val file = baseDirectory.value / "config.json"
    ...
  }
}

lazy val project1 = project.in(file("project1")).settings(uploadTask)
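With that setting added to a subproject, the task can then be invoked on just that subproject, which is what the question asks for:
$ sbt project1/upload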
I am just starting to learn sbt to build Scala projects. Here is my build.sbt file:
lazy val commonSettings = Seq(
  organization := "com.example",
  version := "0.1.0",
  scalaVersion := "2.11.7"
)

lazy val task = taskKey[Unit]("An example task")

lazy val root = project.in(file(".")).
  aggregate(core).
  settings(commonSettings: _*).
  settings(
    task := { println("Hello!") },
    name := "hello",
    version := "1.0"
  )

lazy val core = project.in(file("SbtScalaProjectFoo"))
My project structure is as follows:
SbtScalaProject
|-- SbtScalaProjectFoo
|   |-- build.sbt
|-- build.sbt
When I try to run "sbt" inside SbtScalaProject, I get the following:
No project 'core' in 'file:/Users/asattar/Dev/work/SbtScalaProject/'
What am I missing?
Having multiple build.sbt files has caused problems for me, too.
I would recommend aggregating all project definitions into one build.sbt. If you want to modularize the build, consider moving parts into .scala files in the root project's project/ directory.
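As a minimal sketch of that approach (the file and object names below are just placeholders, not anything sbt requires), shared settings can live in a Scala object under project/ and be referenced from the single root build.sbt:
project/CommonSettings.scala
import sbt._
import Keys._

object CommonSettings {
  // shared across all subprojects
  val settings = Seq(
    organization := "com.example",
    version := "0.1.0",
    scalaVersion := "2.11.7"
  )
}
build.sbt
lazy val root = project.in(file("."))
  .aggregate(core)
  .settings(CommonSettings.settings: _*)

lazy val core = project.in(file("SbtScalaProjectFoo"))
  .settings(CommonSettings.settings: _*)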
I have an sbt (0.13.1) project with a bunch of subprojects. I am generating eclipse project configurations using sbteclipse. My projects only have scala source files, so I want to remove the generated src/java folders.
I can achieve that by (redundantly) adding the following to the build.sbt of each subproject:
unmanagedSourceDirectories in Compile := (scalaSource in Compile).value :: Nil
unmanagedSourceDirectories in Test := (scalaSource in Test).value :: Nil
I tried just adding the above configuration to the root build.sbt but the eclipse command still generated the java source folders.
Is there any way to specify a configuration like this once (in the root build.sbt) and have it flow down to each subproject?
You could define the settings once, not tied to any particular project, and then reuse them in each subproject:
val onlyScalaSources = Seq(
  unmanagedSourceDirectories in Compile := Seq((scalaSource in Compile).value),
  unmanagedSourceDirectories in Test := Seq((scalaSource in Test).value)
)

val project1 = project.in(file("project1"))
  .settings(onlyScalaSources: _*)

val project2 = project.in(file("project2"))
  .settings(onlyScalaSources: _*)
You could also create a simple plugin (untested code):
import sbt._
import Keys._

object OnlyScalaSources extends AutoPlugin {
  override def trigger = allRequirements
  override lazy val projectSettings = Seq(
    unmanagedSourceDirectories in Compile := Seq((scalaSource in Compile).value),
    unmanagedSourceDirectories in Test := Seq((scalaSource in Test).value)
  )
}
More details about creating plugins can be found in the plugins documentation.
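To confirm the settings actually apply to a given subproject, you can inspect the value from the sbt shell (sbt 0.13 syntax; project1 is just the example project name from above):
> show project1/compile:unmanagedSourceDirectories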
This is my project/Build.scala:
package sutils
import sbt._
import Keys._
object SutilsBuild extends Build {
  scalaVersion in ThisBuild := "2.10.0"

  val scalazVersion = "7.0.6"

  lazy val sutils = Project(
    id = "sutils",
    base = file(".")
  ).settings(
    test := {},
    publish := {}, // skip publishing for this root project.
    publishLocal := {}
  ).aggregate(
    core
  )

  lazy val core = Project(
    id = "sutils-core",
    base = file("sutils-core")
  ).settings(
    libraryDependencies += "org.scalaz" % "scalaz-core_2.10" % scalazVersion
  )
}
This seems to be compiling my project just fine, but when I go into the console, I can't import any of the code that just got compiled?!
$ sbt console
scala> import com.github.dcapwell.sutils.validate.Validation._
<console>:7: error: object github is not a member of package com
import com.github.dcapwell.sutils.validate.Validation._
What am I doing wrong here? Looking at the usage, I don't see a way to specify which subproject to load while in the console:
$ sbt about
[info] Loading project definition from /src/sutils/project
[info] Set current project to sutils (in build file:/src/sutils/)
[info] This is sbt 0.13.1
[info] The current project is {file:/src/sutils/}sutils 0.1-SNAPSHOT
[info] The current project is built against Scala 2.10.3
[info] Available Plugins: org.sbtidea.SbtIdeaPlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.3
There's the solution from @Alexey-Romanov: start the console task in the project that contains the classes you want to import.
sbt sutils/console
There is, however, another solution that makes the root sutils project depend on the core project. Use the following snippet to set up the project; note the dependsOn(core), which brings the classes from the core project into sutils's namespace.
lazy val sutils = Project(
  id = "sutils",
  base = file(".")
).settings(
  test := {},
  publish := {}, // skip publishing for this root project.
  publishLocal := {}
).aggregate(
  core
).dependsOn(core)
BTW, you should really use a simpler build.sbt for your use case as follows:
scalaVersion in ThisBuild := "2.10.0"

val scalazVersion = "7.0.6"

lazy val sutils = project.in(file(".")).settings(
  test := {},
  publish := {}, // skip publishing for this root project.
  publishLocal := {}
).aggregate(core).dependsOn(core)

lazy val core = Project(
  id = "sutils-core",
  base = file("sutils-core")
).settings(
  libraryDependencies += "org.scalaz" %% "scalaz-core" % scalazVersion
)
You could make it even simpler by splitting the build into two build.sbt files, one per project.
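A rough sketch of that split (untested; versions taken from your build): the root build.sbt keeps only the wiring, and sutils-core/build.sbt carries that module's own dependencies.
build.sbt
scalaVersion in ThisBuild := "2.10.0"

lazy val sutils = project.in(file("."))
  .settings(
    test := {},
    publish := {}, // skip publishing for this root project.
    publishLocal := {}
  )
  .aggregate(core)
  .dependsOn(core)

lazy val core = project.in(file("sutils-core"))
sutils-core/build.sbt
libraryDependencies += "org.scalaz" %% "scalaz-core" % "7.0.6"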
For my multi-project build, I'm trying to create a verify task that just results in scct:test and then scalastyle being executed in order. I would like scct:test to execute for all the subprojects, but not the top-level project. (If it executes for the top-level project, I get "timed out waiting for coverage report" from scct, since there's no source and no tests in that project.) What I had thought to do was to create verify as a task with dependencies on scct:test and scalastyle. This has turned out to be fairly baroque. Here is my Build.scala from my top-level project/ directory:
object MyBuild extends Build {
  val verifyTask = TaskKey[Unit]("verify", "Compiles, runs tests via scct:test and then runs scalastyle")
  val scctTestTask = (test in ScctPlugin.Scct).scopedKey
  val scalastyleTask = PluginKeys.scalastyleTarget.scopedKey

  lazy val root = Project("rootProject",
    file("."),
    settings = Defaults.defaultSettings ++
      ScalastylePlugin.Settings ++
      ScctPlugin.instrumentSettings ++
      ScctPlugin.mergeReportSettings ++
      Seq(
        verifyTask in Global := {},
        verifyTask <<= verifyTask.dependsOn(scctTestTask, scalastyleTask)
      )
  ) aggregate(subproject_1, subproject_2)

  lazy val subproject_1 = Project(id = "subproject_1", base = file("subproject_1"))
  lazy val subproject_2 = Project(id = "subproject_2", base = file("subproject_2"))
}
However, the verify task only seems to exist for the root project; when I run it I don't see the same task being run in the subprojects. This is exactly the opposite of what I want; I'd like to issue sbt verify and have scct:test and scalastyle run in each of the subprojects but not in the top-level project. How might I go about doing that?
solution 1: define verifyTask in subprojects
The first thing to note is that if you want some task (verify, test, etc.) to run in certain projects, you need to define it scoped to those subprojects. So in your case, the most straightforward thing to do is to define verifyTask in subproject_1 and subproject_2.
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.4"
lazy val verify = taskKey[Unit]("verify")

def verifySettings = Seq(
  skip in verify := false,
  verify := (Def.taskDyn {
    val sk = (skip in verify).value
    if (sk) Def.task { println("skipping verify...") }
    else (test in Test)
  }).value
)

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    verifySettings,
    scalaVersion in ThisBuild := "2.12.4",
    skip in verify := true
  )

lazy val sub1 = (project in file("sub1"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )

lazy val sub2 = (project in file("sub2"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )
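With these definitions, invoking verify at the root aggregates down into sub1 and sub2, while the root itself only prints the skip message. A hypothetical session (test output from the subprojects elided):
sbt:root> verify
skipping verify...
[success] Total time: ...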
solution 2: ScopeFilter
There was a recent Reddit thread that mentioned this question, so I'll post what I've done there.
If you want to manually aggregate on some subprojects, there's also a technique called ScopeFilter.
Note that I am using sbt 1.x here, but it should work with sbt 0.13 with some minor changes.
lazy val packageAll = taskKey[Unit]("package all the projects")
lazy val myTask = inputKey[Unit]("foo")

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    scalaVersion in ThisBuild := "2.12.4",
    packageAll := {
      (packageBin in Compile).all(nonRootsFilter).value
      ()
    },
    myTask := {
      packageAll.value
    }
  )

lazy val sub1 = (project in file("sub1"))
lazy val sub2 = (project in file("sub2"))

def nonRootsFilter = {
  import sbt.internal.inc.ReflectUtilities
  def nonRoots: List[ProjectReference] =
    allProjects filter {
      case LocalProject(p) => p != "root"
      case _ => false
    }
  def allProjects: List[ProjectReference] =
    ReflectUtilities.allVals[Project](this).values.toList map { p =>
      p: ProjectReference
    }
  ScopeFilter(inProjects(nonRoots: _*), inAnyConfiguration)
}
In the above, myTask depends on packageAll, which aggregates (packageBin in Compile) for all non-root subprojects.
sbt:root> myTask
[info] Packaging /Users/xxx/packageall/sub1/target/scala-2.12/sub1_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /Users/xxx/packageall/sub2/target/scala-2.12/sub2_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 0 s, completed Feb 2, 2018 7:23:23 PM
I may be wrong, but you are defining the verify task dependency only for the current project.
Maybe you can try:
Seq(
  verifyTask in Global := {},
  verifyTask <<= (verifyTask in Global).dependsOn(scctTestTask, scalastyleTask)
)
Or you can add the verifyTask settings to all your modules.
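For that last option, a rough sketch (untested; it assumes the scct and scalastyle settings are also applied to each subproject so the underlying tasks exist there):
lazy val verifySettings = Seq(
  verifyTask := {},
  verifyTask <<= verifyTask.dependsOn(scctTestTask, scalastyleTask)
)

lazy val subproject_1 = Project(id = "subproject_1", base = file("subproject_1"))
  .settings(ScalastylePlugin.Settings ++ ScctPlugin.instrumentSettings ++ verifySettings: _*)
lazy val subproject_2 = Project(id = "subproject_2", base = file("subproject_2"))
  .settings(ScalastylePlugin.Settings ++ ScctPlugin.instrumentSettings ++ verifySettings: _*)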