I have an SBT project that aggregates over multiple projects like this:
object ClientCore extends Build {
  /**
   * This is the root project
   */
  lazy val rootProj = Project(id = "clientcore", base = file(".")) aggregate(
    utilsProj,
    commonUiProj,
    spatialMathProj,
    sessionManagerProj,
    lobbyProj
  )

  /**
   * This is a utils library
   */
  lazy val utilsProj = Project(id = "utils", base = file("Utils"))

  /**
   * A shared library for UI elements
   */
  lazy val commonUiProj = Project(id = "commonui", base = file("CommonUI"))

  /**
   * This is a spatial math library
   */
  lazy val spatialMathProj = Project(id = "spatialmath", base = file("SpatialMath"))

  lazy val sessionManagerProj = Project(id = "sessionmanager", base = file("sessionManager"),
    settings = buildSettings ++ assemblySettings) settings(
      outputPath in assembly := new File(s"$outDir\\SessionManagerClient.jar"),
      jarName in assembly := "SessionManagerClient.jar",
      test in assembly := {}
    ) dependsOn(utilsProj)

  lazy val lobbyProj = Project(id = "lobby", base = file("Lobby"),
    settings = buildSettings ++ assemblySettings) settings(
      outputPath in assembly := new File(s"$outDir\\Lobby.jar"),
      jarName in assembly := "Lobby.jar",
      test in assembly := {}
    ) dependsOn(utilsProj)
}
For some reason, some of the projects end up with a deep nesting of 'project' folders. For example, Utils might look like 'Utils/project/project/project/project/...'.
I'm using IntelliJ's SBT plugin to sync the presentation, but I'm managing the project with SBT itself. I'm not certain whether this is an SBT issue or an IntelliJ one.
Thanks for any help you can provide.
Kurt
This is an IntelliJ issue (among many others related to the SBT plugin...)
I think you may have refreshed your config somewhere before defining a module and adding that module to the root project aggregates, which has a tendency to make a mess in IntelliJ.
This can be fixed in IntelliJ:
detach your project from IntelliJ
restart IntelliJ
reimport your project
Related
My initial setup had two separate projects (sbt 1.2.6):
a web app (huge codebase, lot of dependencies, slow compile)
a command-line app (basically one file with 3-4 separate dependencies)
Then a feature request came in: we should show the "valid" values in the command-line app. The valid values live in an enum in the web app. So I fired up the sbt documentation and came up with an idea that looked like this:
//main webapp
lazy val core = project
  .in(file("."))
  .withId("core") //I tried this just in case; it didn't help...
  //... here come all the plugins and deps

//my hack to get a single-file compile
lazy val `feature-signer-helper` = project
  .in(file("."))
  .withId("feature-signer-helper")
  .settings(
    target := { baseDirectory.value / "target" / "features" },
    sources in Compile := {
      ((scalaSource in Compile).value ** "Features.scala").get
    }
  )

//the command line app
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .dependsOn(`feature-signer-helper`)
  .settings(
    libraryDependencies ++= signerDeps
  )
The problem is that whichever lazy val xxx = project.in(file(y)) is defined last ends up being the only project bound to the y directory.
Also, I don't want to move that one file into a separate directory structure... And logically the command-line app and the web app don't depend on each other; they have different dependencies (and really different build times).
My questions are:
is there any quick win in this situation? (Worst case, I'll just copy the file...)
why do we have such rich project and source settings if I can't bind two projects to the same directory?
EDIT:
The code below copies the needed file (if you have the same directory structure). I'm not super happy with it, but it works. I'm still interested in other approaches.
import sbt._
import Keys._

object FeaturesCopyTask {
  val featuresCopyTask = {
    sourceGenerators in Compile += Def.task {
      val outFile = (sourceManaged in Compile).value / "Features.scala"
      val rootDirSrc = (Compile / baseDirectory).value / ".." / "src"
      val inFile: File = (rootDirSrc ** "Features.scala").get().head
      IO.copyFile(inFile, outFile, preserveLastModified = true)
      Seq(outFile)
    }.taskValue
  }
}

lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .settings(
    libraryDependencies ++= signerDeps,
    FeaturesCopyTask.featuresCopyTask
  )
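An alternative sketch that avoids the copy altogether, assuming the same ../src layout as above (I haven't verified it beyond that assumption): append the shared file directly to the CLI project's unmanaged sources, so it is compiled straight from the web app's source tree.

lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .settings(
    libraryDependencies ++= signerDeps,
    // compile Features.scala straight from the web app's tree;
    // ".." / "src" is the same path assumption as in FeaturesCopyTask
    unmanagedSources in Compile ++= {
      val rootDirSrc = (Compile / baseDirectory).value / ".." / "src"
      (rootDirSrc ** "Features.scala").get
    }
  )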
I would have the tree be something more like
+- core/
+- webapp/
+- cli/
core is the small amount (mostly model type things) that webapp and cli have in common
webapp depends on core (among many other things)
cli depends on core (and not much else)
So the build.sbt would be something like
lazy val core = (project in file("core"))
// yadda yadda yadda
lazy val webapp = (project in file("webapp"))
.dependsOn(core)
// yadda yadda yadda
lazy val cli = (project in file("cli"))
.dependsOn(core)
// yadda yadda yadda
lazy val root = (project in file("."))
.aggregate(
core,
webapp,
cli
)
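Mapped onto the question's concrete case, a rough sketch might look like this (signerDeps comes from the question; moving Features.scala into core/src/main/scala is my assumption):

// Features.scala (the shared enum) would move into core/src/main/scala
lazy val core = (project in file("core"))
  .settings(
    name := "core"
  )

// the command-line app only needs core plus its own few dependencies
lazy val cli = (project in file("cli"))
  .dependsOn(core)
  .settings(
    name := "cli",
    libraryDependencies ++= signerDeps
  )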
I'm working on the Scala track on Exercism, which means I have a lot of SBT projects in one root folder. I'd like to create a root SBT project that automatically picks up new sub-projects as I download new exercises. Currently I have to add them manually, so my root build.sbt looks like this:
lazy val root = (project in file("."))
.aggregate(
hello_world,
sum_of_multiples,
robot_name)
lazy val hello_world = project in file("hello-world")
lazy val sum_of_multiples = project in file("sum-of-multiples")
lazy val robot_name = project in file("robot-name")
...but I'd like to avoid having to add every project manually. Is there a way to add new projects automatically?
I'd like to avoid having to add every project manually. Is there a way to add new projects automatically?
Sure. It's a somewhat advanced use of sbt, but you can create an ad-hoc plugin that generates subprojects programmatically.
build.sbt
ThisBuild / scalaVersion := "2.12.8"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
project/build.properties
sbt.version=1.2.8
project/SubprojectPlugin.scala
import sbt._

object SubprojectPlugin extends AutoPlugin {
  override val requires = sbt.plugins.JvmPlugin
  override val trigger = allRequirements

  override lazy val extraProjects: Seq[Project] = {
    val dirs = (file(".") * ("*" -- "project" -- "target")) filter { _.isDirectory }
    dirs.get.toList map { dir =>
      Project(dir.getName.replaceAll("""\W""", "_"), dir)
    }
  }
}
Now if you start up sbt, any directory that is not named target or project will result in a subproject.
sbt:generic-root> projects
[info] In file:/private/tmp/generic-root/
[info] * generic-root
[info] hello_world
[info] robot_name
[info] sum_of_multiple
To add further settings to one of the generated subprojects, you can create a build.sbt file under its directory, for example:
hello-world/build.sbt
libraryDependencies += "commons-io" % "commons-io" % "2.6"
For some reason, our project got reorganized with the main class thrown into another module.
I've specified the mainClass as below in the build.sbt but I still get a class not found error:
mainClass in Compile := Some("com.so.questions.sbt.Main")
However, this is bound to fail, since sbt will look for the Main class under the top-level src folder, while the module actually lives outside of (as a sibling of) src:
MyScalaProject
+-MyModule
|+-src
| +-com.so.questions.sbt
| +-Main
|+-build.sbt <-- build.sbt specific to this module, currently blank
+-src
| +-<other folders>
+-build.sbt <-- build.sbt currently housing all config
How can I change the project scope in build.sbt to find and correctly load the main class?
That is, is it possible to do sbt run at the top level and have the main class be found with this structure?
It should work.
The FQCN you give to mainClass should be location-independent, to my understanding.
The real question that comes to mind is how you are loading your sub-module.
Here are some sbt definitions that should help point you in the right direction (replace the <> tags with your own project IDs):
// Define a submodule ref to be able to include it as a dependency
lazy val subModuleRef = ProjectRef(file("MyModule"), <MyModule SBT NAME>)

// Define a submodule project to be able to orchestrate its build
lazy val subModule = Project(
  id = <MyModule SBT NAME>,
  base = file("MyModule")
).addSbtFiles(file("build.sbt"))

// Define the top-level project, depending on subModuleRef for code
// inclusion and aggregating subModule for build orchestration
lazy val scalaProject = Project(
  id = <MyScalaProject NAME>,
  base = file("."),
  aggregate = Seq(subModule),
  settings = commonSettings
).dependsOn(subModuleRef)
Let's say that you have the MyModule module/folder containing the main class and some other module called MyCoreModule (just to illustrate the whole build.sbt):
// any stuff that you want to share between modules
lazy val commonSettings = Seq(
  scalaVersion := "2.12.8",
  version := "1.0-SNAPSHOT"
)

lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(
    name := "parent-module"
  )
  .aggregate(core, app)
  .dependsOn(app) // <-- here is the config that will allow you to run "sbt run" from the root project

lazy val core = project.in(file("MyCoreModule"))
  .settings(commonSettings: _*)
  .settings(
    name := "core"
  )

lazy val app = project.in(file("MyModule"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(
    name := "app"
  )

// define your mainClass from the "app" module
mainClass in Compile := (mainClass in Compile in app).value
Btw, sbt.version=1.2.7
I have a multi-module sbt project. When I change some source code in one module, the other modules don't see the change in IntelliJ.
When I try to go to a declaration, instead of navigating to the source it navigates to the compiled jar file.
It works fine when I remove the jar from the library dependencies in the project settings; I think that's because everything is recompiled, so it works until the next change. Compiling with sbt works fine too, but I suspect the problem is in the Build.scala settings, since the project dependencies can have ordering issues. Here are the dependencies:
lazy val root = Project(id = "xx-main", base = file("."), settings = commonSettings)
.aggregate(utils, models, commons, dao, te)
.dependsOn(utils, models, commons, dao)
lazy val utils = Project(id = "xx-utils", base = file("xx-utils"))
.settings(commonSettings: _*)
lazy val commons = Project(id = "xx-commons", base = file("xx-commons"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val models =
Project(id = "xx-models", base = file("xx-models"), settings = commonSettings)
.dependsOn(utils)
lazy val dao = Project(id = "xx-dao", base = file("xx-dao"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val te = Project(id = "xx-te", base = file("xx-te"))
.settings(commonSettings: _*)
.dependsOn(utils, models, dao, commons)
I have the following project definition (simplified):
object B extends Build {
  lazy val root = (project in file("."))
    .aggregate(commons, processor)

  lazy val commons = (project in file("commons"))

  lazy val processor = (project in file("processor"))
    .enablePlugins(BuildInfoPlugin, BuildTag)
}
and the BuildTag plugin (also simplified to the issue at hand):
object BuildTag extends AutoPlugin {
  override def requires = BuildInfoPlugin

  override lazy val buildSettings = Seq(
    packageOptions in (Compile, packageBin) += {
      Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
    }
  )
}
When I load the project, I get an error like:
Reference to undefined setting:
{.}/compile:buildInfoPackage from {.}/compile:packageBin::packageOptions
It looks like sbt is trying to reference the setting outside of the scope where the plugin is using it. Why might that be and how can I fix it?
The problem here is not the multi-module nature of the build; it is reproducible in a single-module project as well.
However, instead of
override lazy val buildSettings = ...
you need to use projectSettings for the buildInfoPackage setting to resolve: BuildInfoPlugin defines buildInfoPackage in its per-project settings, so a buildSettings-scoped (build-level) reference cannot see it, which is exactly what the {.} (build) scope in the error message shows.
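A minimal sketch of the corrected plugin, with the same settings body simply moved into projectSettings:

object BuildTag extends AutoPlugin {
  override def requires = BuildInfoPlugin

  // projectSettings are added to every project that enables the plugin,
  // which is the scope where BuildInfoPlugin defines buildInfoPackage,
  // so the reference now resolves
  override lazy val projectSettings = Seq(
    packageOptions in (Compile, packageBin) += {
      Package.ManifestAttributes(("buildinfo.package", (buildInfoPackage in Compile).value))
    }
  )
}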