sbt-assembly does not pick up configuration-specific settings

I am updating an old 0.7.x build file for sbt, the build tool that has thankfully dropped the "simple" from its name in the meantime.
Something that once worked no longer does: I had separate config entries for platform-specific assembly tasks. These include specific filters, which are now called assemblyExcludedJars instead of excludedJars, and specific jar names, which are now set with assemblyJarName instead of jarName.
Basically:
val Foo = config("foo") extend Compile
lazy val assemblyFoo = TaskKey[File]("assembly-foo")
lazy val root = Project(id = "root", base = file("."))
  // .configs(Foo) // needed? doesn't change anything
  .settings(
    inConfig(Foo)(inTask(assembly) {
      assemblyJarName := "wtf.jar"
    }),
    scalaVersion := "2.11.7",
    assemblyFoo <<= assembly in Foo
  )
Now I would expect that running sbt assembly-foo or sbt foo:assembly would produce a file wtf.jar, but I am getting the default root-assembly-0.1-snapshot.jar. The same problem happens when I try to specify assemblyExcludedJars: they are simply ignored, and the jars are still included.
If I remove the inConfig it works:
lazy val root = Project(id = "root", base = file("."))
  .settings(
    inTask(assembly) {
      assemblyJarName := "wtf.jar"
    },
    scalaVersion := "2.11.7",
    assemblyFoo <<= assembly in Foo
  )
But now I cannot use different jar names for different configurations (which is the whole point).
As described in a blog post by one of sbt's authors, who is also the author of sbt-assembly, this should work. The same approach is given in this Stack Overflow question. But that example requires an antique version of sbt-assembly (0.9.0 from 2013, before auto plugins etc.) and doesn't seem to apply to current versions.

If you define a new configuration, you apparently have to redefine all the tasks you are going to use in it. For sbt-assembly, this means adding baseAssemblySettings to the configuration:
val Foo = config("foo") extend Compile
lazy val assemblyFoo = TaskKey[File]("assembly-foo")
lazy val root = Project(id = "root", base = file("."))
  .settings(
    inConfig(Foo)(baseAssemblySettings /* !!! */ ++ inTask(assembly) {
      assemblyJarName := "wtf.jar"
    }),
    scalaVersion := "2.11.7",
    assemblyFoo := (assembly in Foo).value
  )
Tested with sbt 0.13.9 and sbt-assembly 0.14.1.
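
The same inConfig block should be the place for the per-configuration assemblyExcludedJars mentioned in the question. A minimal sketch extending the snippet above, assuming the jar-name filter ("some-unwanted-lib") is a hypothetical placeholder:

inConfig(Foo)(baseAssemblySettings ++ inTask(assembly)(Seq(
  assemblyJarName := "wtf.jar",
  // exclude jars only for the Foo assembly; the name filter is a placeholder
  assemblyExcludedJars := {
    val cp = fullClasspath.value // rescoped here to the Foo/assembly classpath
    cp.filter(_.data.getName.startsWith("some-unwanted-lib"))
  }
)))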

Related

build.sbt - iteration over sub projects for common settings in monorepo

I'm implementing a monorepo using SBT. I would like to iterate over my subprojects in order to initialize them (as they have the same configuration) and prevent code duplication.
In my build.sbt:
lazy val root = (project in file("."))
  .aggregate(projects: _*)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )

lazy val projects = Seq("projectA", "projectB", "projectC")
  .map((projectName: String) => (project in file(projectName))
    .settings(
      name := projectName,
      commonSettings,
      libraryDependencies ++= ModulesDependencies.get(projectName))
    .project
  )
I'm getting the error:
error: project must be directly assigned to a val, such as `val x = project`. Alternatively, you can use `sbt.Project.apply`
Based on the error message, I also tried sbt.Project.apply(projectName, file(projectName)).settings(...) instead, but I run into other errors there as well.
From what I understand, SBT expects me to declare each project as lazy val projectA = (project in file("projectA")).settings(...), which works fine, but then I would have to duplicate this code for every subproject.
Is this iteration that I try to implement even possible?
A utility method might help with some of the duplication, for example:
def createProject(projectName: String) = {
  Project(projectName, file(projectName))
    .settings(
      name := projectName,
      commonSettings,
      libraryDependencies ++= ModulesDependencies.get(projectName)
    )
}
lazy val projectA = createProject("projectA")
lazy val projectB = createProject("projectB")
lazy val projectC = createProject("projectC")
lazy val root = (project in file("."))
  .aggregate(projectA, projectB, projectC)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )
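
If you also want to avoid repeating the list in aggregate, a possible variant of the root definition above (a sketch; allProjects is just a name chosen here) keeps the project values in a single Seq:

// Variant of the root definition above: keep one list of the sub-projects.
// The lazy vals themselves still have to be declared explicitly.
lazy val allProjects: Seq[ProjectReference] = Seq(projectA, projectB, projectC)

lazy val root = (project in file("."))
  .aggregate(allProjects: _*)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )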

Per-project tasks in SBT

My .sbt file looks something like this:
lazy val common = (project in file("./common"))
  .settings(
    // other settings
  )

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    // other settings
    addCommandAlias("build", ";clean;assembly;foo")
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    // other settings
    addCommandAlias("build", ";clean;compile;bar")
  )
Additionally, I have two tasks foo and bar which are only valid in their respective projects.
My tests show that upon calling build, no matter which project I am in, both aliases are being invoked.
And task keys can apparently only be defined at the top level of the .sbt file (e.g. val foo = taskKey[Unit]("Does foo")).
I want to know how to correctly implement tasks and command aliases on project level.
Is that possible?
The problem you are having is with how aliases work in sbt. When an alias is defined, it is attached to the GlobalScope in the form of a command and is therefore available to all sub-projects. When you define aliases multiple times with addCommandAlias, the last definition wins, as each one removes any previously created alias with the same name.
You can see the defined aliases by running sbt alias; it will show that there is only one build alias.
You could achieve the separation you want by introducing build as a task key:
lazy val build = taskKey[Unit]("Builds")

lazy val root = (project in file("."))
  .aggregate(one, two) // THIS IS NEEDED TO MAKE THE build TASK AVAILABLE IN ROOT

lazy val common = (project in file("./common"))
  .settings(
    //SOME KEYS
  )

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    build := {
      Def.sequential(clean, Compile / compile).value
    }
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    build := {
      Def.sequential(clean, assembly).value
    }
  )
EDIT: Added Def.sequential as suggested by @laughedelic in the comments.
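
If foo and bar are the project-specific tasks from the question, a sketch of wiring them into the per-project build task, as a variant of the one and two definitions above (assuming foo and bar are plain taskKey[Unit] definitions at the top level of the .sbt file, and following the alias order from the question):

lazy val foo = taskKey[Unit]("Does foo")
lazy val bar = taskKey[Unit]("Does bar")

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    foo := { println("foo, only implemented in project one") },
    // assumes the sbt-assembly plugin is available where assembly is used
    build := Def.sequential(clean, assembly, foo).value
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    bar := { println("bar, only implemented in project two") },
    build := Def.sequential(clean, Compile / compile, bar).value
  )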

How can I specify a mainClass in build.sbt that resides in another module?

For some reason, our project got reorganized with the main class thrown into another module.
I've specified the mainClass as below in the build.sbt but I still get a class not found error:
mainClass in Compile := Some("com.so.questions.sbt.Main")
However, this is bound to fail, since it is going to look for the Main class under the top-level src folder, while the module lives outside of (as a sibling of) src:
MyScalaProject
+-MyModule
|+-src
| +-com.so.questions.sbt
| +-Main
|+-build.sbt <-- build.sbt specific to this module, currently blank
+-src
| +-<other folders>
+-build.sbt <-- build.sbt currently housing all config
How can I change the project scope in build.sbt to find and correctly load the main class?
That is, is it possible to do sbt run at the top level and have the main class be found with this structure?
It should work.
The FQCN specified for mainClass should be location-independent, to my understanding.
The real question is how you are loading your sub-module.
Here are some sbt definitions that should help point you in the right direction (replace the <> tags with your own project IDs):
// Define a submodule ref to be able to include it as a dependency
lazy val subModuleRef = ProjectRef(file("MyModule"), <MyModule SBT NAME>)

// Define a submodule project to be able to orchestrate its build
lazy val subModule = Project(
  id = <MyModule SBT NAME>,
  base = file("MyModule")
).addSbtFiles(file("build.sbt"))

// Define the top-level project, depending on subModuleRef for code
// inclusion and aggregating subModule for build orchestration
lazy val scalaProject = Project(
  id = <MyScalaProject NAME>,
  base = file("."),
  aggregate = Seq(subModule),
  settings = commonSettings
).dependsOn(subModuleRef)
Let's say that you have the MyModule module/folder containing the main class and some other module called MyCoreModule (just to illustrate the whole build.sbt):
// any stuff that you want to share between modules
lazy val commonSettings = Seq(
  scalaVersion := "2.12.8",
  version := "1.0-SNAPSHOT"
)

lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(
    name := "parent-module"
  )
  .aggregate(core, app)
  .dependsOn(app) // <-- here is the config that will allow you to run "sbt run" from the root project

lazy val core = project.in(file("MyCoreModule"))
  .settings(commonSettings: _*)
  .settings(
    name := "core"
  )

lazy val app = project.in(file("MyModule"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(
    name := "app"
  )

// define your mainClass from the "app" module
mainClass in Compile := (mainClass in Compile in app).value
Btw, sbt.version=1.2.7
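
A related hedged note: mainClass can also be scoped per task. If, besides sbt run, you package a runnable jar from the root project, a sketch like the following (reusing the FQCN from the question; whether you need the per-task scoping at all is an assumption) makes the intent explicit:

// main class used when running from the root project
mainClass in (Compile, run) := Some("com.so.questions.sbt.Main")
// main class written into the manifest of the jar built from the root project
mainClass in (Compile, packageBin) := Some("com.so.questions.sbt.Main")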

Project aggregation in sbt

I am just starting to learn sbt to build scala projects.
Here is my build.sbt file
lazy val commonSettings = Seq(
  organization := "com.example",
  version := "0.1.0",
  scalaVersion := "2.11.7"
)

lazy val task = taskKey[Unit]("An example task")

lazy val root = project.in(file(".")).
  aggregate(core).
  settings(commonSettings: _*).
  settings(
    task := { println("Hello!") },
    name := "hello",
    version := "1.0"
  )

lazy val core = project.in(file("SbtScalaProjectFoo"))
My project structure is as follows
SbtScalaProject
|--SbtScalaProjectFoo
|  |--build.sbt
|--build.sbt
When I try to run "sbt" inside SbtScalaProject I get the following
No project 'core' in 'file:/Users/asattar/Dev/work/SbtScalaProject/'
What am I missing?
Having multiple build.sbt files has proven to cause problems for me, too.
I would recommend aggregating all project definitions into one build.sbt. If you want to modularize the build, consider moving parts into .scala files in the (root project's) project/ directory.
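
Concretely, a sketch of that recommendation for the layout in the question (a single build.sbt at the root of SbtScalaProject, no build.sbt inside SbtScalaProjectFoo):

lazy val commonSettings = Seq(
  organization := "com.example",
  version := "0.1.0",
  scalaVersion := "2.11.7"
)

// the sub-project lives in the SbtScalaProjectFoo folder but is defined here
lazy val core = project.in(file("SbtScalaProjectFoo"))
  .settings(commonSettings: _*)

lazy val root = project.in(file("."))
  .aggregate(core)
  .settings(commonSettings: _*)
  .settings(name := "hello")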

Can I access my Scala app's name and version (as set in SBT) from code?

I am building an app with SBT (0.11.0) using a Scala build definition like so:
object MyAppBuild extends Build {
  import Dependencies._

  lazy val basicSettings = Seq[Setting[_]](
    organization := "com.my",
    version := "0.1",
    description := "Blah",
    scalaVersion := "2.9.1",
    scalacOptions := Seq("-deprecation", "-encoding", "utf8"),
    resolvers ++= Dependencies.resolutionRepos
  )

  lazy val myAppProject = Project("my-app-name", file("."))
    .settings(basicSettings: _*)
  [...]
I'm packaging a .jar at the end of the process.
My question is a simple one: is there a way of accessing the application's name ("my-app-name") and version ("0.1") programmatically from my Scala code? I don't want to repeat them in two places if I can help it.
Any guidance greatly appreciated!
sbt-buildinfo
I just wrote sbt-buildinfo.
After installing the plugin:
lazy val root = (project in file(".")).
  enablePlugins(BuildInfoPlugin).
  settings(
    buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
    buildInfoPackage := "foo"
  )
Edit: The above snippet has been updated to reflect a more recent version of sbt-buildinfo.
It generates a foo.BuildInfo object with any settings you want by customizing buildInfoKeys.
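
Usage from application code is then plain Scala; a minimal sketch, assuming the buildInfoPackage := "foo" setting from the snippet above:

import foo.BuildInfo

object Main extends App {
  // name, version, scalaVersion and sbtVersion are generated from buildInfoKeys
  println(s"${BuildInfo.name} ${BuildInfo.version}")
}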
Ad-hoc approach
I've been meaning to make a plugin for this (I wrote it, see above), but here's a quick snippet to generate such a file:
sourceGenerators in Compile <+= (sourceManaged in Compile, version, name) map { (d, v, n) =>
  val file = d / "info.scala"
  IO.write(file, """package foo
    |object Info {
    | val version = "%s"
    | val name = "%s"
    |}
    |""".stripMargin.format(v, n))
  Seq(file)
}
You can get your version as foo.Info.version.
The name and version are inserted into the jar manifest. You can access them using Java reflection via the Package class:
val p = getClass.getPackage
val name = p.getImplementationTitle
val version = p.getImplementationVersion
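Note that these values come from the jar's manifest, so they are typically null when the code runs from sbt or an IDE rather than from the packaged jar; a defensive sketch:
// Option(...) guards against nulls when not running from the packaged jar
val appName = Option(getClass.getPackage.getImplementationTitle).getOrElse("unknown")
val appVersion = Option(getClass.getPackage.getImplementationVersion).getOrElse("unknown")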
You can also generate a dynamic config file and read it from Scala:
// generate config (to pass values from build.sbt to scala)
Compile / resourceGenerators += Def.task {
  val file = baseDirectory.value / "conf" / "generated.conf"
  val contents = "app.version=%s".format(version.value)
  IO.write(file, contents)
  Seq(file)
}.taskValue
When you run sbt universal:packageBin, the file will be there.
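Reading it back at runtime is then up to you; one possible sketch uses Typesafe Config (an assumption, any properties or HOCON reader would do, and the path assumes conf/generated.conf is visible from the working directory):

import com.typesafe.config.ConfigFactory

// parse the file generated by the resourceGenerators task above
val config = ConfigFactory.parseFile(new java.io.File("conf/generated.conf"))
val appVersion = config.getString("app.version")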