sbt: generating shared sources in a cross-platform project

Building my Scala project with sbt, I want a task that runs prior to actual Scala compilation and generates a Version.scala file with project version information. Here's the task I've come up with:
lazy val generateVersionTask = Def.task {
  // Generate contents of Version.scala
  val contents = s"""package io.kaitai.struct
                    |
                    |object Version {
                    |  val name = "${name.value}"
                    |  val version = "${version.value}"
                    |}
                    |""".stripMargin

  // Update Version.scala file, if needed
  val file = (sourceManaged in Compile).value / "version" / "Version.scala"
  println(s"Version file generated: $file")
  IO.write(file, contents)
  Seq(file)
}
This task seems to work, but the problem is how to plug it in, given that it's a cross project, targeting Scala/JVM, Scala/JS, etc.
This is how build.sbt looked before I started touching it:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
and, on the filesystem, I have:
shared/ — cross-platform code shared between JS/JVM builds
jvm/ — JVM-specific code
js/ — JS-specific code
The best I've come up with so far is adding this task to the foo crossProject:
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    sourceGenerators in Compile += generateVersionTask.taskValue, // <== !
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)
This works, but in a very awkward way that isn't really compatible with the "shared" codebase: it generates two distinct Version.scala files, one for JS and one for JVM:
sbt:root> compile
Version file generated: /foo/js/target/scala-2.12/src_managed/main/version/Version.scala
Version file generated: /foo/jvm/target/scala-2.12/src_managed/main/version/Version.scala
Naturally, it's impossible to access the contents of these files from shared, and that is exactly where I want to use them.
So far, I've come up with a very sloppy workaround (sketched below):
there is a var declared in a singleton object in shared;
in both the JVM and JS main entry points, the very first thing I do is assign that var from the constants defined in the platform's Version.scala.
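Roughly, the workaround looks like this (the names RuntimeVersion and MainJVM are purely illustrative, not the actual project code):

// shared/src/main/scala/io/kaitai/struct/RuntimeVersion.scala
package io.kaitai.struct

object RuntimeVersion {
  // filled in by the platform-specific entry points at startup
  var name: String = ""
  var version: String = ""
}

// jvm/src/main/scala/io/kaitai/struct/MainJVM.scala (the JS entry point does the same)
package io.kaitai.struct

object MainJVM extends App {
  // Version is the per-platform generated object
  RuntimeVersion.name = Version.name
  RuntimeVersion.version = Version.version
  // ... rest of the entry point
}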
I've also tried the same trick with the sbt-buildinfo plugin; the result is exactly the same: it generates a per-platform BuildInfo.scala, which I can't use directly from the shared sources.
Are there any better solutions available?

Consider pointing sourceManaged to the shared/src/main/scala/src_managed directory and scoping generateVersionTask to the root project, like so:
val sharedSourceManaged = Def.setting(
  baseDirectory.value / "shared" / "src" / "main" / "scala" / "src_managed"
)

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {},
    sourceManaged := sharedSourceManaged.value,
    sourceGenerators in Compile += generateVersionTask.taskValue,
    cleanFiles += sharedSourceManaged.value
  )
Now sbt compile should output something like
Version file generated: /Users/mario/IdeaProjects/scalajs-cross-compile-example/shared/src/main/scala/src_managed/version/Version.scala
...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/js/target/scala-2.12/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/scalajs-cross-compile-example/target/scala-2.12/classes ...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/jvm/target/scala-2.12/classes ...
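Since the generated file now lands inside shared's regular source tree, code in shared can refer to it directly. For example (a hypothetical file, assuming the task above with its io.kaitai.struct package):

// shared/src/main/scala/io/kaitai/struct/Banner.scala
package io.kaitai.struct

object Banner {
  // Version is the generated object under shared/src/main/scala/src_managed
  val text: String = s"${Version.name} ${Version.version}"
}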

Related

Different project on the same base-directory (or share a single file between projects)

My initial setup had two separate projects (sbt 1.2.6):
a web app (huge codebase, lot of dependencies, slow compile)
a command-line app (basically one file with 3-4 separate dependencies)
Then a feature request came in: we should show the "valid" values in the command-line app. The valid values live in an enum in the web app. So I fired up the sbt documentation and came up with an idea that looked like this:
//main webapp
lazy val core = project
  .in(file("."))
  .withId("core") //I tried this just in case, not helped...
  //... here comes all the plugins and deps

//my hack to get a single-file compile
lazy val `feature-signer-helper` = project
  .in(file("."))
  .withId("feature-signer-helper")
  .settings(
    target := { baseDirectory.value / "target" / "features" },
    sources in Compile := {
      ((scalaSource in Compile).value ** "Features.scala").get
    }
  )

//the command line app
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .dependsOn(`feature-signer-helper`)
  .settings(
    libraryDependencies ++= signerDeps
  )
The problem is that whichever lazy val xxx = project.in(file(y)) is defined last seems to become the only project bound to the y directory.
Also, I don't want to move that one file into a separate directory structure... And logically the command-line app and the web app don't depend on each other; they have different dependencies (and really different build times).
My questions are:
is there any quick win in this situation? (worst case, I'll just copy the file...)
why do we have such rich project and source settings if I can't bind two projects to the same dir?
EDIT:
The code below copies the needed file (if you have the same dir structure). I'm not super happy with it, but it works. I'm still interested in other approaches.
import sbt._
import Keys._

object FeaturesCopyTask {
  val featuresCopyTask = {
    sourceGenerators in Compile += Def.task {
      val outFile = (sourceManaged in Compile).value / "Features.scala"
      val rootDirSrc = (Compile / baseDirectory).value / ".." / "src"
      val inFile: File = (rootDirSrc ** "Features.scala").get().head
      IO.copyFile(inFile, outFile, preserveLastModified = true)
      Seq(outFile)
    }.taskValue
  }
}
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .settings(
    libraryDependencies ++= signerDeps,
    FeaturesCopyTask.featuresCopyTask
  )
I would have the tree look something more like:
+- core/
+- webapp/
+- cli/
core is the small shared part (mostly model-type things) that webapp and cli have in common
webapp depends on core (among many other things)
cli depends on core (and not much else)
So the build.sbt would be something like
lazy val core = (project in file("core"))
  // yadda yadda yadda

lazy val webapp = (project in file("webapp"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val cli = (project in file("cli"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val root = (project in file("."))
  .aggregate(
    core,
    webapp,
    cli
  )
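With that layout, the enum holding the "valid" values (called Features here purely for illustration, with a made-up package and values) would live in core and be referenced from both webapp and cli:

// core/src/main/scala/com/example/Features.scala
package com.example

object Features extends Enumeration {
  val Alpha, Beta, Gamma = Value
}

// cli/src/main/scala/com/example/cli/Main.scala
package com.example.cli

import com.example.Features

object Main extends App {
  // the command-line app can now show the valid values without depending on the web app
  println(s"Valid values: ${Features.values.mkString(", ")}")
}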

How to create a generic SBT root project with varying sub projects

I'm working on the Scala track on Exercism, which means I have a lot of sbt projects in a root folder. I'd like to create a root sbt project that automatically adds new sub-projects as I download new exercises. Currently I have to add them manually, so my root build.sbt looks like this:
lazy val root = (project in file("."))
  .aggregate(
    hello_world,
    sum_of_multiples,
    robot_name)

lazy val hello_world = project in file("hello-world")
lazy val sum_of_multiples = project in file("sum-of-multiples")
lazy val robot_name = project in file("robot-name")
...but I'd like to avoid having to add every project manually. Is there a way to add new projects automatically?
I'd like to avoid having to add every project manually. Is there a way to add new projects automatically?
Sure. It's a somewhat advanced use of sbt, but you can create an ad-hoc plugin that generates subprojects programmatically.
build.sbt
ThisBuild / scalaVersion := "2.12.8"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
project/build.properties
sbt.version=1.2.8
project/SubprojectPlugin.scala
import sbt._

object SubprojectPlugin extends AutoPlugin {
  override val requires = sbt.plugins.JvmPlugin
  override val trigger = allRequirements

  override lazy val extraProjects: Seq[Project] = {
    val dirs = (file(".") * ("*" -- "project" -- "target")) filter { _.isDirectory }
    dirs.get.toList map { dir =>
      Project(dir.getName.replaceAll("""\W""", "_"), dir)
    }
  }
}
Now if you start up sbt, any directory that is not named target or project will show up as a subproject.
sbt:generic-root> projects
[info] In file:/private/tmp/generic-root/
[info] * generic-root
[info] hello_world
[info] robot_name
[info] sum_of_multiple
To add further settings, you can create a build.sbt file under the relevant directory, for example:
hello-world/build.sbt
libraryDependencies += "commons-io" % "commons-io" % "2.6"

How can I specify a mainClass in build.sbt that resides in another module?

For some reason, our project got reorganized with the main class thrown into another module.
I've specified the mainClass in build.sbt as below, but I still get a class-not-found error:
mainClass in Compile := Some("com.so.questions.sbt.Main")
However, this is bound to fail since it's going to look for the Main class in the src folder, whereas this module lives outside of (as a sibling of) src:
MyScalaProject
+-MyModule
|+-src
| +-com.so.questions.sbt
| +-Main
|+-build.sbt <-- build.sbt specific to this module, currently blank
+-src
| +-<other folders>
+-build.sbt <-- build.sbt currently housing all config
How can I change the project scope in build.sbt to find and correctly load the main class?
That is, is it possible to do sbt run at the top level and have the main class be found with this structure?
It should work.
To my understanding, the FQCN specified for mainClass should be location-independent.
The real question that comes to mind is how you are loading your sub-module.
Here are some sbt definitions that should help point you in the right direction (replace the <> tags with your own project IDs):
// Define a submodule ref to be able to include it as a dependency
lazy val subModuleRef = ProjectRef(file("MyModule"), <MyModule SBT NAME>)

// Define a submodule project to be able to orchestrate its build
lazy val subModule = Project(
  id = <MyModule SBT NAME>,
  base = file("MyModule")
).addSbtFiles(file("build.sbt"))

// Define the top-level project, depending on the subModule Ref for code
// inclusion and aggregating the subModule for build orchestration
lazy val scalaProject = Project(
  id = <MyScalaProject NAME>,
  base = file("."),
  aggregate = Seq(subModule),
  settings = commonSettings
).dependsOn(subModuleRef)
Let's say that you have the MyModule module/folder containing the main class and some other module called MyCoreModule (just to illustrate the whole build.sbt):
// any stuff that you want to share between modules
lazy val commonSettings = Seq(
  scalaVersion := "2.12.8",
  version := "1.0-SNAPSHOT"
)

lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(
    name := "parent-module"
  )
  .aggregate(core, app)
  .dependsOn(app) // <-- here is the config that will allow you to run "sbt run" from the root project

lazy val core = project.in(file("MyCoreModule"))
  .settings(commonSettings: _*)
  .settings(
    name := "core"
  )

lazy val app = project.in(file("MyModule"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(
    name := "app"
  )

// define your mainClass from the "app" module
mainClass in Compile := (mainClass in Compile in app).value
Btw, sbt.version=1.2.7
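For completeness, the class that mainClass points at is just an ordinary object with a main method (or one extending App) inside MyModule; a minimal sketch, assuming the standard sbt source layout inside the module:

// MyModule/src/main/scala/com/so/questions/sbt/Main.scala
package com.so.questions.sbt

object Main extends App {
  println("running from the app module")
}

With that in place, sbt run from the root project resolves the main class from the app module via the dependsOn(app) wiring shown above.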

Get assets to compile when running via "activator run" in Play

I am using Sass as my CSS preprocessor, and I'm trying to have it run via the asset pipeline. I've tried implementing this sassTask as a source file task and as a web asset task, but I'm running into problems both ways.
If I run Sass as a source task (see below), it gets triggered during activator run when a page is requested, and updated files are picked up on page reloads. The problem I'm running into is that the resulting CSS files all get dumped directly into target/web/public/main/lib, instead of into subdirectories mirroring the ones they are built into under the resourceManaged directory. I can't figure out how to make this happen.
Instead, I tried implementing Sass compilation as a web asset task (see below). Working this way, as far as I can tell, resourceManaged does not come into play, so I compile my files directly into target/web/public/main/lib. I'm sure I'm not doing this dynamically enough, but I don't know how to do it any better. The biggest problem here is that the pipeline apparently does not run under activator run. I can get it to run using activator stage, but I really need this to work in the regular development workflow, so that I can change style files while the dev server is running, just as with Scala files.
I have tried combing through these forums, through the sbt-web docs, and through some of the existing plugins, but I am finding this process to be highly frustrating, due to the complexity of SBT and the opaqueness of what is actually happening in the build process.
Sass compilation as a source file task:
lazy val sassTask = TaskKey[Seq[java.io.File]]("sassTask", "Compiles Sass files")

sassTask := {
  import sys.process._
  val x = (WebKeys.nodeModules in Assets).value
  val sourceDir = (sourceDirectory in Assets).value
  val targetDir = (resourceManaged in Assets).value
  Seq("sass", "-I", "target/web/web-modules/main/webjars/lib/susy/sass", "--update", s"$sourceDir:$targetDir").!
  val sources = sourceDir ** "*.scss"
  val mappings = sources pair relativeTo(sourceDir)
  val renamed = mappings map { case (file, path) => file -> path.replaceAll("scss", "css") }
  val copies = renamed map { case (file, path) => file -> targetDir / path }
  copies map (_._2)
}

sourceGenerators in Assets <+= sassTask
Sass compilation as a web asset task:
lazy val sassTask = taskKey[Pipeline.Stage]("Compiles Sass files")

sassTask := {
  (mappings: Seq[PathMapping]) =>
    import sys.process._
    val sourceDir = (sourceDirectory in Assets).value
    val targetDir = target.value / "web" / "public" / "main"
    val libDir = (target.value / "web" / "web-modules" / "main" / "webjars" / "lib" / "susy" / "sass").toString
    Seq("sass", "-I", libDir, "--update", s"$sourceDir:$targetDir").!
    val sources = sourceDir ** "*.scss"
    val mappings = sources pair relativeTo(sourceDir)
    val renamed = mappings map { case (file, path) => file -> path.replaceAll("scss", "css") }
    renamed
}

pipelineStages := Seq(sassTask)
I think that, according to the documentation on the Asset Pipeline, a source file task is the way to go:
Examples of source file tasks as plugins are CoffeeScript, LESS and JSHint. Some of these take a source file and produce a target web asset e.g. CoffeeScript produces JS files. Plugins in this category are mutually exclusive to each other in terms of their function i.e. only one CoffeeScript plugin will take CoffeeScript sources and produce target JS files. In summary, source file plugins produce web assets.
I think what you try to achieve falls into this category.
TL;DR; - build.sbt
val sassTask = taskKey[Seq[File]]("Compiles Sass files")
val sassOutputDir = settingKey[File]("Output directory for Sass generated files")

sassOutputDir := target.value / "web" / "sass" / "main"

resourceDirectories in Assets += sassOutputDir.value

sassTask := {
  import scala.sys.process._ // for the `!` operator on Seq
  val sourceDir = (sourceDirectory in Assets).value
  val outputDir = sassOutputDir.value
  val sourceFiles = (sourceDir ** "*.scss").get
  Seq("sass", "--update", s"$sourceDir:$outputDir").!
  (outputDir ** "*.css").get
}

sourceGenerators in Assets += sassTask.taskValue
Explanation
Assuming you have Sass files in an app/assets/<whatever> directory, and that you want to create CSS files in the web/public/main/<whatever> directory, this is what you could do.
Create a task that reads the files in the app/assets/<whatever> directory and its subdirectories, and outputs them to our defined sassOutputDir:
val sassTask = taskKey[Seq[File]]("Compiles Sass files")
val sassOutputDir = settingKey[File]("Output directory for Sass generated files")
sassOutputDir := target.value / "web" / "sass" / "main"
sassTask := {
  import scala.sys.process._ // for the `!` operator on Seq
  val sourceDir = (sourceDirectory in Assets).value
  val outputDir = sassOutputDir.value
  val sourceFiles = (sourceDir ** "*.scss").get
  Seq("sass", "--update", s"$sourceDir:$outputDir").!
  (outputDir ** "*.css").get
}
This is not enough, though. If you want to keep the directory structure, you have to add your sassOutputDir to resourceDirectories in Assets. This is because mappings in sbt-web are declared like this:
mappings := {
  val files = (sources.value ++ resources.value ++ webModules.value) ---
    (sourceDirectories.value ++ resourceDirectories.value ++ webModuleDirectories.value)
  files pair relativeTo(sourceDirectories.value ++ resourceDirectories.value ++ webModuleDirectories.value) | flat
}
which means that any file not covered by those directories is mapped using the alternative flat strategy. However, the fix is simple; just add this to your build.sbt:
resourceDirectories in Assets += sassOutputDir.value
This will make sure the directory structure is preserved.

Can I access my Scala app's name and version (as set in SBT) from code?

I am building an app with SBT (0.11.0) using a Scala build definition like so:
object MyAppBuild extends Build {
  import Dependencies._

  lazy val basicSettings = Seq[Setting[_]](
    organization := "com.my",
    version := "0.1",
    description := "Blah",
    scalaVersion := "2.9.1",
    scalacOptions := Seq("-deprecation", "-encoding", "utf8"),
    resolvers ++= Dependencies.resolutionRepos
  )

  lazy val myAppProject = Project("my-app-name", file("."))
    .settings(basicSettings: _*)
  [...]
I'm packaging a .jar at the end of the process.
My question is a simple one: is there a way of accessing the application's name ("my-app-name") and version ("0.1") programmatically from my Scala code? I don't want to repeat them in two places if I can help it.
Any guidance greatly appreciated!
sbt-buildinfo
I just wrote sbt-buildinfo.
After installing the plugin:
lazy val root = (project in file(".")).
  enablePlugins(BuildInfoPlugin).
  settings(
    buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
    buildInfoPackage := "foo"
  )
Edit: The above snippet has been updated to reflect a more recent version of sbt-buildinfo.
It generates a foo.BuildInfo object with any settings you want by customizing buildInfoKeys.
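The generated object can then be used like any other code; for example (assuming buildInfoPackage := "foo" and the keys configured above):

import foo.BuildInfo

object VersionBanner {
  // name, version, scalaVersion and sbtVersion are generated from the buildInfoKeys above
  val text: String =
    s"${BuildInfo.name} ${BuildInfo.version} (Scala ${BuildInfo.scalaVersion}, sbt ${BuildInfo.sbtVersion})"
}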
Ad-hoc approach
I'd been meaning to make a plugin for this (and I eventually wrote it, see above), but here's a quick script to generate a file:
sourceGenerators in Compile <+= (sourceManaged in Compile, version, name) map { (d, v, n) =>
  val file = d / "info.scala"
  IO.write(file, """package foo
                   |object Info {
                   |  val version = "%s"
                   |  val name = "%s"
                   |}
                   |""".stripMargin.format(v, n))
  Seq(file)
}
You can get your version as foo.Info.version.
The name and version are inserted into the jar manifest. You can access them using Java reflection via the Package class; note that they will be null when the code isn't running from a packaged jar (e.g. under a plain sbt run).
val p = getClass.getPackage
val name = p.getImplementationTitle
val version = p.getImplementationVersion
You can also generate a config file dynamically and read it from Scala.
// generate config (to pass values from build.sbt to scala)
Compile / resourceGenerators += Def.task {
  val file = baseDirectory.value / "conf" / "generated.conf"
  val contents = "app.version=%s".format(version.value)
  IO.write(file, contents)
  Seq(file)
}.taskValue
When you run sbt universal:packageBin, the file will be there.
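Reading it back could then look like this with Typesafe Config; a sketch that assumes the conf directory is on the runtime classpath (as in a typical Play / sbt-native-packager layout) and that generated.conf is included from your main configuration file:

import com.typesafe.config.ConfigFactory

object AppVersion {
  // assumes application.conf contains: include "generated.conf"
  private val config = ConfigFactory.load()
  val value: String = config.getString("app.version")
}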