Include generated resources in a jar (SBT) - scala

I've been writing an SBT plugin that generates resources into resource_managed. I'm now looking to include these generated resources in the generated jar as the SBT docs detail:
Generating resources:
By default, generated resources are not included in the packaged source artifact. To do so, add them as you would other mappings. See Adding files to a package
I've read the docs, but I honestly can't figure out how to do this. Can anyone explain it, or point me to another project that does this so I can see how they do it?

First just to clarify, they are included in jars containing compiled classes. They are not included in jars containing sources.
By default, generated resources are not included in the packaged source artifact.
For packageBin the generated files should already be included - just make sure you return all generated files from the generator method. Assuming you want to package them in the sources artifact, this is what you have to do.
Let's assume you have a generator that generates a property file.
lazy val generatePropertiesTask = Def.task {
  val file = (Compile / resourceManaged).value / "stack-overflow" / "res.properties"
  val contents = s"name=${name.value}\nversion=${version.value}"
  IO.write(file, contents)
  Seq(file)
}
resourceGenerators in Compile += generatePropertiesTask.taskValue
To include that in the generated sources you have to tell sbt where the res.properties file must be copied in the generated sources artifact. The task that generates the packaged sources is called packageSrc, so you have to set mappings scoped to that task.
mappings in (Compile, packageSrc) += {
  ((resourceManaged in Compile).value / "stack-overflow" / "res.properties") -> "path/in/jar/res.properties"
}
Because your generator can produce many files, and mapping each one by hand would be tedious, sbt gives you a utility to map multiple paths at once.
mappings in (Compile, packageSrc) ++= {
  val allGeneratedFiles = ((resourceManaged in Compile).value ** "*") filter { _.isFile }
  allGeneratedFiles.get pair relativeTo((resourceManaged in Compile).value)
}
The first line finds all generated files using path finders, and the second line maps them to their paths in the target jar.

Related

How do I create an sbt task to generate code, then include these generated managed sources in my root project?

I would like to have an sbt task that I can run to generate some code. I don't want to generate it on each run, just run this task manually once in a while. I created a skeleton project to explain (https://github.com/jinyk/sbtmanagedsrc).
build.sbt:
lazy val root = (project in file("."))
  .settings(scalaVersion := "2.11.8")
  .settings(gensomecode := genSomeCodeTask.value)
  /////////////////////////////////////////////////////////////
  // fugly way to get managed sources compiled along with main
  .settings(unmanagedSourceDirectories in Compile += baseDirectory.value / "target/scala-2.11/src_managed/")
  /////////////////////////////////////////////////////////////
lazy val gensomecode = taskKey[Seq[File]]("gen-code")
lazy val genSomeCodeTask = Def.task {
  val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
  println("file: " + file)
  IO.write(file, """object SomeGenCode {
                   |  def doSomething() {
                   |    println("Hi!")
                   |  }
                   |}""".stripMargin)
  Seq(file)
}
So with the build.sbt above I can run sbt gensomecode, which creates target/scala-2.11/src_managed/main/SomeGenCode.scala, the default place where sbt puts "managed sources."
I would like to make this SomeGenCode available to the root project.
src/main/scala/Main.scala:
object Main extends App {
  SomeGenCode.doSomething()
}
The only thing I can figure out is to include the default sourceManaged directory in the root project's unmanagedSourceDirectories (see the line below the "fugly way" comment in the build.sbt above). This is ugly as hell and doesn't seem like how managed sources are supposed to be handled.
I'm probably not understanding something basic about sbt's managed sources concept or how to handle the situation of creating an sbt task to generate sources.
What am I missing?
There are three options that I am familiar with:
1. Generate into the unmanaged source directories.
2. Generate on every run, by adding sourceGenerators in Compile <+= gensomecode.
3. Similar to (2), but use caching so it doesn't regenerate the file on every compile. Full example below.
In this example, the cache is keyed on build.sbt (its last-modified time), so whenever that file changes the source file is regenerated.
lazy val root = (project in file("."))
  .settings(scalaVersion := "2.11.8")
  .settings(gensomecode <<= genSomeCodeTask)

sourceGenerators in Compile <+= genSomeCodeTask

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

def generateFile(sourceManaged: java.io.File) = {
  val file = sourceManaged / "main" / "SomeGenCode.scala"
  println("file: " + file)
  IO.write(file, """object SomeGenCode {
                   |  def doSomething() {
                   |    println("Hi!")
                   |  }
                   |}""".stripMargin)
  Set(file)
}

def genSomeCodeTask = (sourceManaged in Compile, streams).map {
  (sourceManaged, streams) =>
    val cachedCompile = FileFunction.cached(
      streams.cacheDirectory / "mything",
      inStyle = FilesInfo.lastModified,
      outStyle = FilesInfo.exists) {
      (in: Set[java.io.File]) =>
        generateFile(sourceManaged)
    }
    cachedCompile(Set(file("build.sbt"))).toSeq
}
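On newer sbt versions (0.13.13+ and 1.x) the <<= and <+= operators above are deprecated or removed. As a rough sketch of option (3) in the newer syntax, reusing the generateFile helper above (genSomeCodeCached is just an illustrative name, not part of the original answer):

// sketch only: same caching idea, expressed with := / += and Def.task
lazy val genSomeCodeCached = Def.task {
  val dir = (sourceManaged in Compile).value
  val cacheDir = streams.value.cacheDirectory / "mything"
  val cachedCompile = FileFunction.cached(
    cacheDir,
    inStyle = FilesInfo.lastModified,
    outStyle = FilesInfo.exists) { (_: Set[File]) => generateFile(dir) }
  // regenerate only when build.sbt changes
  cachedCompile(Set(file("build.sbt"))).toSeq
}

gensomecode := genSomeCodeCached.value

sourceGenerators in Compile += genSomeCodeCached.taskValue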
I hope I wasn't too late for the answer, but let's look at this section about Unmanaged vs Managed files
Classpaths, sources, and resources are separated into two main categories: unmanaged and managed. Unmanaged files are manually created files that are outside of the control of the build. They are the inputs to the build. Managed files are under the control of the build. These include generated sources and resources as well as resolved and retrieved dependencies and compiled classes.
It seems that the key difference between "Unmanaged vs Managed" is "Manually vs Automatically". Now, if we look at the documentation for "generating files", we will notice immediately that it means "generating files automatically", since generating files happens at sbt compile.
Compile / sourceGenerators += <task of type Seq[File]>.taskValue
It makes sense, since anything that happens during sbt compile should be removed during sbt clean.
Now, from your code below, it seems that you were trying to generate an unmanaged source file (you were not using sourceGenerators, were you?) into the managed source directory. The most obvious problem with this is that your source file will be removed every time you call sbt clean, so you have to run the task again to get the file back (worse, you have to run it manually, as opposed to having sbt compile do it for you), which defeats your purpose of doing it manually once in a while.
val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
To fix this, you have to manually generate the files into the unmanaged source directory, which is basically your source code directory (it depends -- mine is "/app"). Still, you should mark somehow that these files are generated. My solution is something like:
val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
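For completeness, a minimal sketch of the task from the question rewritten to write into that unmanaged location (the generator body is unchanged; only the target path differs):

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

gensomecode := {
  // write into the regular (unmanaged) source tree, under a "generated" folder
  val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
  IO.write(file, """object SomeGenCode {
                   |  def doSomething() {
                   |    println("Hi!")
                   |  }
                   |}""".stripMargin)
  Seq(file)
}

You then run sbt gensomecode manually once in a while; since the file now lives under src it survives sbt clean and is compiled like any other source.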
Hope this helps!

Prevent looping recompilation in SBT

I am using SBT to build my Scala project. After the compilation of a submodule which uses fastOptJS, I need to push the compiled files to another module within the same project, so I designed a custom command fastOptCopy to do so.
lazy val copyjs = TaskKey[Unit]("copyjs", "Copy javascript files to public directory")

copyjs := {
  val outDir = baseDirectory.value / "public/js"
  val inDir = baseDirectory.value / "js/target/scala-2.11"
  val files = Seq("js-fastopt.js", "js-fastopt.js.map", "js-jsdeps.js") map { p => (inDir / p, outDir / p) }
  IO.copy(files, true)
}
addCommandAlias("fastOptCopy", ";fastOptJS;copyjs")
However, when I enter into the sbt console and type
~fastOptCopy
it keeps compiling, copying, compiling, copying, ... in an infinite loop. I guess that because I am copying the files, it thinks that the sources have changed and retriggers compilation.
How can I prevent this?
You can exclude specific files from watchSources in your sbt configuration:
http://www.scala-sbt.org/0.13/docs/Triggered-Execution.html
watchSources defines the files for a single project that are monitored
for changes. By default, a project watches resources and Scala and
Java sources.
Here is a similar question:
How to not watch a file for changes in Play Framework
watchSources := watchSources.value.filter { _.getName != "BuildInfo.scala" }
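For the build in the question, a sketch along the same lines (assuming the loop is triggered by the files copied into public/js) would filter that directory out of the watched sources:

watchSources := {
  // exclude everything under the copy target so copyjs does not retrigger compilation
  val copiedJsDir = (baseDirectory.value / "public" / "js").getAbsolutePath
  watchSources.value.filterNot(_.getAbsolutePath.startsWith(copiedJsDir))
}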

Scala.js compilation destination

I'm working on a Scala.js cross project where the jvm folder represents my server application and js represents my Scala.js code.
Whenever I compile my Scala.js code via sbt crossJS/fastOptJS, the compiled JS ends up in ./js/target/scala-2.11/web-fastopt.js.
I need to have this compiled JS file accessible in the resources of the server project in the jvm folder, so I can serve it through my web application. I think I have to do something with artifactPath, but I can't seem to get any results from my experiments thus far.
You can simply set the artifactPath of the fastOptJS task (or the fullOptJS task) to the (managed) resources directory of your JVM project:
// In the JS project's settings
artifactPath in fastOptJS in Compile :=
  (resourceManaged in jvm in Compile).value /
    ((moduleName in fastOptJS).value + "-fastopt.js")
This will put it in the directory if you run the fastOptJS task. However, it will not be included in sbt's resources task, and it will not automatically be triggered if you launch your server. Therefore:
// In the JVM project's settings
resources in Compile += (fastOptJS in js).value.data
A couple of notes:
The first step is only necessary if your web server serves only specific directories. Otherwise the second one is enough, as it already adds the file to the resources; where it lies is secondary.
Setting the crossTarget, as in @ochrons' answer, will also output all the .class and .sjsir files in the resource directory.
Have a look at Vincent Munier's sbt-play-scalajs for out-of-the-box sbt-web / Scala.js integration (it follows a slightly different approach: It copies the file from the js project, rather than directly placing it in the JVM project. Useful if you have multiple JVM projects).
You can configure the Scala.js SBT plugin to output the JavaScript file in folder of your choosing. For example like this:
// configure a specific directory for scalajs output
val scalajsOutputDir = Def.settingKey[File]("directory for javascript files output by scalajs")

// make all JS builds use the output dir defined later
lazy val js2jvmSettings = Seq(fastOptJS, fullOptJS, packageJSDependencies) map { packageJSKey =>
  crossTarget in (js, Compile, packageJSKey) := scalajsOutputDir.value
}

// instantiate the JVM project for SBT with some additional settings
lazy val jvm: Project = sharedProject.jvm.settings(js2jvmSettings: _*).settings(
  // scala.js output is directed under "web/js" dir in the jvm project
  scalajsOutputDir := (classDirectory in Compile).value / "web" / "js",
This will also store -jsdeps.js and .js.map files in the same folder, in case you want to use those in your web app.
For a more complete example, check out this tutorial which addresses many other issues of creating a more complex Scala.js application.

How to create a custom package task to jar a subset of classes in SBT

I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only a subset of classes conforming to an API which we need to share with other teams so they can write plugins for our application. So the end result will be two jars, one with the full application and a second one with a subset of the classes.
I approached this problem by creating a different configuration which I called pluginApi and redefining the packageBin task within this new configuration so it does not change the original definition of packageBin. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
which generates the complete jar in the same way compile:packageBin would. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However, I would have to rewrite all the packageBin functionality instead of reusing the existing code. Also, in case rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You could do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

inConfig(PluginApi)(Defaults.compileSettings) // you have to have the standard compile settings in the new configuration

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala, you'll have to override unmanagedSourceDirectories in the newly created configuration.
Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
I have had similar problems (with more than one jar per project). Our project uses Ant - there you can do it, you will just repeat yourself a lot.
However, I have come to the conclusion that this scenario (2 JARs for one project) actually can be simplified by splitting the project - i.e. making 2 modules out of it.
This way, I don't have to "fight" tools which assume project==artifact (like sbt, maybe maven?, IDEA's default setting,...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When compiling everything together and only splitting the classes apart in the JAR step, you run the risk of an invalid dependency in your setup which you would only see when testing, because at compile time everything is compiled together.
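A minimal sketch of what that split might look like (api and core are made-up module names, used only for illustration):

lazy val api = (project in file("api"))
  .settings(name := "myapp-api")

lazy val core = (project in file("core"))
  .dependsOn(api) // the implementation may depend on the API, never the other way around
  .settings(name := "myapp-core")

Each module then gets its own packageBin, so you end up with the two jars without any custom packaging wiring.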

SBT: Exclude class from Jar

I am converting a legacy jar project to SBT, and for strange reasons that are not easily solved, this project comes with "javax/servlet/Servlet.class" inside it. So I need to somehow exclude this class from the jar file generated by package-bin. How do I accomplish this? Preferably I would like to exclude it using a wildcard (i.e. javax.*).
The SBT assembly plugin does look like it has features that will do this, but I am worried that relying on sbt-assembly means that my jar project will not work in a multi-module project (i.e. if I include it as a dependency in a war file, then the war project needs to be told to run assembly on the dependent jar project rather than package-bin - but I may be mistaken here).
Each task declares the other tasks and settings that it uses. You can use inspect to determine these inputs as described on Inspecting Settings and in a recent tutorial-style blog post by John Cheng.
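For example, running the following in the sbt shell (0.13-style syntax, shown only as an illustration) lists the tasks and settings that packageBin depends on:

inspect compile:packageBin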
In this case, the relevant task used by packageBin is mappings. The mappings task collects the files to be included in the jar and maps them to the path in the jar. Some background is explained on Mapping Files, but the result is that mappings produces a value of type Seq[(File, String)]. Here, the File is the input file providing the content and the String is the path in the jar.
So, to modify the mappings for the packageBin task, filter out the paths from the default mappings that you don't want to include:
mappings in (Compile, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath != "javax/servlet/Servlet.class"
  }
}
mappings in (Compile,packageBin) selects the mappings for the main package task (as opposed to test sources or the packageSrc task).
x ~= f means "set x to the result of applying function f to the previous value of x". (See More About Settings for details.)
The filter drops all pairs where the path corresponds to the Servlet class.
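Since the question asks for a wildcard-style exclusion (javax.*), the same approach can drop everything under the javax path prefix instead of a single class - a small variation on the snippet above:

mappings in (Compile, packageBin) ~= { (ms: Seq[(File, String)]) =>
  // drop every entry whose path in the jar starts with javax/
  ms filterNot { case (file, toPath) => toPath.startsWith("javax/") }
}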
I came up with this solution: it redefines the compile task to depend on the previous compile task (effectively allowing me to hook in right after the sources are compiled and before they are packaged).
def mySettings = {
  // add functionality to the standard compile task
  inConfig(Compile)(Seq(compile in Compile <<= (target, streams, compile in Compile) map {
    (targetDirectory, taskStream, analysis) =>
      import taskStream.log
      // this runs after compile but before package-bin
      recursiveListFiles(targetDirectory, ".*javax.*".r) foreach { file =>
        log.warn("deleting matched resource: " + file.getAbsolutePath())
        IO.delete(file)
      }
      analysis
  })) ++
    Seq(name := "MyProject", version := "1.0", exportJars := true)
}

def recursiveListFiles(f: File, r: Regex): Array[File] = {
  val these = f.listFiles
  val good = these.filter(f => r.findFirstIn(f.getName).isDefined)
  good ++ these.filter(_.isDirectory).flatMap(recursiveListFiles(_, r))
}
It's a little more complicated than I had hoped, but it allows me to do all sorts of modifications prior to packaging (in this case searching the target folder and deleting all class files that match a regular expression). It also accomplishes my second goal of sticking with the default SBT lifecycle.