Defining multiple modules at once in the build definition - scala

I have a build setup where I have multiple groups of dependent modules. I wrote a function which produces one group of modules:
def group(id: String) = {
  val module1 = project.in(file(s"core/$id"))...
  val module2 = project.in(file(s"impl/$id")).dependsOn(module1)...
  (module1, module2)
}
I would now like to declare them:
val (core2014, impl2014) = group("2014")
This does not appear to work in build.sbt:
Pattern matching in val statements is not supported
I tried moving it into project/build.scala, where it compiles fine, but the modules don't appear at the sbt prompt. (That is, typing core2014/compile reports "not a valid key".)
Is there any way I can add modules to the build "manually", instead of relying on the autodetection of SBT?

I'm going to guess the answer is "no" for build.sbt.
But you can redefine the projects in your project/Build.scala.
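Here is a minimal sketch of what that could look like (this assumes the pre-1.x Build trait; the object name and explicit project IDs are illustrative, not from the question). sbt discovers projects by inspecting the vals of the build definition, so the safest route is to expose each module of the tuple as its own val:
import sbt._

object MyBuild extends Build {
  // Same helper as in the question, written with explicit Project(...) calls
  // so each module gets a stable ID.
  def group(id: String): (Project, Project) = {
    val module1 = Project("core" + id, file(s"core/$id"))
    val module2 = Project("impl" + id, file(s"impl/$id")).dependsOn(module1)
    (module1, module2)
  }

  // Unpack the tuple into individual vals so that core2014/compile and
  // impl2014/compile show up at the sbt prompt.
  private val group2014 = group("2014")
  lazy val core2014 = group2014._1
  lazy val impl2014 = group2014._2
}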

Related

How to create new commands that you can run after running the sbt command Scala

I'm trying to run certain tasks and start up servers after running sbt. I want to be able to run commands in the terminal to do this. How can I define them? Are plugins the right way to do this?
I see some code like this:
object DoThing extends AutoPlugin {
  object autoImport {
    val vpnCheck = taskKey[Boolean]("Check for a VPN connection.")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    vpnCheck := {
      doVpnCheck() // not shown in the snippet
    }
  )
}
What is the projectSettings method doing? Are plugins the way?
From the plugins page:
A plugin is a way to use external code in a build definition. A plugin can be a library used to implement a task (you might use Knockoff to write a markdown processing task). A plugin can define a sequence of sbt settings that are automatically added to all projects or that are explicitly declared for selected projects. For example, a plugin might add a proguard task and associated (overridable) settings. Finally, a plugin can define new commands (via the commands setting).
But I can't seem to figure this out.
For your scenario, maybe you can just create a task in your sbt file to do this, like:
val hello = taskKey[Unit]("hello world")

hello := {
  println("hello")
}
and if you want it to run automatically at startup, you can create an .sbtrc file in the project directory with contents like:
alias boot = ;reload ;hello ;iflast shell
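As for projectSettings: that override is simply how an AutoPlugin contributes its settings (the vpnCheck task in the snippet above) to every project that enables the plugin. If you really need a command rather than a task, which is what the quoted plugins page means by "define new commands", a minimal sketch directly in build.sbt could look like this (the name startAll is only illustrative):
// A command receives the build State and can queue further commands or
// tasks, which suits "start my servers after sbt starts" workflows.
commands += Command.command("startAll") { state =>
  "hello" :: state // queue the hello task defined above; it runs next
}
You can then type startAll at the sbt prompt, or use it in the .sbtrc alias instead of hello.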

Using DependsOn between two ScalaJS SBT projects

(Long question ahead. Simplified tl;dr at the bottom).
I have two ScalaJS projects built with SBT - "myapp" and "mylib", in the following directory structure
root/build.sbt
root/myapp/build.sbt
root/myapp/jvm/
root/myapp/js/
root/myapp/shared/
root/mylib/build.sbt
root/mylib/jvm/
root/mylib/js/
root/mylib/shared/
mylib exports an artifact named "com.example:mylib:0.1", which is used as a libraryDependency for myapp.
myapp and mylib are in separate repositories, contain their own build files, and should be able to be built completely separately (i.e. they must contain their own individual build config).
In production, they will be built separately with mylib being first published as a maven artifact before building myapp separately.
In development however, I want to be able to merge these into a parent SBT project so that both can be developed in parallel without needing to use publishLocal after each change.
In a traditional (non-ScalaJS) project this would be quite easy:
$ROOT/build.sbt:
lazy val mylib = project
lazy val myapp = project.dependsOn(mylib)
However in ScalaJS, we actually have two projects inside each module - appJVM, appJS, libJVM and libJS. As such, the above configuration only finds the aggregate root project and does not correctly apply the dependsOn configuration to the actual JVM and JS projects.
(i.e. myapp and mylib build.sbt each contains two projects, and an aggregate root project)
Ideally I'd like to be able to do something like the following
lazy val mylibJVM = project
lazy val myappJVM = project.dependsOn(mylibJVM)
lazy val mylibJS = project
lazy val myappJS = project.dependsOn(mylibJS)
Unfortunately this just creates new projects within the root instead of importing the subprojects themselves.
I've also tried various combinations of paths (such as)
lazy val mylibJVM = project.in(file("mylib/jvm"))
But this doesn't pick up the configuration from the build.sbt file in mylib.
Ultimately I keep running up against the same problem - when importing an existing multi-project SBT project into a parent sbt file, it imports the root project, but does not seem to provide a way to import a subproject from an existing multimodule SBT file in a way that lets me add dependsOn configuration to it.
tl;dr
If I have
root/mylib/build.sbt with multiple projects defined and
root/myapp/build.sbt with multiple projects defined
Is it possible to import individual subprojects into root/build.sbt instead of the root project from the submodule?
i.e. can I have two layers of multi-project builds?
After spending a lot of time digging through SBT source code, I managed to figure out a solution. This isn't clean, but it works. (For bonus points, it imports correctly into IntelliJ).
// Add this function to your root build.sbt file.
// It can be used to define a dependency between any
// `ProjectRef` without needing a full project definition.
def addDep(from: String, to: String) = {
  buildDependencies in Global <<= (
    buildDependencies in Global,
    thisProjectRef in from,
    thisProjectRef in to) {
    (deps, fromref, toref) =>
      deps.addClasspath(fromref, ResolvedClasspathDependency(toref, None))
  }
}
// `project` will import the `build.sbt` file
// in the subdirectory of the same name as the `lazy val`
// (performed by an SBT macro). i.e. `./mylib/build.sbt`
//
// This won't reference the actual subprojects directly,
// but will import them into the namespace such that they
// can be referenced as "ProjectRefs", which are implicitly
// converted to from strings.
//
// We then aggregate the JVM and JS ScalaJS projects
// into the new root project we've defined. (Which unfortunately
// won't inherit anything from the child build.sbt)
lazy val mylib = project.aggregate("mylibJVM","mylibJS")
lazy val myapp = project.aggregate("myappJVM","myappJS")
// Define a root project to aggregate everything
lazy val root = project.in(file(".")).aggregate(mylib,myapp)
// We now call our custom function to define a ClassPath dependency
// between `myapp` -> `mylib` for both JVM and JS subprojects.
// In particular, this will correctly find exported artifacts
// so that `myapp` can refer to `mylib` in libraryDependencies
// without needing to use `publishLocal`.
addDep("myappJVM", "mylibJVM")
addDep("myappJS","mylibJS")

How to create a custom package task to jar a subset of classes in SBT

I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only the subset of classes that form an API, which we need to share with other teams so they can write plugins for our application. So the end result will be two jars: one with the full application and a second one with a subset of the classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so that the original definition of packageBin is not changed. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
Which generates the complete jar in the same way as compile:packageBin would do. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can define a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However I would have to rewrite all of the packageBin functionality instead of reusing existing code. And in case that rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You could have it done as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

inConfig(PluginApi)(Defaults.compileSettings) // you have to have the standard compile settings in the new config

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that, if you keep your sources in src/main/scala you'll have to override unmanagedSourceDirectories in the newly created configuration.
Normally the unmanagedSourceDirectories contains the configuration name. E.g. src/pluginApi/scala or src/pluginApi/java.
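With those settings in place, pluginApi:packageBin should produce the jar containing only the API classes, while compile:packageBin keeps producing the full application jar, which gives you the two artifacts described in the question.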
I have had similar problems (with more than one jar per project). Our project uses Ant; there you can do this, but you will repeat yourself a lot.
However, I have come to the conclusion that this scenario (2 JARs for one project) actually can be simplified by splitting the project - i.e. making 2 modules out of it.
This way, I don't have to "fight" tools which assume project==artifact (like sbt, maybe maven?, IDEA's default setting,...).
As a bonus point the compiler helps me to verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package - when compiling everything together and only splitting classes apart in the JAR step, you do run the risk of getting an invalid dependency in your setup which you would only see when testing, because during compile time everything is compiled together.
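For reference, a minimal sketch of that split in sbt (module names and directories are illustrative, not taken from the question): the API gets its own subproject, the implementation depends on it, each produces its own jar through the ordinary packageBin, and the compiler enforces that the API never depends on the implementation:
// build.sbt -- two modules instead of carving two jars out of one module
lazy val pluginApi = (project in file("plugin-api"))
  .settings(name := "myapp-plugin-api")

lazy val core = (project in file("core"))
  .dependsOn(pluginApi) // the implementation sees the API, never the reverse
  .settings(name := "myapp-core")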

SBT: Exclude class from Jar

I am converting a legacy jar project to SBT and, for strange reasons that are not easily solved, this project comes with "javax/servlet/Servlet.class" inside it. So I need to somehow exclude this class from the jar file generated by package-bin. How do I accomplish this? Preferably I would like to exclude using a wildcard (i.e. javax.*).
The SBT assembly plugin does look like it has features that will do this, but I am worried that relying on sbt-assembly means that my jar project will not work in a multi-module project (i.e. if I include it as a dependency in a war file, then the war project needs to be told to run assembly on the dependent jar project rather than package-bin - but I may be mistaken here).
Each task declares the other tasks and settings that it uses. You can use inspect to determine these inputs as described on Inspecting Settings and in a recent tutorial-style blog post by John Cheng.
In this case, the relevant task used by packageBin is mappings. The mappings task collects the files to be included in the jar and maps them to the path in the jar. Some background is explained on Mapping Files, but the result is that mappings produces a value of type Seq[(File, String)]. Here, the File is the input file providing the content and the String is the path in the jar.
So, to modify the mappings for the packageBin task, filter out the paths from the default mappings that you don't want to include:
mappings in (Compile, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath != "javax/servlet/Servlet.class"
  }
}
mappings in (Compile,packageBin) selects the mappings for the main package task (as opposed to test sources or the packageSrc task).
x ~= f means "set x to the result of applying function f to the previous value of x". (See More About Settings for details.)
The filter drops all pairs where the path corresponds to the Servlet class.
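Since the question asks for a wildcard-style exclusion, the same technique works with a prefix match instead of an exact path; a small sketch (the javax/ prefix comes from the class mentioned in the question):
// Drop every entry whose path inside the jar starts with "javax/"
mappings in (Compile, packageBin) ~= { ms =>
  ms filter { case (_, toPath) => !toPath.startsWith("javax/") }
}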
I came up with this solution: it redefines the compile task to depend on the previous compile task, which effectively lets me hook in right after the sources are compiled and before they are packaged.
import scala.util.matching.Regex

def mySettings = {
  // add functionality to the standard compile task
  inConfig(Compile)(Seq(compile in Compile <<= (target, streams, compile in Compile) map {
    (targetDirectory, taskStream, analysis) =>
      import taskStream.log
      // this runs after compile but before package-bin
      recursiveListFiles(targetDirectory, ".*javax.*".r) foreach { file =>
        log.warn("deleting matched resource: " + file.getAbsolutePath())
        IO.delete(file)
      }
      analysis
  })) ++
    Seq(name := "MyProject", version := "1.0", exportJars := true)
}

def recursiveListFiles(f: File, r: Regex): Array[File] = {
  val these = f.listFiles
  val good = these.filter(f => r.findFirstIn(f.getName).isDefined)
  good ++ these.filter(_.isDirectory).flatMap(recursiveListFiles(_, r))
}
It's a little more complicated than I had hoped, but it lets me do all sorts of modifications prior to packaging (in this case, searching the target folder and deleting all class files that match a regular expression). It also accomplishes my second goal of sticking with the default SBT lifecycle.

How to create a compiler Action for SBT

I want to create an Action to automate GCJ compilation. Since I couldn't make it work with Ant, I decided to try SBT. The docs explain how to create an Action and how to run an external process. What I don't yet see is how to reuse the directory-tree traversal that exists for the Java and Scala compiler Actions. In this case my input files would be all the .class files under a certain root folder. I would also need to specify a specific classpath for GCJ. Any pointers for this would be appreciated too.
I haven't used GCJ much at all and I'm still pretty new at SBT, but this is how I believe you could write a quick task to do exactly what you are looking for with SBT 0.7.1. You can use a PathFinder to grab all of the class files like so:
val allClasses = (outputPath ##) ** "*.class"
Using that PathFinder and the "compileClasspath" top level method, you can construct a task like this which will run gcj using the current project's classpath and compose all of the .class files into one gcjFile:
val gcj = "/usr/local/bin/gcj"
val gcjFile = "target/my_executable.o"
val allClasses = (outputPath ##) ** "*.class"
lazy val gcjCompile = execTask {
  <x>{gcj} --classpath={compileClasspath.get.map(_.absolutePath).mkString(":")} -c {allClasses.get.map(_.absolutePath).mkString(" -c ")} -o {gcjFile}</x>
} dependsOn(compile) describedAs("Create a GCJ executable object")
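For clarity, the XML fragment above should expand to a shell command roughly of the form /usr/local/bin/gcj --classpath=dep1.jar:dep2.jar -c A.class -c B.class -o target/my_executable.o, with the actual class files and classpath entries taken from your project.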