This should be very easy, but I am missing something. I apologize for the too-basic question.
I am reorganizing some code. I'd like to get the main package fixed first, and then I'll modify the code in the packages that depend on it. Temporarily, I'd like those dependent packages not to be compiled during my sbt ~compile runs.
I know there is a setting, excludeFilter in Compile in unmanagedSources, but I don't know what syntax to use to keep whatever default exclusions are there while adding new exclusions for the (deeply nested) source directories that correspond to the dependent packages.
Many thanks for any help!
Here's a working example that excludes anything with any parent directory named foo:
Compile / unmanagedSources / excludeFilter ~= { current =>
  // keep the existing exclusions and OR in a filter that rejects any file
  // with a parent directory named foo anywhere in its path
  current || new FileFilter {
    def accept(f: File) = f.getPath.containsSlice("/foo/")
  }
}
(Updated to use sbt 1 slash syntax. Because ~= transforms the previous value, the default exclusions are preserved and the new filter is OR-ed on top.)
I have a build setup with multiple groups of dependent modules. I wrote a function that produces one group of modules:
def group(id: String) = {
  val module1 = project.in(file(s"core/$id"))...
  val module2 = project.in(file(s"impl/$id")).dependsOn(module1)...
  (module1, module2)
}
I would now like to declare them:
val (core2014, impl2014) = group("2014")
This does not appear to work in build.sbt:
Pattern matching in val statements is not supported
I tried moving it into project/Build.scala, where it compiles fine, but the modules don't appear in the SBT prompt. (That is, typing core2014/compile gives "not a valid key".)
Is there any way I can add modules to the build "manually", instead of relying on the autodetection of SBT?
I'm going to guess the answer is "no" for build.sbt.
But you can override the project list in your project/Build.scala.
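For example, here is a minimal sketch of an sbt 0.13-style project/Build.scala (the group body is illustrative). The Build trait's projects method is normally filled in by reflection over top-level Project vals, so overriding it registers the generated modules explicitly:

import sbt._

object MyBuild extends Build {
  def group(id: String): (Project, Project) = {
    val core = Project(s"core$id", file(s"core/$id"))
    val impl = Project(s"impl$id", file(s"impl/$id")).dependsOn(core)
    (core, impl)
  }

  // Pattern-matching vals compile fine here, but sbt's reflection-based
  // project detection misses them, so list the modules explicitly:
  val (core2014, impl2014) = group("2014")

  override def projects: Seq[Project] = Seq(core2014, impl2014)
}

With the override in place, core2014/compile becomes a valid key at the prompt.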
I want to publish a library which has some usage examples in runnable classes. When I call sbt run, it finds them, asks me which of the main classes found I want, and then launches it. That's neat; I'd like this behaviour to stay. But those examples complicate my Android build (more ProGuard configs), so I don't want them in the published artefacts.
For now, I totally exclude them by putting this into build.sbt:
excludeFilter in Compile ~= { current =>
  // reject any file with a parent directory named examples in its path
  current || new FileFilter {
    def accept(f: File) = f.getPath.containsSlice("/examples/")
  }
}
then, when I run sbt publish-local, I get jars without the examples, but then nobody can grab the library source and see how it works by just typing sbt run. How can I exclude the examples package from publishing only, but let it still be compiled for local runs?
I'd recommend splitting examples into another subproject instead.
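A sketch of that layout (project names are illustrative; publish / skip is the sbt 1 setting that keeps a module out of publishing):

// build.sbt: the examples live in their own module that is never published
lazy val library = project.in(file("library"))

lazy val examples = project.in(file("examples"))
  .dependsOn(library)
  .settings(
    publish / skip := true  // still compiles and runs locally via examples/run
  )

sbt examples/run still offers the main-class chooser, while publish and publishLocal only ship the library jar.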
I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only a subset of classes conforming to an API that we need to be able to share with other teams so they can write plugins for our application. So the end result will be two jars: one with the full application and a second one with a subset of the classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so it does not change the original definition of packageBin. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi").extend(Compile).describedAs("Custom plugin api configuration")

lazy val root = (project in file(".")).overrideConfigs(PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
which generates the complete jar in the same way compile:packageBin would. I then try to modify the mappings of the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can define a new packageBin from scratch within the new scope, like:
packageBin in PluginApi := {
  // a full reimplementation would go here
}
However, I would have to rewrite all of the packageBin functionality instead of reusing existing code. Also, if that rewriting is unavoidable, I am not sure what the implementation would look like.
Could somebody provide an example about how to achieve this?
You could do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

// you have to have the standard compile settings in the new configuration
inConfig(PluginApi)(Defaults.compileSettings)

mappings in (PluginApi, packageBin) := {
  // reuse the value computed by the previous definition, then filter it
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala, you'll have to override unmanagedSourceDirectories in the newly created configuration, as the last line above does.
Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
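For reference, a sketch of the layout the new configuration would look for by default, next to the standard one:

src/
  main/scala/        // sources in the Compile configuration
  pluginApi/scala/   // where the pluginApi configuration looks by default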
I have had similar problems (with more than one jar per project). Our project uses Ant, where you can do this, but you end up repeating yourself a lot.
However, I have come to the conclusion that this scenario (two JARs for one project) can actually be simplified by splitting the project, i.e. making two modules out of it.
This way, I don't have to "fight" tools which assume project == artifact (like sbt, maybe Maven?, IDEA's default settings, ...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and the classes are only split apart at the JAR step, you risk an invalid dependency in your setup that you would only notice when testing, because at compile time everything is visible to everything else.
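A sketch of what that split looks like in sbt (module names are illustrative):

lazy val api = project.in(file("api"))

lazy val impl = project.in(file("impl"))
  .dependsOn(api)  // api compiles on its own, so it can never reference impl

Packaging then falls out for free: each module produces its own jar via its own packageBin.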
Let's say I have a common snippet of statements that I find myself having in many projects. Is there a way to "include" a shared sbt snippet inside another (without writing a plugin)?
e.g.
Snippet (common-mapping.sbt)
mappings in Universal ++= {
  for (f <- (baseDirectory.value ** "*-prod.conf").get) yield {
    f -> f.getName.replaceAll("""(\w+)-prod\.conf""", "$1.conf")
  }
}.toSeq
Project1's build.sbt
...
include("path/to/common-mapping.sbt")
...
Project2's build.sbt
...
include("path/to/common-mapping.sbt")
...
Is there a way to do so, or do I need to write a plugin?
p.s. the projects are not necessarily part of the same root project
A plugin is designed to solve exactly this problem, so it's the way to go. Plugins are basically JAR libraries that are designed to be used by builds, and not much else. Also take a look at auto plugins, which will be out in 0.13.5.
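As a sketch, the shared snippet could live in an auto plugin like this (sbt 0.13.5+; the plugin name is illustrative, and the body of projectSettings stands in for the actual shared settings):

import sbt._
import Keys._

object CommonMappingsPlugin extends AutoPlugin {
  // apply to every project without an explicit enablePlugins call
  override def trigger = allRequirements

  override def projectSettings: Seq[Setting[_]] = Seq(
    // the mappings logic from common-mapping.sbt would go here
  )
}

Published as its own plugin project, both builds can then pull it in from project/plugins.sbt with addSbtPlugin.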
I am converting a legacy jar project to SBT and, for strange reasons that are not easily solved, this project comes with "javax/servlet/Servlet.class" inside it. So I need to somehow exclude this class from the jar file generated by package-bin. How do I accomplish this? Preferably I would like to exclude using a wildcard (i.e. javax.*).
The SBT assembly plugin does look like it has features that will do this, but I am worried that relying on sbt-assembly means my jar project will not work in a multi-module project (i.e. if I include it as a dependency in a war file, then the war project needs to be told to run assembly on the dependent jar project rather than package-bin; but I may be mistaken here).
Each task declares the other tasks and settings that it uses. You can use inspect to determine these inputs as described on Inspecting Settings and in a recent tutorial-style blog post by John Cheng.
In this case, the relevant task used by packageBin is mappings. The mappings task collects the files to be included in the jar and maps them to the path in the jar. Some background is explained on Mapping Files, but the result is that mappings produces a value of type Seq[(File, String)]. Here, the File is the input file providing the content and the String is the path in the jar.
So, to modify the mappings for the packageBin task, filter out the paths from the default mappings that you don't want to include:
mappings in (Compile, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath != "javax/servlet/Servlet.class"
  }
}
mappings in (Compile, packageBin) selects the mappings for the main package task (as opposed to test sources or the packageSrc task).
x ~= f means "set x to the result of applying function f to the previous value of x". (See More About Settings for details.)
The filter drops all pairs where the path corresponds to the Servlet class.
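To get the wildcard behaviour the question asks for (dropping everything under javax), filter on a path prefix instead of an exact match; a sketch:

mappings in (Compile, packageBin) ~= { (ms: Seq[(File, String)]) =>
  // drop every entry whose path inside the jar starts with javax/
  ms filterNot { case (file, toPath) => toPath.startsWith("javax/") }
}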
I came up with this solution: it redefines the compile task to depend on the previous compile task (thus effectively allowing me to hook in right after the sources are compiled and before they are packaged).
def mySettings = {
  // add functionality to the standard compile task
  inConfig(Compile)(Seq(compile in Compile <<= (target, streams, compile in Compile) map {
    (targetDirectory, taskStream, analysis) =>
      import taskStream.log
      // this runs after compile but before package-bin
      recursiveListFiles(targetDirectory, ".*javax.*".r) foreach { file =>
        log.warn("deleting matched resource: " + file.getAbsolutePath())
        IO.delete(file)
      }
      analysis
  })) ++
    Seq(name := "MyProject", version := "1.0", exportJars := true)
}
import scala.util.matching.Regex

def recursiveListFiles(f: File, r: Regex): Array[File] = {
  // listFiles returns null for non-directories, so guard against that
  val these = Option(f.listFiles).getOrElse(Array.empty)
  val good = these.filter(file => r.findFirstIn(file.getName).isDefined)
  good ++ these.filter(_.isDirectory).flatMap(recursiveListFiles(_, r))
}
It's a little more complicated than I had hoped, but it allows me to do all sorts of modifications prior to packaging (in this case, searching the target folder and deleting all class files that match a regular expression). It also accomplishes my second goal of sticking with the default SBT lifecycle.
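A gentler variant of the same idea, if deleting compiled output is undesirable (it can confuse later incremental compiles): filter the packaging mappings as in the accepted answer above, but with the regular expression; a sketch:

mappings in (Compile, packageBin) ~= { ms =>
  // the class files stay on disk; matching paths are just left out of the jar
  ms filterNot { case (_, toPath) => ".*javax.*".r.findFirstIn(toPath).isDefined }
}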