How do I exclude a package from publishing with sbt? - scala

I want to publish a library which has some usage examples in runnable classes. When I call sbt run, it finds them, asks me which of the main classes I want, and then launches it. That's neat, and I'd like this behaviour to stay. But those examples complicate my Android build (more proguard configs), so I don't want them in the published artefacts.
For now, I exclude them entirely by putting this into build.sbt:
excludeFilter in Compile ~= { _ ||
  new FileFilter {
    def accept(f: File) = f.getPath.containsSlice("/examples/")
  } }
Then, when I run sbt publish-local, I get jars without the examples, but anyone who fetches the library source can no longer see how it works just by typing sbt run. How can I exclude the examples package from publishing only, while still having it compiled for local runs?

I'd recommend splitting the examples into a separate subproject instead.
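A minimal sketch of what that could look like, assuming sbt 1 (publish / skip); the project names are placeholders, and on sbt 0.13 you would set publishArtifact := false and no-op publish/publishLocal instead:

// build.sbt -- hypothetical layout
lazy val library = (project in file("library"))
  .settings(
    name := "my-library"
    // usual publish settings go here
  )

lazy val examples = (project in file("examples"))
  .dependsOn(library)
  .settings(
    publish / skip := true  // compiled and runnable locally, never published
  )

lazy val root = (project in file("."))
  .aggregate(library, examples)
  .settings(publish / skip := true)

Running the examples is then a matter of sbt examples/run, while publish and publishLocal on the root skip the examples jar entirely.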

Related

How can I get the build output directory inside tests run by sbt?

I need to create directories and files for some tests. My project uses sbt as the build tool, and common practice is to use File.createTempFile or similar APIs, but I abhor that practice. I want all files created by my tests to reside somewhere inside the output directory (<module>/target/), so that they'll be removed when I run clean, but otherwise preserved if I have need of them to figure out test failures.
The test framework is not relevant: if your solution requires a particular framework, I'll happily adopt it or figure out how it does the trick and use that.
In short, I need the answer to one of these two questions:
How can I create a file inside the build output directory from a test run by sbt?
How can I find out what the build output directory is for the current project from a test run by sbt?
In ScalaTest, try passing the value of sbt's target setting, defined as
settingKey[File]("Main directory for files generated by the build.")
to the config map as a -Dkey=value argument. For example, in build.sbt specify
Test / testOptions += Tests.Argument(s"-DtargetDir=${target.value}")
and then define the test like so
import org.scalatest._

class ExampleSpec extends fixture.FlatSpec with fixture.ConfigMapFixture with Matchers {
  "The config map" should "contain target directory used by sbt" in { configMap =>
    configMap should contain key "targetDir"
  }
}
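From there, a small sketch of using that value to keep test files under target/ instead of a system temp directory; the test would live in the same spec, and the key name targetDir matches the -D argument above:

import java.nio.file.{Files, Paths}

"A test that writes files" should "put them under sbt's target directory" in { configMap =>
  // getRequired fails with a clear error if -DtargetDir was not passed through testOptions
  val targetDir = configMap.getRequired[String]("targetDir")
  val scratchDir = Paths.get(targetDir, "test-scratch")
  Files.createDirectories(scratchDir)
  // anything written here is removed by `sbt clean` but kept around after the test run
  Files.write(scratchDir.resolve("example.txt"), "hello".getBytes("UTF-8"))
}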

How to create a custom package task to jar a subset of classes in SBT

I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only a subset of classes that make up an API, which we need to share with other teams so they can write plugins for our application. The end result will be two jars: one with the full application and a second one with the subset of classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so that the original definition of packageBin stays unchanged. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
which generates the complete jar in the same way compile:packageBin would. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can define a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However, I would then have to rewrite all of the packageBin functionality instead of reusing the existing code. Also, if rewriting it is unavoidable, I am not sure what that implementation should look like.
Could somebody provide an example about how to achieve this?
You could do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)
inConfig(PluginApi)(Defaults.compileSettings) // you have to have standard
mappings in (PluginApi, packageBin) := {
val original = (mappings in (PluginApi, packageBin)).value
original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}
unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala you'll have to override unmanagedSourceDirectories in the newly created configuration, as shown above.
Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
I have had similar problems (with more than one jar per project). Our project uses ant - there you can do it, but you will repeat yourself a lot.
However, I have come to the conclusion that this scenario (two JARs for one project) can actually be simplified by splitting the project - i.e. making two modules out of it.
This way, I don't have to "fight" tools which assume project==artifact (like sbt, maybe maven?, IDEA's default setting,...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and the classes are only split apart at the JAR step, you run the risk of an invalid dependency in your setup that you only notice when testing, because at compile time everything is compiled together. A sketch of such a split is shown below.
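A minimal sketch of that split in sbt; module and artifact names are placeholders. Because impl depends on api but not the other way around, an accidental api -> impl reference fails at compile time:

// build.sbt -- hypothetical two-module layout
lazy val api = (project in file("api"))
  .settings(name := "myapp-api")

lazy val impl = (project in file("impl"))
  .dependsOn(api)   // impl -> api is allowed
  .settings(name := "myapp")

// api has no dependsOn(impl), so code in api that references impl classes will not compile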

sbt subproject aggregation and dependency behavior

I have an sbt project with a few subprojects, each of which publishes some artifacts and has a fairly extensive test suite.
When I run the build on my CI server, I want to publish the artifacts to a staging location and run the tests after the publishing task. Since others may want the artifacts, I'd like to tell sbt that I want it to build all the artifacts for all subprojects, then run all the tests, since by default it seems to run them interleaved in an unspecified order.
I have a ScopeFilter giving me access to all my subprojects, so I can make my ciBuild task depend on something like the following
(test in Test).all(subprojectScopeFilter).dependsOn(myArtifactsTask.all(subprojectScopeFilter))
However, that doesn't seem to have any real effect on the order, and I definitely see some subprojects running tests before others have run their myArtifactsTask. I'm guessing that I don't fully understand how all works, and that it might mean each individual subproject's test task depends on that same subproject's myArtifactsTask? If that's the case, how can I specify what I want? Is it documented somewhere that I've missed? The manual describes the basics of all but not how it interacts with other constructs.
sbt will automatically resolve the order between tasks and projects and build them in that order.
What you could do is the following. Let's assume you have three projects: a root and two sub-projects, with the key myArtifactTask defined in the root.
project/Build.scala
object MyBuild extends Build {
  val myArtifactTask = TaskKey[Unit]("my-artifact-task", "My Artifact Task")
}
The myArtifactTask is implemented in both sub-projects.
subproject-a/build.sbt
myArtifactTask := {
  println("myArtifactTask:project-a")
}
subproject-b/build.sbt
myArtifactTask := {
  println("myArtifactTask:project-b")
}
What you want to do is define your root's build.sbt so that it calls myArtifactTask in both projects. Then you can define a new task, testedArtifact, which depends on myArtifactTask.
build.sbt
lazy val testedArtifact = taskKey[Unit]("Runs myArtifactTask followed by tests")

lazy val inAnyProjectButRoot: ScopeFilter = ScopeFilter(
  inAnyProject -- inProjects(ThisProject)
)

myArtifactTask := {
  myArtifactTask.all(inAnyProjectButRoot).value
}

testedArtifact := {
  (test in Test).all(inAnyProjectButRoot).value
}

testedArtifact <<= testedArtifact.dependsOn(myArtifactTask)
Now calling testedArtifact in the root project will first run all the myArtifactTask tasks in the sub-projects, followed by the tests.

How to "include" a common sbt snippet in another sbt file

Let's say I have a common snippet of statements that I find myself using in many projects. Is there a way to "include" a shared sbt snippet inside another (without writing a plugin)?
e.g.
Snippet (common-mapping.sbt)
mappings in Universal ++= {
  for (f <- (baseDirectory.value ** "*-prod.conf").get) yield {
    f -> f.getName.replaceAll("""(\w+)-prod\.conf""", "$1.conf")
  }
}.toSeq
Project1's build.sbt
...
include("path/to/common-mapping.sbt")
...
Project2's build.sbt
...
include("path/to/common-mapping.sbt")
...
Is there a way to do so? or do I need to write a plugin?
p.s. the projects are not necessarily part of the same root project
A plugin is designed to solve exactly this problem, so it's the way to go. Plugins are basically JAR libraries that are designed to be used by builds, and not much else. Also take a look at auto plugins, which will be out in 0.13.5.
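A rough sketch of what the shared snippet could look like as an auto plugin; the object and file names are placeholders, and if the snippet touches mappings in Universal the plugin would also need to depend on sbt-native-packager:

// CommonMappingsPlugin.scala in a separate sbt-plugin project, hypothetical names
import sbt._
import Keys._

object CommonMappingsPlugin extends AutoPlugin {
  // enable automatically for every project that loads the plugin
  override def trigger = allRequirements

  override def projectSettings: Seq[Setting[_]] = Seq(
    // put the shared settings here, e.g. the *-prod.conf renaming from the question
  )
}

Once published as an sbt plugin, each build pulls it in with a single addSbtPlugin(...) line in project/plugins.sbt instead of copying the snippet around, and the projects do not need to share a root project.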

Filtering packages from unmanagedSources in SBT

This should be very easy, but I am missing something. I apologize for the too-basic question.
I am reorganizing some code. I'd like to get the main package fixed first, and then I'll have to modify code in some packages that depend upon the main package. Temporarily, I'd like those dependent packages not to be compiled in my sbt ~compile loop.
I know there exists a setting, excludeFilter in Compile in unmanagedSources, but I don't know what syntax I should use to keep whatever default exclusions are there while adding new exclusions for (deeply nested) source directories that correspond to the dependent packages.
Many thanks for any help!
Here's a working example that excludes anything with any parent directory named foo:
Compile / unmanagedSources / excludeFilter ~= { _ ||
  new FileFilter {
    def accept(f: File) = f.getPath.containsSlice("/foo/")
  } }
(Updated to use sbt 1 style syntax.)
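To check what is still being picked up after adding the filter, you can list the remaining sources from the sbt shell; this prints the files that will actually be compiled:

show Compile/unmanagedSources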