How to exclude assembly from package in sbt? - scala

I have a build.scala file that has a section that looks something like the clip below.
I use sbt-assembly to build a jar file of all dependent libs for deployment.
This builds fine. My problem: when I run 'assembly' it builds the ~16MB core-deps.jar file, and when I then run 'package' to build core.jar, it builds core.jar but also overwrites my core-deps.jar with an empty file (because core-deps has no code of its own).
How can I build both core.jar and core-deps.jar and not have 'package' blow away core-deps.jar?
lazy val deps = Project("core-deps", file("."),
  settings = basicSettings ++
    sbtassembly.Plugin.assemblySettings ++
    Seq(assemblyOption in assembly ~= { _.copy(includeScala = false) }) ++
    addArtifact(Artifact("core-deps", "core-deps"), sbtassembly.Plugin.AssemblyKeys.assembly) ++
    Seq(
      libraryDependencies ++=
        // Master list of all used libraries so it gets added to the deps.jar file when you run assembly
        compile(commons_exec, commons_codec, commons_lang, casbah, googleCLHM, joda_time, scalajack, spray_routing, spray_can, spray_client, spray_caching, akka_actor, akka_cluster, akka_slf4j, prettytime, mongo_java, casbah_gridfs, typesafe_config, logback),
      jarName in assembly <<= (scalaVersion, version) map { (scalaVersion, version) =>
        "core-deps_" + scalaVersion.dropRight(2) + "-" + version + ".jar"
      }
    )) aggregate(core)
lazy val core = project
  .settings(basicSettings: _*)
  .settings(buildSettings: _*)
  .settings(libraryDependencies ++=
    compile(commons_exec, prettytime, commons_codec, casbah, googleCLHM, scalajack, casbah_gridfs, typesafe_config, spray_routing, spray_client, spray_can, spray_caching, akka_actor, akka_slf4j, akka_cluster, logback) ++
      test(scalatest, parboiled, spray_client)
  )

Why not use the assemblyPackageDependency task that comes with sbt-assembly? See Excluding Scala library, your project, or deps JARs.
If for some reason you really want to disable the package task in the core-deps project, you could try rewiring packageBin:
packageBin := (outputPath in assembly).value
That will do nothing but return the file name.
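For illustration, here is a minimal sketch of the first suggestion, assuming the sbt-assembly 0.12+ AutoPlugin key names (the build above uses the older sbtassembly.Plugin API, where the keys are named differently):
// Give the dependency-only jar its own name; `assemblyPackageDependency`
// then writes the deps jar while `package` writes the project jar, so
// neither task clobbers the other's output.
lazy val core = (project in file("core"))
  .settings(
    assemblyJarName in assemblyPackageDependency :=
      "core-deps_" + scalaVersion.value.dropRight(2) + "-" + version.value + ".jar"
  )
// At the sbt prompt:
//   core/assemblyPackageDependency   -> jar containing only the dependencies
//   core/package                     -> jar containing only the project classes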

Related

sbt: generating shared sources in cross-platform project

Building my Scala project with sbt, I want a task that runs prior to actual Scala compilation and generates a Version.scala file containing project version information. Here's the task I've come up with:
lazy val generateVersionTask = Def.task {
  // Generate the contents of Version.scala
  val contents = s"""package io.kaitai.struct
    |
    |object Version {
    |  val name = "${name.value}"
    |  val version = "${version.value}"
    |}
    |""".stripMargin

  // Write Version.scala into the managed sources directory
  val file = (sourceManaged in Compile).value / "version" / "Version.scala"
  println(s"Version file generated: $file")
  IO.write(file, contents)
  Seq(file)
}
This task seems to work, but the problem is how to plug it in, given that it's a cross project, targeting Scala/JVM, Scala/JS, etc.
This is how build.sbt looked before I started touching it:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
and, on the filesystem, I have:
shared/ — cross-platform code shared between JS/JVM builds
jvm/ — JVM-specific code
js/ — JS-specific code
The best I've come up with so far is adding this task to the foo crossProject:
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    sourceGenerators in Compile += generateVersionTask.taskValue, // <== !
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)
This works, but in a very awkward way that isn't really compatible with the "shared" codebase: it generates 2 distinct Version.scala files for JS and JVM:
sbt:root> compile
Version file generated: /foo/js/target/scala-2.12/src_managed/main/version/Version.scala
Version file generated: /foo/jvm/target/scala-2.12/src_managed/main/version/Version.scala
Naturally, it's impossible to access the contents of these files from shared, and that is exactly where I want to access them.
So far, I've come up with a very sloppy workaround:
There is a var declared in a singleton object in shared
In both JVM and JS main entry points, the very first thing I do is assign that var to match the constants defined in Version.scala
Also, I've tried the same trick with the sbt-buildinfo plugin: the result is exactly the same, it generates a per-platform BuildInfo.scala, which I can't use directly from shared sources.
Are there any better solutions available?
Consider pointing sourceManaged to the shared/src/main/scala/src_managed directory and scoping generateVersionTask to the root project, like so:
val sharedSourceManaged = Def.setting(
  baseDirectory.value / "shared" / "src" / "main" / "scala" / "src_managed"
)

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {},
    sourceManaged := sharedSourceManaged.value,
    sourceGenerators in Compile += generateVersionTask.taskValue,
    cleanFiles += sharedSourceManaged.value
  )
Now sbt compile should output something like
Version file generated: /Users/mario/IdeaProjects/scalajs-cross-compile-example/shared/src/main/scala/src_managed/version/Version.scala
...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/js/target/scala-2.12/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/scalajs-cross-compile-example/target/scala-2.12/classes ...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/jvm/target/scala-2.12/classes ...
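Since the generated file now lives under shared/, the cross-platform sources can reference it directly. A hypothetical consumer in shared/src/main/scala (the Banner object is invented for this example):
package io.kaitai.struct

// Compiles for both the JVM and JS targets, because Version.scala is
// generated into the shared source tree rather than a per-platform one.
object Banner {
  val text: String = s"${Version.name} ${Version.version}"
}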

SBT - Multi project merge strategy and build sbt structure when using assembly

I have a project that consists of multiple smaller projects, some with dependencies upon each other, for example, there is a utility project that depends upon commons project.
Other projects may or may not depend upon utilities or commons or neither of them.
In the build.sbt I have the assembly merge strategy at the end of the file, along with test in assembly := {} to skip tests during assembly.
My question is: is this correct? Should each project have its own merge strategy, and if so, will the projects that depend on it inherit that strategy? Having the merge strategy repeated within every project definition seems clunky and would mean a lot of duplicated code.
The same question applies to the tests: should each project carry its own line for whether tests are run, or will that also be inherited?
Thanks in advance. If anyone knows of a link to a sensible (relatively complex) example that'd also be great.
In my day job I currently work on a large multi-project build. Unfortunately it's closed source, so I can't share specifics, but I can share some guidance.
Create a rootSettings used only by the root/container project, since it usually isn't part of an assembly or publish step. It would contain something like:
lazy val rootSettings = Seq(
  publishArtifact := false,
  publishArtifact in Test := false
)
Create a commonSettings shared by all the subprojects. Place the base/shared assembly settings here:
lazy val commonSettings = Seq(
  // We use a common directory for all of the artifacts
  assemblyOutputPath in assembly := baseDirectory.value /
    "assembly" / (name.value + "-" + version.value + ".jar"),
  // This is really a one-time, global setting if all projects
  // use the same folder, but should be here if modified for
  // per-project paths.
  cleanFiles += baseDirectory.value / "assembly",
  test in assembly := {},
  assemblyMergeStrategy in assembly := {
    case "BUILD"       => MergeStrategy.discard
    case "logback.xml" => MergeStrategy.first
    case other: Any    => MergeStrategy.defaultMergeStrategy(other)
  },
  assemblyExcludedJars in assembly := {
    val cp = (fullClasspath in assembly).value
    cp filter { _.data.getName.matches(".*finatra-scalap-compiler-deps.*") }
  }
)
Each subproject uses commonSettings, and applies project-specific overrides:
lazy val fubar = project.in(file("fubar-folder-name"))
  .settings(commonSettings: _*)
  .settings(
    // Project-specific settings here.
    assemblyMergeStrategy in assembly := {
      // The fubar-specific strategy
      case "fubar.txt" => MergeStrategy.discard
      case other: Any =>
        // Apply the inherited "common" strategy
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(other)
    }
  )
  .dependsOn(
    yourCoreProject,
    // ...
  )
And BTW, if you're using IntelliJ, don't name your root project variable root, as that is what appears as the project name in the recent projects menu.
lazy val myProjectRoot = project.in(file("."))
  .settings(rootSettings: _*)
  .settings(
    // ...
  )
  .dependsOn(
    // ...
  )
  .aggregate(
    fubar,
    // ...
  )
You may also need to add a custom strategy for combining reference.conf files (for the Typesafe Config library):
val reverseConcat: sbtassembly.MergeStrategy = new sbtassembly.MergeStrategy {
  val name = "reverseConcat"
  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    MergeStrategy.concat(tempDir, path, files.reverse)
}

assemblyMergeStrategy in assembly := {
  case "reference.conf" => reverseConcat
  case other            => MergeStrategy.defaultMergeStrategy(other)
}
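The reversal is the point of this strategy: MergeStrategy.concat joins the reference.conf files in classpath order, and when the same key appears twice in a HOCON file the later occurrence wins, so concatenating in reverse order lets configuration from JARs earlier on the classpath take precedence.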

sbt: publish generated sources

I have a project where part of the sources are generated (sourceGenerators in Compile). I noticed that these sources are not published with publishLocal or publishSigned, which is reasonable in most scenarios. Here it is unfortunate, because when you use this project/library as a dependency, you cannot look up the generated sources, for example in IntelliJ, even if the other sources of the project have been downloaded.
Can I configure sbt's publishing settings to include the generated sources in the Maven -sources.jar?
So, just to be complete, this was my solution based on #pfn's answer:
mappings in (Compile, packageSrc) ++= {
  val base  = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
}
For reference, #pfn's answer was the one-liner:
mappings in (Compile, packageSrc) := (managedSources in Compile).value map (s => (s, s.getName))
Just like #0__'s answer, but ported to the 'new' sbt syntax, i.e. without deprecation warnings.
Compile / packageSrc / mappings ++= {
  val base  = (Compile / sourceManaged).value
  val files = (Compile / managedSources).value
  files.map(f => (f, f.relativeTo(base).get.getPath))
}
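To check which files will actually land in the -sources.jar, you can inspect the task from the sbt shell (shown in the sbt 1.x slash syntax used in the last snippet):
show Compile/packageSrc/mappings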

How to have sbt multi-project builds configure setting for subprojects?

I have an sbt (0.13.1) project with a bunch of subprojects. I am generating Eclipse project configurations using sbteclipse. My projects only have Scala source files, so I want to remove the generated src/java folders.
I can achieve that by (redundantly) adding the following to the build.sbt of each subproject:
unmanagedSourceDirectories in Compile := (scalaSource in Compile).value :: Nil
unmanagedSourceDirectories in Test := (scalaSource in Test).value :: Nil
I tried just adding the above configuration to the root build.sbt but the eclipse command still generated the java source folders.
Is there any way to specify a configuration like this once (in the root build.sbt) and have it flow down to each subproject?
You could define the settings once, unscoped to any project, and then reuse them:
val onlyScalaSources = Seq(
  unmanagedSourceDirectories in Compile := Seq((scalaSource in Compile).value),
  unmanagedSourceDirectories in Test := Seq((scalaSource in Test).value)
)

val project1 = project.in(file("project1"))
  .settings(onlyScalaSources: _*)

val project2 = project.in(file("project2"))
  .settings(onlyScalaSources: _*)
You could also create a simple plugin (untested code):
import sbt._
import sbt.Keys._

object OnlyScalaSources extends AutoPlugin {
  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    unmanagedSourceDirectories in Compile := Seq((scalaSource in Compile).value),
    unmanagedSourceDirectories in Test := Seq((scalaSource in Test).value)
  )
}
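A note on placement, assuming standard sbt conventions: if this object is saved as a Scala file under the project/ directory (for example project/OnlyScalaSources.scala, a path chosen purely for illustration), the allRequirements trigger makes the settings apply to every subproject without any explicit enablePlugins call.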
More details about creating plugins can be found in the plugins documentation.

How to get list of dependency jars from an sbt 0.10.0 project

I have an sbt 0.10.0 project that declares a few dependencies, somewhat like:
object MyBuild extends Build {
  val commonDeps = Seq(
    "commons-httpclient" % "commons-httpclient" % "3.1",
    "commons-lang" % "commons-lang" % "2.6"
  )

  val buildSettings = Defaults.defaultSettings ++ Seq(organization := "org")

  lazy val proj = Project("proj", file("src"),
    settings = buildSettings ++ Seq(
      name := "projname",
      libraryDependencies := commonDeps, ...)
  ...
}
I wish to create a build rule to gather all the jar dependencies of "proj", so that I can symlink them to a single directory.
Thanks.
Example SBT task to print full runtime classpath
Below is roughly what I'm using. The "get-jars" task is executable from the SBT prompt.
import sbt._
import Keys._

object MyBuild extends Build {
  // ...

  val getJars = TaskKey[Unit]("get-jars")

  val getJarsTask = getJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
    println("Target path is: " + target)
    println("Full classpath is: " + cp.map(_.data).mkString(":"))
  }

  lazy val project = Project(
    "project",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(getJarsTask)
  )
}
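Since the original goal was to collect the jars into a single directory, the same task shape can be extended to copy them there. A sketch in the same sbt 0.10-era style, meant to sit next to getJarsTask inside the Build object; the copy-jars key name and the target/deps folder are invented for this example:
val copyJars = TaskKey[Unit]("copy-jars")
val copyJarsTask = copyJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
  val dir = target / "deps"
  IO.createDirectory(dir)
  // Copy every runtime jar into target/deps; swap IO.copyFile for a
  // symlink call if real symlinks are required.
  cp.map(_.data).filter(_.getName.endsWith(".jar")).foreach { jar =>
    IO.copyFile(jar, dir / jar.getName)
  }
}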
Other resources
Unofficial guide to sbt 0.10.
Keys.scala defines predefined keys. For example, you might want to replace fullClasspath with managedClasspath.
This plugin defines a simple command to generate an .ensime file, and may be a useful reference.