I have a project where some of the sources are generated (sourceGenerators in Compile). I noticed that these generated sources are not published by publishLocal or publishSigned (which is reasonable in most scenarios). Here it is unfortunate, because when you use this project/library as a dependency, you cannot look up those sources, for example in IntelliJ, even though the other sources of the project have been downloaded.
Can I configure sbt's publishing settings to include the generated sources in the Maven -sources.jar?
So, just to be complete, this was my solution, based on @pfn's answer:
mappings in (Compile, packageSrc) ++= {
  val base  = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  // Pair each generated source with its path relative to sourceManaged,
  // so the directory structure is preserved inside the -sources.jar.
  files.map { f => (f, f.relativeTo(base).get.getPath) }
}
// Note: using getName flattens the directory structure inside the jar;
// the variant above keeps paths relative to sourceManaged instead.
mappings in (Compile, packageSrc) := (managedSources in Compile).value map (s => (s, s.getName))
Just like @0__'s answer, but ported to the 'new' sbt syntax, i.e. without deprecation warnings.
Compile / packageSrc / mappings ++= {
  val base  = (Compile / sourceManaged).value
  val files = (Compile / managedSources).value
  files.map(f => (f, f.relativeTo(base).get.getPath))
}
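To verify the change, you can list the mappings from the sbt shell; the generated files should now show up with their sourceManaged-relative paths (sbt 1.x syntax):

> show Compile/packageSrc/mappings

After that, packageSrc, and hence publishLocal or publishSigned, will include the generated sources in the -sources.jar.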
I have a project that consists of multiple smaller projects, some with dependencies upon each other; for example, there is a utilities project that depends upon a commons project.
Other projects may depend upon utilities or commons, or on neither of them.
In the build.sbt I have the assembly merge strategy at the end of the file, along with test in assembly := {} to skip tests during assembly.
My question is: is this correct? Should each project have its own merge strategy, and if so, will the projects that depend on it inherit that strategy? Having the merge strategy repeated in every project definition seems clunky and would mean a lot of duplicated code.
The same question applies to the tests: should each project have the line controlling whether tests are run during assembly, or will that also be inherited?
Thanks in advance. If anyone knows of a link to a sensible (relatively complex) example, that'd also be great.
In my day job I currently work on a large multi-project build. Unfortunately it's closed source, so I can't share specifics, but I can share some guidance.
Create a rootSettings used only by the root/container project, since it usually isn't part of an assembly or publish step. It would contain something like:
lazy val rootSettings = Seq(
  publishArtifact := false,
  publishArtifact in Test := false
)
Create a commonSettings shared by all the subprojects. Place the base/shared assembly settings here:
lazy val commonSettings = Seq(
  // We use a common directory for all of the artifacts
  assemblyOutputPath in assembly := baseDirectory.value /
    "assembly" / (name.value + "-" + version.value + ".jar"),
  // This is really a one-time, global setting if all projects
  // use the same folder, but should be here if modified for
  // per-project paths.
  cleanFiles += baseDirectory.value / "assembly",
  test in assembly := {},
  assemblyMergeStrategy in assembly := {
    case "BUILD"       => MergeStrategy.discard
    case "logback.xml" => MergeStrategy.first
    case other         => MergeStrategy.defaultMergeStrategy(other)
  },
  assemblyExcludedJars in assembly := {
    val cp = (fullClasspath in assembly).value
    cp filter { _.data.getName.matches(".*finatra-scalap-compiler-deps.*") }
  }
)
Each subproject uses commonSettings, and applies project-specific overrides:
lazy val fubar = project.in(file("fubar-folder-name"))
  .settings(commonSettings: _*)
  .settings(
    // Project-specific settings here.
    assemblyMergeStrategy in assembly := {
      // The fubar-specific strategy
      case "fubar.txt" => MergeStrategy.discard
      case other =>
        // Fall back to the inherited "common" strategy
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(other)
    }
  )
  .dependsOn(
    yourCoreProject
    // ...
  )
And by the way, if you're using IntelliJ, don't name your root project variable root, as that is what appears as the project name in the recent-projects menu.
lazy val myProjectRoot = project.in(file("."))
  .settings(rootSettings: _*)
  .settings(
    // ...
  )
  .dependsOn(
    // ...
  )
  .aggregate(
    fubar
    // ...
  )
You may also need to add a custom strategy for combining reference.conf files (for the Typesafe Config library):
val reverseConcat: sbtassembly.MergeStrategy = new sbtassembly.MergeStrategy {
  val name = "reverseConcat"
  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    MergeStrategy.concat(tempDir, path, files.reverse)
}

assemblyMergeStrategy in assembly := {
  case "reference.conf" => reverseConcat
  case other            => MergeStrategy.defaultMergeStrategy(other)
}
I have a Build with two projects in it.
I want to make the root project's classpath depend on the subproject, but only in a certain configuration. Simplified project config:
Subproject:
object HttpBuild {
  import Dependencies._

  lazy val http: Project = Project(
    "http",
    file("http"),
    settings =
      CommonSettings.settings ++
      Seq(
        version := "0.2-SNAPSHOT",
        crossPaths := false,
        libraryDependencies ++= akkaActor +: spray) ++
      Packaging.defaultPackageSettings
  )
}
Root:
object RootBuild extends Build {
  import HttpBuild._

  lazy val http = HttpBuild.http
  lazy val MyConfig = config("myconfig") extend Compile

  private val defaultSettings = Defaults.coreDefaultSettings

  lazy val api = Project("root", file("."))
    .configs(MyConfig)
    .settings(defaultSettings: _*)
    .dependsOn(HttpBuild.http % MyConfig)
}
Now if I type myconfig:compile I want my root project to be compiled together with the subproject, but that doesn't seem to happen.
If I declare the dependency as dependsOn(HttpBuild.http), it compiles, but it does so every time, no matter which configuration I use.
Have you looked at this example? I'm not an expert here, but comparing it with your code above, the differences seem to be:
- a CustomCompile configuration is defined and used as the classpath configuration: classpathConfiguration in Common := CustomCompile
- the dependency is indirect: http % "compile->myconfig"
Perhaps try to get closer to that example.
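For illustration only, here is a rough, untested sketch of those two points transplanted into your build. The configuration names follow the linked example, and the inConfig/classpathConfiguration wiring is an assumption on my part:

object RootBuild extends Build {
  import HttpBuild._

  // Custom configuration that backs MyConfig's classpath, as in the example.
  lazy val CustomCompile = config("custom-compile") extend Compile
  lazy val MyConfig = config("myconfig") extend Compile

  lazy val api = Project("root", file("."))
    .configs(MyConfig, CustomCompile)
    .settings(Defaults.coreDefaultSettings: _*)
    // Give MyConfig the standard compile/run tasks.
    .settings(inConfig(MyConfig)(Defaults.configSettings): _*)
    // Point 1: use the custom configuration for MyConfig's classpath.
    .settings(classpathConfiguration in MyConfig := CustomCompile)
    // Point 2: make the dependency mapping indirect.
    .dependsOn(http % "compile->myconfig")
}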
I'm attempting to run sbt test:doc and I'm seeing a number of warnings similar to below:
[warn] /Users/tleese/code/my/stuff/src/test/scala/com/my/stuff/common/tests/util/NumberExtractorsSpecs.scala:9: Could not find any member to link for "com.my.stuff.common.util.IntExtractor".
The problem appears to be that Scaladoc references from test sources to main sources are not able to link correctly. Any idea what I might be doing wrong or need to configure?
Below are the relevant sections of my Build.scala:
val docScalacOptions = Seq("-groups", "-implicits", "-external-urls:[urls]")
scalacOptions in (Compile, doc) ++= docScalacOptions
scalacOptions in (Test, doc) ++= docScalacOptions
autoAPIMappings := true
Not sure if this is a satisfactory solution, but...
Scaladoc currently expects pairs of jar and URL to get the external linking to work. You can force sbt to link internal dependencies via JARs by setting exportJars. Compare the value of
$ show test:fullClasspath
before and after setting exportJars. Next, grab the name of the JAR that's being used and link it to the URL you'll be uploading it to.
scalaVersion := "2.11.0"
autoAPIMappings := true
exportJars := true
scalacOptions in (Test, doc) ++= Opts.doc.externalAPI(
  (file(s"${(packageBin in Compile).value}") -> url("http://example.com/")) :: Nil)
Now I see that test:doc generates Scaladoc with links to http://example.com/index.html#foo.IntExtractor from my foo.IntExtractor.
Using ideas from Eugene's answer, I made the following snippet.
It uses the apiMappings sbt setting, as advised in the sbt manual.
Unfortunately the manual doesn't tell you how to deal with managed dependencies, even though that is what the subsection title promises.
// External documentation
/* You can print the computed classpath with `show compile:fullClasspath`.
 * From that list you can check the jar name (which is not so obvious with
 * Play dependencies etc.).
 */
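/* Note: `currentScalaVersion`, used below, is not defined in this snippet;
 * it is assumed to be a val defined elsewhere in the build, e.g. derived
 * from scalaBinaryVersion.value.
 */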
val documentationSettings = Seq(
  autoAPIMappings := true,
  apiMappings ++= {
    // Look up the path to a jar (it's probably somewhere under ~/.ivy2/cache)
    // on the computed classpath
    val classpath = (fullClasspath in Compile).value
    def findJar(name: String): File = {
      val regex = ("/" + name + "[^/]*.jar$").r
      classpath.find { jar => regex.findFirstIn(jar.data.toString).nonEmpty }.get.data // fail hard if not found
    }
    // Define external documentation paths
    Map(
      findJar("scala-library") -> url("http://scala-lang.org/api/" + currentScalaVersion + "/"),
      findJar("play-json") -> url("https://playframework.com/documentation/2.3.x/api/scala/index.html")
    )
  }
)
This is a modification of the answer by @phadej. Unfortunately, that answer only works on Unix/Linux, because it assumes that the path separator is a /. On Windows, the path separator is \.
The following works on all platforms, and is slightly more idiomatic IMHO:
/* You can print the classpath with `show compile:fullClasspath` in the sbt shell.
 * From that list you can find the name of the jar for the managed dependency.
 */
lazy val documentationSettings = Seq(
  autoAPIMappings := true,
  apiMappings ++= {
    // Look up the path to a jar on the classpath
    val classpath = (fullClasspath in Compile).value
    def findJar(nameBeginsWith: String): File = {
      classpath.find { attributed: Attributed[java.io.File] =>
        (attributed.data ** s"$nameBeginsWith*.jar").get.nonEmpty
      }.get.data // fail hard if not found
    }
    // Define external documentation paths
    Map(
      findJar("scala-library") -> url("http://scala-lang.org/api/" + currentScalaVersion + "/"),
      findJar("play-json") -> url("https://playframework.com/documentation/2.3.x/api/scala/index.html")
    )
  }
)
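Either way, you can sanity-check the resulting jar-to-URL map from the sbt shell before publishing the docs (sbt 0.13 syntax, matching the snippets above; the exact scoping may vary):

> show compile:doc::apiMappings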
I have a build.scala file that has a section that looks something like the clip below.
I use sbt-assembly to build a jar file of all dependent libs for deployment.
This builds fine. My problem is that when I run 'assembly' it builds the ~16MB core-deps.jar, and when I then run 'package' to build core.jar, it builds core.jar but then overwrites my core-deps.jar with an empty file (because core-deps has no code of its own).
How can I build both core.jar and core-deps.jar and not have 'package' blow away core-deps.jar?
lazy val deps = Project("core-deps", file("."),
  settings = basicSettings ++
    sbtassembly.Plugin.assemblySettings ++
    Seq(assemblyOption in assembly ~= { _.copy(includeScala = false) }) ++
    addArtifact(Artifact("core-deps", "core-deps"), sbtassembly.Plugin.AssemblyKeys.assembly) ++
    Seq(
      libraryDependencies ++=
        // Master list of all used libraries so it gets added to the deps.jar file when you run assembly
        compile(commons_exec, commons_codec, commons_lang, casbah, googleCLHM, joda_time, scalajack, spray_routing, spray_can, spray_client, spray_caching, akka_actor, akka_cluster, akka_slf4j, prettytime, mongo_java, casbah_gridfs, typesafe_config, logback),
      jarName in assembly <<= (scalaVersion, version) map { (scalaVersion, version) => "core-deps_" + scalaVersion.dropRight(2) + "-" + version + ".jar" }
    )) aggregate(core)
lazy val core = project
  .settings(basicSettings: _*)
  .settings(buildSettings: _*)
  .settings(libraryDependencies ++=
    compile(commons_exec, prettytime, commons_codec, casbah, googleCLHM, scalajack, casbah_gridfs, typesafe_config, spray_routing, spray_client, spray_can, spray_caching, akka_actor, akka_slf4j, akka_cluster, logback) ++
    test(scalatest, parboiled, spray_client)
  )
Why not use the assemblyPackageDependency task that comes with sbt-assembly? See Excluding Scala library, your project, or deps JARs.
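With a recent sbt-assembly this is a separate task: running

> assemblyPackageDependency

produces a jar like target/scala-2.x/core-deps-assembly-x.y.z-deps.jar containing only the dependency classes, so the regular package output is never touched. Here's a sketch of tuning it separately from assembly (the key scoping is assumed from the plugin docs; adjust to your plugin version):

// Exclude the Scala library from the dependency-only jar as well; scoping
// the option to assemblyPackageDependency leaves `assembly` and `package`
// unaffected.
assemblyOption in assemblyPackageDependency :=
  (assemblyOption in assemblyPackageDependency).value.copy(includeScala = false)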
If for some reason you really want to disable the package task in the core-deps project, you could try rewiring packageBin:
packageBin := (outputPath in assembly).value
That will do nothing but return the file name.
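In context, that rewiring would sit in the core-deps project's settings from the question, for example (a sketch; scoping packageBin to Compile is an assumption here):

lazy val deps = Project("core-deps", file("."),
  settings = basicSettings ++
    sbtassembly.Plugin.assemblySettings ++
    Seq(
      // `package` now just returns the assembly output path instead of
      // overwriting core-deps.jar with an empty jar.
      packageBin in Compile := (outputPath in assembly).value
      // ... remaining settings from the question ...
    )) aggregate(core)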
In SBT, if I have a task that is supposed to generate a zip/jar/war containing a bunch of files, I'd use the Defaults.packageTaskSettings method to set up that task. It'd look as follows:
object BuildDef extends Build {
  val makeThings = TaskKey[File]("make-things")

  val defaultMakeSettings = (baseDirectory) map { base => Seq(
    (base / "thingA") -> "thingy",
    (base / "thingB") -> "thingz"
  )}

  val project = Project("stuff", file("."))
    .settings(Defaults.packageTaskSettings(makeThings, defaultMakeSettings): _*)
    .settings(
      artifact in makeThings <<= moduleName { Artifact(_, "zip", "zip") }
    )
}
That works just fine, and generates stuff_2.9.2-0.1-SNAPSHOT.zip in the target folder.
Now I want to make an alternate version of the make-things task that runs in a different scope, e.g. one that runs proguard and then packages things slightly differently. I've added the following settings to the BuildDef object:
val Scope = config("scope")

val project = ...
  .settings(...)
  .settings(
    Defaults.packageTaskSettings(makeThings in Scope, defaultMakeSettings): _*
  )
  .settings(
    artifact in (Scope, makeThings) <<=
      moduleName { n => Artifact(n + ".scoped", "zip", "zip") }
  )
When I run scope:make-things it seems to ignore that setting and use the old one:
> show scope:make-things
[info] ...\target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT.zip
Why is it ignoring my settings? I had hoped it would generate stuff.scoped_2.9.2-0.1-SNAPSHOT.zip instead.
For more info...
> show scope:make-things::artifact
[info] Artifact(stuff.scoped,zip,zip,None,List(),None,Map())
> show scope:make-things::artifact-path
[info] ...\target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT.zip
I realize that I could probably directly change artifactPath, but I am going off of what the xsbt-web-plugin does for its package-war task, and it doesn't touch the artifactPath. I'd like to do this the "right" way.
I ended up figuring this out almost as soon as I posted the question. The key was using the inConfig method to wrap the package settings, like this:
.settings(
  artifact in (Scope, makeThings) <<= moduleName { Artifact(_, "zip", "zip") }
)
.settings(
  inConfig(Scope) {
    Defaults.packageTaskSettings(makeThings, defaultMakeSettings)
  }: _*
)
I also discovered that packageTaskSettings modifies my artifact by appending the name of the config, as long as I specify my artifact setting before the packageTaskSettings. Now I get an artifact path of
...target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT-scope.zip