SBT: Exclude module from aggregates or compilation based on Scala version

I'm having a bit of trouble with the following scenario: one of the sub-modules in the aggregate list is an SBT plugin, but the aggregating project is cross-built for 2.10.x and 2.11.x.
The exclusion logic works well in general, except for any command that begins with a +. This looks like an SBT bug: when the following block is triggered with, say, +publish, compilation and doc generation in the SBT plugin fail for Scala 2.11, even though the plugin shouldn't be included in the first place.
Somehow SBT ignores the exclusion, and I've failed to find a different solution to the problem.
How can one define either an exclusion filter or a conditional module exclusion based on the Scala version being 2.10.x, in such a way that it doesn't cause problems with + commands?
lazy val projectxSbtPlugin = (project in file("projectx-sbt")) // named so the val does not shadow the sbtPlugin key
  .settings(sharedSettings: _*)
  .settings(
    scalaVersion := "2.10.6",
    publish := {
      CrossVersion.partialVersion(scalaVersion.value).map {
        case (2, scalaMajor) if scalaMajor >= 11 => false
        case _ => true
      }
    },
    publishMavenStyle := false,
    excludeFilter := {
      CrossVersion.partialVersion(scalaVersion.value) match {
        case Some((2, scalaMajor)) if scalaMajor >= 11 => NothingFilter
        case _ => AllPassFilter
      }
    },
    sbtPlugin := true,
    ..
  )
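For reference (this is not from the original thread): one way around the + problem is to move the cross-building decision into each subproject's own crossScalaVersions. Under sbt 1.x, + iterates over each project's own crossScalaVersions instead of forcing the root's list onto every aggregated project, so a plugin module that only lists 2.10.6 is simply skipped when switching to 2.11. A minimal sketch, with project and directory names assumed from the question:

lazy val root = (project in file("."))
  .aggregate(core, projectxSbtPlugin)
  .settings(crossScalaVersions := Nil) // the root itself builds nothing

lazy val core = (project in file("core"))
  .settings(crossScalaVersions := Seq("2.10.6", "2.11.8"))

lazy val projectxSbtPlugin = (project in file("projectx-sbt"))
  .settings(
    sbtPlugin := true,
    scalaVersion := "2.10.6",
    crossScalaVersions := Seq("2.10.6") // never cross-built for 2.11
  )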


Get SBT settings from ModuleID

How can I use a moduleID: ModuleID for a "sibling" project to access settings keys?
I'm writing an SBT plugin for multi-module builds.
I have project A (which dependsOn B) and project B.
Both projects have my own generate and mybuild tasks as settings keys.
The mybuild task consumes the value from generate - this works fine.
B doesn't depend upon anything, so B's mybuild only needs the key for B:generate and all is well.
I want A's mybuild to consume both A:generate and B:generate based on the fact that A dependsOn B in the build.sbt file.
The only promising keys I've found return the projects as ModuleID instances, so is there some way to get a list of settings keys from a ModuleID?
... or should I be doing this another way?
Solution (Kind of)
With @himos' help, this ...
(myTaskKey in myConfig) := {
  loadedBuild.value.allProjectRefs.find(_._1 == thisProjectRef.value).map(_._2) match {
    case Some(myCurrentProject) =>
      if (myCurrentProject.dependencies.nonEmpty)
        sys.error {
          myCurrentProject.dependencies
            .map { myDependsOnProject: ClasspathDep[ProjectRef] =>
              (myDependsOnProject.project / myConfig / myTaskKey).value
              // https://www.scala-sbt.org/0.13/docs/Tasks.html#Dynamic+Computations+with
            }
            .foldLeft("mine.dependencies:")(_ + "\n\t" + _)
        }
  }
}
... sort of works.
It causes an error that implies I've accessed the correct object, even if the SBT macros don't like it.
I think the ModuleID you mention relates to dependency management, not subprojects.
To access another subproject's setting/task keys, you can scope the key to that project:
(generate in A).value
(generate in B).value
A more comprehensive example:
name := "A"
version := "1.0"
scalaVersion := "2.12.5"
val generate = TaskKey[String]("generate")
val myBuild = TaskKey[String]("myBuild")
val a = (project in file(".")).settings(Seq(
generate := "A_generate"
))
val b = (project in file("proj_b")).settings(Seq(
generate := "B_generate",
myBuild := (generate in a).value + "_" + generate.value
)).dependsOn(a)
sbt console output:
sbt:A> show b/myBuild
[info] A_generate_B_generate
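For completeness (not part of the original answer): the example above hard-codes the sibling project in (generate in a).value. If you want the set of upstream projects derived automatically from dependsOn, as the question asked, sbt's ScopeFilter can fan a task out over the dependency graph. A sketch reusing the generate/myBuild keys from above:

myBuild := {
  // Evaluate `generate` in this project and in everything it dependsOn
  // (transitively); `.all` collects the results into a Seq.
  val collected: Seq[String] =
    generate.all(ScopeFilter(inDependencies(ThisProject))).value
  collected.mkString("_")
}

Note that the ordering of the collected values is not guaranteed, so treat this as a starting point rather than a drop-in replacement.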

Using .value in shared setting definition in SBT / Sane way to organize settings based on Scala version

I want to provide multiple settings based on the Scala binary version.
Those settings would be shared between several projects.
Something like:
lazy val akaneSettings = Def.settings(
  organization := "ws.kotonoha",
  moduleName := "akane",
  crossScalaVersions := Seq("2.11.12", "2.12.4"),
  scalaVersion := "2.11.12",
  version := "0.2-SNAPSHOT",
  javacOptions ++= Seq("-encoding", "utf8"),
  scalacOptions ++= Seq(
    "-feature",
    "-deprecation"
  ),
  scalaBinaryVersion.value match {
    case "2.11" =>
      Def.settings(
        scalacOptions ++= Seq(
          "-Ybackend:GenBCode",
          "-Yopt:l:classpath",
          "-Yopt-warnings",
          "-target:jvm-1.8"
        ),
        libraryDependencies ++= Seq("org.scala-lang.modules" % "scala-java8-compat_2.11" % "0.8.0")
      )
    case "2.12" =>
      Def.settings(
        scalacOptions ++= Seq(
          "-opt:l:classpath"
        )
      )
    case _ => throw new Exception("Not supported yet")
  }
)
Unfortunately, the pattern match on .value does not work: .value may only be called inside a setting or task macro.
Of course I can do the branching logic for each individual setting and use := / ++=, but that will leave a mess.
Is there a way to sanely organize groups of settings based on Scala version?
You need to move your conditional checks to inside the settings definitions, and not generate multiple settings from outside.
SBT's syntax might give the impression that you're updating values in a mutable fashion, such as by using the := operator, but you're not. Every single setting transformation is stored to be composed and applied later. At the point where akaneSettings is defined the value of scalaBinaryVersion is not known (and may actually be different depending on the context being evaluated).
Your example should look something like:
// No need to use Def.settings in recent SBT versions
lazy val akaneSettings = Seq(
  organization := "ws.kotonoha",
  // ...,
  scalacOptions ++= {
    scalaBinaryVersion.value match {
      case "2.11" =>
        Seq("-some-2.11-setting")
      case "2.12" =>
        Seq("-some-2.12-setting")
      case _ =>
        sys.error("Only Scala 2.11 and 2.12 are supported")
    }
  },
  libraryDependencies ++= {
    scalaBinaryVersion.value match {
      case "2.11" =>
        Seq("org.scala-lang.modules" % "scala-java8-compat_2.11" % "0.8.0")
      case "2.12" =>
        Seq.empty
      case _ =>
        sys.error("Only Scala 2.11 and 2.12 are supported")
    }
  }
)
You can create functions to generate your settings. For example:
def akaneSettings(scalaBinaryVersion: String) = Def.settings(
  ...
  scalaBinaryVersion match {
    ...
  }
)
... and then use it as akaneSettings(scalaBinaryVersion.value).
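A filled-in sketch of that approach (the option lists are simply borrowed from the question; nothing here is prescribed by the original answer). One caveat: akaneSettings(scalaBinaryVersion.value) can only be written where .value is legal, i.e. inside another setting or task, so for cross-built projects the scalacOptions ++= { ... } form shown earlier is usually the safer pattern:

def akaneSettings(binaryVersion: String): Seq[Def.Setting[_]] = Def.settings(
  organization := "ws.kotonoha",
  scalacOptions ++= (binaryVersion match {
    case "2.11" => Seq("-Yopt-warnings", "-target:jvm-1.8")
    case "2.12" => Seq("-opt:l:classpath")
    case _      => sys.error("Only Scala 2.11 and 2.12 are supported")
  }),
  libraryDependencies ++= (binaryVersion match {
    case "2.11" => Seq("org.scala-lang.modules" % "scala-java8-compat_2.11" % "0.8.0")
    case _      => Seq.empty
  })
)

// Usable directly when a project pins a single Scala version:
// lazy val legacy = project.settings(akaneSettings("2.11"))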

SBT - Multi project merge strategy and build sbt structure when using assembly

I have a project that consists of multiple smaller projects, some with dependencies upon each other; for example, there is a utility project that depends upon a commons project.
Other projects may depend upon utilities, commons, or neither.
In the build.sbt I have the assembly merge strategy at the end of the file, along with test in assembly := {} to skip tests during assembly.
My question is: is this correct? Should each project have its own merge strategy, and if so, will the projects that depend on it inherit that strategy? Repeating the merge strategy in every project definition seems clunky and would mean a lot of duplicated code.
The same question applies to the tests: should each project set whether tests are run during assembly, or is that inherited as well?
Thanks in advance. If anyone knows of a link to a sensible (relatively complex) example, that would also be great.
In my day job I currently work on a large multi-project build. Unfortunately it's closed source, so I can't share specifics, but I can share some guidance.
Create a rootSettings used only by the root/container project, since it usually isn't part of an assembly or publish step. It would contain something like:
lazy val rootSettings = Seq(
  publishArtifact := false,
  publishArtifact in Test := false
)
Create a commonSettings shared by all the subprojects. Place the base/shared assembly settings here:
lazy val commonSettings = Seq(
  // We use a common directory for all of the artifacts
  assemblyOutputPath in assembly := baseDirectory.value /
    "assembly" / (name.value + "-" + version.value + ".jar"),
  // This is really a one-time, global setting if all projects
  // use the same folder, but should be here if modified for
  // per-project paths.
  cleanFiles += baseDirectory.value / "assembly",
  test in assembly := {},
  assemblyMergeStrategy in assembly := {
    case "BUILD" => MergeStrategy.discard
    case "logback.xml" => MergeStrategy.first
    case other: Any => MergeStrategy.defaultMergeStrategy(other)
  },
  assemblyExcludedJars in assembly := {
    val cp = (fullClasspath in assembly).value
    cp filter { _.data.getName.matches(".*finatra-scalap-compiler-deps.*") }
  }
)
Each subproject uses commonSettings, and applies project-specific overrides:
lazy val fubar = project.in(file("fubar-folder-name"))
  .settings(commonSettings: _*)
  .settings(
    // Project-specific settings here.
    assemblyMergeStrategy in assembly := {
      // The fubar-specific strategy
      case "fubar.txt" => MergeStrategy.discard
      case other: Any =>
        // Apply the inherited "common" strategy
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(other)
    }
  )
  .dependsOn(
    yourCoreProject,
    // ...
  )
And BTW, if using IntelliJ, don't name your root project variable root, as that is what appears as the project name in the recent projects menu.
lazy val myProjectRoot = project.in(file("."))
  .settings(rootSettings: _*)
  .settings(
    // ...
  )
  .dependsOn(
    // ...
  )
  .aggregate(
    fubar,
    // ...
  )
You may also need to add a custom strategy for combining reference.conf files (for the Typesafe Config library):
val reverseConcat: sbtassembly.MergeStrategy = new sbtassembly.MergeStrategy {
  val name = "reverseConcat"
  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    MergeStrategy.concat(tempDir, path, files.reverse)
}

assemblyMergeStrategy in assembly := {
  case "reference.conf" => reverseConcat
  case other => MergeStrategy.defaultMergeStrategy(other)
}

sbt: publish generated sources

I have a project where part of the sources are generated (sourceGenerators in Compile). I noticed that these sources are not published with publishLocal or publishSigned, which is reasonable in most scenarios. Here it is unfortunate, though: when you use this project/library as a dependency, you cannot navigate to the generated sources, for example in IntelliJ, even if the other sources of the project have been downloaded.
Can I configure sbt's publishing settings to include the generated sources in the Maven -sources.jar?
So, just to be complete, this was my solution based on @pfn's answer:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
}
mappings in (Compile, packageSrc) := (managedSources in Compile).value map (s => (s, s.getName))
Just like @0__'s answer, but ported to the 'new' sbt syntax, i.e. without deprecation warnings.
Compile / packageSrc / mappings ++= {
  val base = (Compile / sourceManaged).value
  val files = (Compile / managedSources).value
  files.map(f => (f, f.relativeTo(base).get.getPath))
}
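To verify the result (my own sanity check, not part of the answers; the project name is hypothetical), inspect the mappings and then the produced sources jar from the sbt shell:

sbt:myproject> show Compile/packageSrc/mappings
sbt:myproject> packageSrc

The generated files should then appear when listing the jar, e.g. with jar tf target/scala-2.12/myproject_2.12-0.1.0-sources.jar.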

sbt test:doc Could not find any member to link

I'm attempting to run sbt test:doc and I'm seeing a number of warnings similar to below:
[warn] /Users/tleese/code/my/stuff/src/test/scala/com/my/stuff/common/tests/util/NumberExtractorsSpecs.scala:9: Could not find any member to link for "com.my.stuff.common.util.IntExtractor".
The problem appears to be that Scaladoc references from test sources to main sources are not able to link correctly. Any idea what I might be doing wrong or need to configure?
Below are the relevant sections of my Build.scala:
val docScalacOptions = Seq("-groups", "-implicits", "-external-urls:[urls]")
scalacOptions in (Compile, doc) ++= docScalacOptions
scalacOptions in (Test, doc) ++= docScalacOptions
autoAPIMappings := true
Not sure if this is a satisfactory solution, but...
Scaladoc currently expects pairs of jar and URL to get external linking to work. You can force sbt to link internal dependencies via their JARs by setting exportJars. Compare the value of
$ show test:fullClasspath
before and after setting exportJars. Next, grab the name of the JAR that's being used and link it to the URL you'll be uploading it to.
scalaVersion := "2.11.0"
autoAPIMappings := true
exportJars := true
scalacOptions in (Test, doc) ++= Opts.doc.externalAPI((
file(s"${(packageBin in Compile).value}") -> url("http://example.com/")) :: Nil)
Now test:doc generates Scaladoc with links to http://example.com/index.html#foo.IntExtractor from my foo.IntExtractor.
Using ideas from Eugene's answer, I made the following snippet.
It uses the apiMappings sbt key, as advised in the sbt manual.
Unfortunately the manual doesn't say how to deal with managed dependencies, even though the subsection title suggests it does.
// External documentation
/* You can print the computed classpath with `show compile:fullClasspath`.
 * From that list you can check the jar name (which is not so obvious with
 * Play dependencies etc.).
 */
val documentationSettings = Seq(
  autoAPIMappings := true,
  apiMappings ++= {
    // Look up the path to a jar (probably somewhere under ~/.ivy2/cache)
    // on the computed classpath
    val classpath = (fullClasspath in Compile).value
    def findJar(name: String): File = {
      val regex = ("/" + name + "[^/]*.jar$").r
      classpath.find { jar => regex.findFirstIn(jar.data.toString).nonEmpty }.get.data // fail hard if not found
    }
    // Define external documentation paths
    // (currentScalaVersion is assumed to be defined elsewhere in the build)
    Map(
      findJar("scala-library") -> url("http://scala-lang.org/api/" + currentScalaVersion + "/"),
      findJar("play-json") -> url("https://playframework.com/documentation/2.3.x/api/scala/index.html")
    )
  }
)
This is a modification of the answer by @phadej. Unfortunately, that answer only works on Unix/Linux because it assumes that the path separator is a /. On Windows, the path separator is \.
The following works on all platforms, and is slightly more idiomatic IMHO:
/* You can print the classpath with `show compile:fullClasspath` in the SBT REPL.
 * From that list you can find the name of the jar for the managed dependency.
 */
lazy val documentationSettings = Seq(
  autoAPIMappings := true,
  apiMappings ++= {
    // Look up the path to a jar on the classpath
    val classpath = (fullClasspath in Compile).value
    def findJar(nameBeginsWith: String): File = {
      classpath.find { attributed: Attributed[java.io.File] =>
        (attributed.data ** s"$nameBeginsWith*.jar").get.nonEmpty
      }.get.data // fail hard if not found
    }
    // Define external documentation paths
    // (currentScalaVersion is assumed to be defined elsewhere in the build)
    Map(
      findJar("scala-library") -> url("http://scala-lang.org/api/" + currentScalaVersion + "/"),
      findJar("play-json") -> url("https://playframework.com/documentation/2.3.x/api/scala/index.html")
    )
  }
)
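Either variant can then be mixed into a project like any other settings sequence (the project name below is hypothetical):

lazy val myLib = (project in file("my-lib"))
  .settings(documentationSettings: _*)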