SBT artifact in custom task and scope ignored?

In SBT, if I have a task that is supposed to generate a zip/jar/war containing a bunch of files, I'd use the Defaults.packageTaskSettings method to set up that task. It'd look as follows:
object BuildDef extends Build {
  val makeThings = TaskKey[File]("make-things")

  val defaultMakeSettings = (baseDirectory) map { base => Seq(
    (base / "thingA") -> "thingy",
    (base / "thingB") -> "thingz"
  )}

  val project = Project("stuff", file("."))
    .settings(Defaults.packageTaskSettings(makeThings, defaultMakeSettings): _*)
    .settings(
      artifact in makeThings <<= moduleName { Artifact(_, "zip", "zip") }
    )
}
That works just fine and generates stuff_2.9.2-0.1-SNAPSHOT.zip in the target folder.
Now I want to make an alternate version of the make-things task that runs in a different scope, e.g. runs proguard first and then packages things slightly differently. I've added the following settings to the BuildDef object:
val Scope = config("scope")

val project = ...
  .settings(...)
  .settings(
    Defaults.packageTaskSettings(makeThings in Scope, defaultMakeSettings): _*
  )
  .settings(
    artifact in (Scope, makeThings) <<=
      moduleName { n => Artifact(n + ".scoped", "zip", "zip") }
  )
When I run scope:make-things it seems to ignore that setting and use the old one:
> show scope:make-things
[info] ...\target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT.zip
Why is it ignoring my settings? I had hoped it would generate stuff.scoped_2.9.2-0.1-SNAPSHOT.zip instead.
For more info...
> show scope:make-things::artifact
[info] Artifact(stuff.scoped,zip,zip,None,List(),None,Map())
> show scope:make-things::artifact-path
[info] ...\target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT.zip
I realize that I could probably directly change artifactPath, but I am going off of what the xsbt-web-plugin does for its package-war task, and it doesn't touch the artifactPath. I'd like to do this the "right" way.
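For reference, the alternative mentioned above would look roughly like this; just a sketch (untested, and the file name is purely illustrative), overriding artifactPath for the scoped task directly:
artifactPath in (Scope, makeThings) <<=
  (crossTarget, moduleName, version) { (dir, n, v) =>
    dir / (n + ".scoped_" + v + ".zip")
  }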

I ended up figuring this out almost as soon as I posted the question. The key was using the inConfig method to wrap the package settings, like this:
.settings(
  artifact in (Scope, makeThings) <<= moduleName { Artifact(_, "zip", "zip") }
)
.settings(
  inConfig(Scope) {
    Defaults.packageTaskSettings(makeThings, defaultMakeSettings)
  }: _*
)
I also discovered that the packageTaskSettings will modify my artifact by appending the name of the config, as long as I specify my artifact setting before the packageTaskSettings. Now I get an artifact path of
...target\scala-2.9.2\stuff_2.9.2-0.1-SNAPSHOT-scope.zip
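For reference, here is a sketch of how those pieces fit together in the final project definition (just the snippets above combined; nothing new beyond them):
val Scope = config("scope")

val project = Project("stuff", file("."))
  .settings(Defaults.packageTaskSettings(makeThings, defaultMakeSettings): _*)
  .settings(
    artifact in makeThings <<= moduleName { Artifact(_, "zip", "zip") }
  )
  .settings(
    artifact in (Scope, makeThings) <<= moduleName { Artifact(_, "zip", "zip") }
  )
  .settings(
    inConfig(Scope) {
      Defaults.packageTaskSettings(makeThings, defaultMakeSettings)
    }: _*
  )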

Different project on the same base-directory (or share a single file between projects)

My initial setup had two separate projects (sbt 1.2.6):
a web app (huge codebase, lots of dependencies, slow compile)
a command-line app (basically one file with 3-4 separate dependencies)
Then a feature request came in: we should show the "valid" values in the command-line app. The valid values live in an enum in the web app. So I fired up the sbt documentation and came up with an idea that looked like this:
//main webapp
lazy val core = project
  .in(file("."))
  .withId("core") //I tried this just in case, it didn't help...
  //... here come all the plugins and deps

//my hack to get a single-file compile
lazy val `feature-signer-helper` = project
  .in(file("."))
  .withId("feature-signer-helper")
  .settings(
    target := { baseDirectory.value / "target" / "features" },
    sources in Compile := {
      ((scalaSource in Compile).value ** "Features.scala").get
    }
  )

//the command line app
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .dependsOn(`feature-signer-helper`)
  .settings(
    libraryDependencies ++= signerDeps
  )
The problem is that whichever lazy val xxx = project.in(file(y)) is defined last becomes the only project bound to the y directory.
Also, I don't want to move that one file into a separate directory structure... And logically the command-line app and the web app don't depend on each other; they have different dependencies (and really different build times).
My questions are:
Is there any quick win in this situation? (Worst case, I'll just copy the file...)
Why do we have such rich project and source settings if I can't bind two projects to the same dir?
EDIT:
The code below copies the needed file (if you have the same dir structure). I'm not super happy with it, but it works. I'm still interested in other approaches.
import sbt._
import Keys._

object FeaturesCopyTask {
  val featuresCopyTask = {
    sourceGenerators in Compile += Def.task {
      val outFile = (sourceManaged in Compile).value / "Features.scala"
      val rootDirSrc = (Compile / baseDirectory).value / ".." / "src"
      val inFile: File = (rootDirSrc ** "Features.scala").get().head
      IO.copyFile(inFile, outFile, preserveLastModified = true)
      Seq(outFile)
    }.taskValue
  }
}

lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .settings(
    libraryDependencies ++= signerDeps,
    FeaturesCopyTask.featuresCopyTask
  )
I would have the tree be something more like
+- core/
+- webapp/
+- cli/
core is the small amount of code (mostly model-type things) that webapp and cli have in common
webapp depends on core (among many other things)
cli depends on core (and not much else)
So the build.sbt would be something like
lazy val core = (project in file("core"))
  // yadda yadda yadda

lazy val webapp = (project in file("webapp"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val cli = (project in file("cli"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val root = (project in file("."))
  .aggregate(
    core,
    webapp,
    cli
  )
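With that layout, the enum lives once in core and both apps just reference it. A minimal sketch (the package, object, and values below are made up for illustration, not taken from the original code):
// core/src/main/scala/myorg/Features.scala
package myorg

object Features {
  val validValues: Seq[String] = Seq("feature-a", "feature-b")
}

// cli/src/main/scala/myorg/cli/Main.scala
package myorg.cli

object Main extends App {
  // cli sees Features because cli.dependsOn(core)
  println(s"Valid values: ${myorg.Features.validValues.mkString(", ")}")
}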

sbt: generating shared sources in cross-platform project

Building my Scala project with sbt, I want to have a task that will run prior to the actual Scala compilation and generate a Version.scala file with project version information. Here's the task I've come up with:
lazy val generateVersionTask = Def.task {
  // Generate contents of Version.scala
  val contents = s"""package io.kaitai.struct
    |
    |object Version {
    |  val name = "${name.value}"
    |  val version = "${version.value}"
    |}
    |""".stripMargin

  // Update Version.scala file, if needed
  val file = (sourceManaged in Compile).value / "version" / "Version.scala"
  println(s"Version file generated: $file")
  IO.write(file, contents)
  Seq(file)
}
This task seems to work, but the problem is how to plug it in, given that it's a cross project, targeting Scala/JVM, Scala/JS, etc.
This is how build.sbt looked before I started touching it:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
and, on the filesystem, I have:
shared/ — cross-platform code shared between JS/JVM builds
jvm/ — JVM-specific code
js/ — JS-specific code
The best I've come up with so far is adding this task to the foo crossProject:
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    sourceGenerators in Compile += generateVersionTask.taskValue, // <== !
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)
This works, but in a very awkward way that isn't really compatible with the shared codebase. It generates two distinct Version.scala files, one for JS and one for JVM:
sbt:root> compile
Version file generated: /foo/js/target/scala-2.12/src_managed/main/version/Version.scala
Version file generated: /foo/jvm/target/scala-2.12/src_managed/main/version/Version.scala
Naturally, it's impossible to access the contents of these files from shared, which is exactly where I want to access them.
So far, I've come up with a very sloppy workaround:
there is a var declared in a singleton object in shared
in both the JVM and JS main entry points, the very first thing I do is assign that var from the constants defined in Version.scala
I've also tried the same trick with the sbt-buildinfo plugin; the result is exactly the same: it generates a per-platform BuildInfo.scala, which I can't use directly from the shared sources.
Are there any better solutions available?
Consider pointing sourceManaged to the shared/src/main/scala/src_managed directory and scoping generateVersionTask to the root project, like so:
val sharedSourceManaged = Def.setting(
  baseDirectory.value / "shared" / "src" / "main" / "scala" / "src_managed"
)

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {},
    sourceManaged := sharedSourceManaged.value,
    sourceGenerators in Compile += generateVersionTask.taskValue,
    cleanFiles += sharedSourceManaged.value
  )
Now sbt compile should output something like
Version file generated: /Users/mario/IdeaProjects/scalajs-cross-compile-example/shared/src/main/scala/src_managed/version/Version.scala
...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/js/target/scala-2.12/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/scalajs-cross-compile-example/target/scala-2.12/classes ...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/jvm/target/scala-2.12/classes ...
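With the file generated under shared/src/main/scala/src_managed, ordinary shared code can refer to it directly on both platforms. A tiny sketch (the Banner object is hypothetical; only the generated Version object comes from the task above):
package io.kaitai.struct

object Banner {
  // Works from shared, JVM and JS sources alike.
  def line: String = s"${Version.name} ${Version.version}"
}
Since the generated file now lives inside the regular shared source tree, it is probably also worth excluding shared/src/main/scala/src_managed from version control.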

get SBT settings from ModuleID

How can I use a moduleID: ModuleID for a "sibling" project to access settings keys?
I'm writing an SBT plugin for multi-module builds.
I have project A (which dependsOn B) and project B.
Both projects have my own generate and mybuild tasks defined as keys.
The mybuild task consumes the value from generate - this works fine.
B doesn't depend upon anything, so B's mybuild only needs the key for B:generate and all is well.
I want A's mybuild to consume both A:generate and B:generate based on the fact that A dependsOn B in the build.sbt file.
The only promising keys I've found return the projects as ModuleID instances, so is there some way to get a list of setting keys from a ModuleID?
... or should I be doing this another way?
Solution (Kind of)
With @himos's help, this ...
(myTaskKey in myConfig) := {
  loadedBuild.value.allProjectRefs.find(_._1 == thisProjectRef.value).map(_._2) match {
    case Some(myCurrentProject) =>
      if (myCurrentProject.dependencies.nonEmpty)
        sys.error {
          myCurrentProject.dependencies
            .map { myDependsOnProject: ClasspathDep[ProjectRef] =>
              (myDependsOnProject.project / myConfig / myTaskKey).value
              // https://www.scala-sbt.org/0.13/docs/Tasks.html#Dynamic+Computations+with
            }
            .foldLeft("mine.dependencies:")(_ + "\n\t" + _)
        }
  }
}
... sort of works.
It causes an error that implies I've accessed the correct object, even if the SBT macros don't like it.
I think the ModuleID you mention relates to dependency management, not subprojects.
To access a subproject's setting/task keys, project scope can be used:
(generate in A).value
(generate in B).value
More comprehensive example:
name := "A"
version := "1.0"
scalaVersion := "2.12.5"
val generate = TaskKey[String]("generate")
val myBuild = TaskKey[String]("myBuild")
val a = (project in file(".")).settings(Seq(
generate := "A_generate"
))
val b = (project in file("proj_b")).settings(Seq(
generate := "B_generate",
myBuild := (generate in a).value + "_" + generate.value
)).dependsOn(a)
sbt shell output:
sbt:A> show b/myBuild
[info] A_generate_B_generate
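If the set of projects to aggregate should really be derived from dependsOn rather than written out by hand, a ScopeFilter can do the lookup. A hedged, untested sketch using the same keys as above (it would replace the myBuild definition in b); inDependencies selects the current project plus everything it dependsOn:
myBuild := {
  val depFilter = ScopeFilter(inDependencies(ThisProject))
  // Evaluates generate in this project and in all of its classpath dependencies.
  generate.all(depFilter).value.mkString("_")
}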

SBT - Multi project merge strategy and build sbt structure when using assembly

I have a project that consists of multiple smaller projects, some with dependencies upon each other; for example, there is a utilities project that depends upon a commons project.
Other projects may depend upon utilities, or commons, or neither of them.
In build.sbt I have the assembly merge strategy at the end of the file, along with test in assembly := {}.
My question is: is this correct? Should each project have its own merge strategy, and if so, will the projects that depend on it inherit that strategy? Having the merge strategy repeated in every project definition seems clunky and would mean a lot of duplicated code.
The same question applies to the tests: should each project have the line controlling whether tests are run during assembly, or will that also be inherited?
Thanks in advance. If anyone knows of a link to a sensible (relatively complex) example that'd also be great.
In my day job I currently work on a large multi-project build. Unfortunately it's closed source, so I can't share specifics, but I can share some guidance.
Create a rootSettings used only by the root/container project, since it usually isn't part of an assembly or publish step. It would contain something like:
lazy val rootSettings = Seq(
  publishArtifact := false,
  publishArtifact in Test := false
)
Create a commonSettings shared by all the subprojects. Place the base/shared assembly settings here:
lazy val commonSettings = Seq(
  // We use a common directory for all of the artifacts
  assemblyOutputPath in assembly := baseDirectory.value /
    "assembly" / (name.value + "-" + version.value + ".jar"),

  // This is really a one-time, global setting if all projects
  // use the same folder, but should be here if modified for
  // per-project paths.
  cleanFiles <+= baseDirectory { base => base / "assembly" },

  test in assembly := {},

  assemblyMergeStrategy in assembly := {
    case "BUILD" => MergeStrategy.discard
    case "logback.xml" => MergeStrategy.first
    case other: Any => MergeStrategy.defaultMergeStrategy(other)
  },

  assemblyExcludedJars in assembly := {
    val cp = (fullClasspath in assembly).value
    cp filter { _.data.getName.matches(".*finatra-scalap-compiler-deps.*") }
  }
)
Each subproject uses commonSettings, and applies project-specific overrides:
lazy val fubar = project.in(file("fubar-folder-name"))
  .settings(commonSettings: _*)
  .settings(
    // Project-specific settings here.
    assemblyMergeStrategy in assembly := {
      // The fubar-specific strategy
      case "fubar.txt" => MergeStrategy.discard
      case other: Any =>
        // Apply inherited "common" strategy
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(other)
    }
  )
  .dependsOn(
    yourCoreProject,
    // ...
  )
And by the way, if you're using IntelliJ, don't name your root project variable root, as this is what appears as the project name in the recent projects menu.
lazy val myProjectRoot = project.in(file("."))
  .settings(rootSettings: _*)
  .settings(
    // ...
  )
  .dependsOn(
    // ...
  )
  .aggregate(
    fubar,
    // ...
  )
You may also need to add a custom strategy for combining reference.conf files (for the Typesafe Config library):
val reverseConcat: sbtassembly.MergeStrategy = new sbtassembly.MergeStrategy {
  val name = "reverseConcat"
  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    MergeStrategy.concat(tempDir, path, files.reverse)
}

assemblyMergeStrategy in assembly := {
  case "reference.conf" => reverseConcat
  case other => MergeStrategy.defaultMergeStrategy(other)
}
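If every subproject needs that reference.conf behaviour, the extra case can simply be folded into the commonSettings merge strategy shown earlier instead of being repeated per project; a sketch combining the two:
assemblyMergeStrategy in assembly := {
  case "reference.conf" => reverseConcat
  case "BUILD" => MergeStrategy.discard
  case "logback.xml" => MergeStrategy.first
  case other: Any => MergeStrategy.defaultMergeStrategy(other)
}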

sbt: publish generated sources

I have a project where part of the sources are generated (sourceGenerators in Compile). I noticed that these sources are not published with publishLocal or publishSigned, which is reasonable in most scenarios. In this case it's unfortunate, because when you use this project/library as a dependency, you cannot look up the generated sources, for example in IntelliJ, even if the other sources of the project have been downloaded.
Can I configure sbt's publishing settings to include the generated sources in the Maven -sources.jar?
So, just to be complete, this was my solution based on @pfn's answer:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
}
mappings in (Compile,packageSrc) := (managedSources in Compile).value map (s => (s,s.getName)),
Just like @0__'s answer, but ported to the 'new' sbt syntax, i.e. without deprecation warnings.
Compile/packageSrc/mappings ++= {
  val base = (Compile/sourceManaged).value
  val files = (Compile/managedSources).value
  files.map(f => (f, f.relativeTo(base).get.getPath))
}
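To double-check the result (a suggestion, not part of the answers above), the mappings can be inspected from the sbt shell before publishing:
> show Compile/packageSrc/mappings
The generated sources should be listed there with their package-relative paths, and after publishLocal they end up inside the -sources.jar.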