Multiple SBT Configurations should be exclusive, but they all activate at the same time - why? - scala

I have defined a minimal build.sbt with two custom profiles ‘dev’ and ‘staging’ (what SBT seems to call Configurations). However, when I run SBT with the Configuration that was defined first in the file (dev), both Configuration blocks are executed - and if both modify the same setting, the last one wins (staging).
This seems to break any notion of conditional activation, so what am I doing wrong with SBT?
For reference, I want to emulate the conditionally activated Profiles concept of Maven e.g. mvn test -P staging.
SBT version: 1.2.1
build.sbt:
name := "example-project"
scalaVersion := "2.12.6"
...
fork := true
// Environment-independent JVM property (always works)
javaOptions += "-Da=b"
// Environment-specific JVM property (doesn’t work)
lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test
val root = (project in file("."))
.configs(Dev, Staging)
.settings(inConfig(Dev)(Seq(javaOptions in Test += "-Dfoo=bar")))
.settings(inConfig(Staging)(Seq(javaOptions in Test += "-Dfoo=qux")))
Command:
# Bad
sbt test
=> foo=qux
a=b
# Bad
sbt clean dev:test
=> foo=qux
a=b
# Good
sbt clean staging:test
=> foo=qux
a=b

Notice that despite the inConfig usage you're still setting javaOptions in Test, i.e. in the Test config. If you remove in Test, it works as expected:
...
.settings(inConfig(Dev)(javaOptions += "-Dfoo=bar"))
.settings(inConfig(Staging)(javaOptions += "-Dfoo=qux"))
(also Seq(...) wrapping is unnecessary)
Now in sbt:
> show Test/javaOptions
[info] *
> show Dev/javaOptions
[info] * -Dfoo=bar
> show Staging/javaOptions
[info] * -Dfoo=qux
You can achieve the same result by scoping each setting explicitly (without inConfig wrapping):
.settings(
  Dev/javaOptions += "-Dfoo=bar",
  Staging/javaOptions += "-Dfoo=qux",
  ...
)
(here Conf/javaOptions is the same as javaOptions in Conf)
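One more caveat: for dev:test or staging:test to actually run the test task, the test tasks themselves must also exist in those configs. A minimal sketch, assuming sbt 1.x's Defaults.testSettings:
lazy val root = (project in file("."))
  .configs(Dev, Staging)
  // bring test, testOnly, etc. into each custom config
  .settings(inConfig(Dev)(Defaults.testSettings))
  .settings(inConfig(Staging)(Defaults.testSettings))
  .settings(inConfig(Dev)(javaOptions += "-Dfoo=bar"))
  .settings(inConfig(Staging)(javaOptions += "-Dfoo=qux"))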

Clarification on scope delegation of dockerExposedPorts with sbt-native-packager

I am using the sbt-native-packager plugin that comes with Scala Play:
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.8.8")
I would like to know why the dockerExposedPorts setting isn't set for the root project when using:
name := """scala-play-react-seed"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file("."))
  .enablePlugins(PlayScala)
  .settings(
    Docker / dockerExposedPorts := Seq(9000), // <-- 1. doesn't work
  )
Docker / dockerExposedPorts := Seq(9000), // <-- 2. also doesn't work
$ sbt show root/dockerExposedPorts
[info] *
However, it works if I remove the Docker config part:
.settings(
  dockerExposedPorts := Seq(9000), // <-- 3. works
)
$ sbt show root/dockerExposedPorts
[info] * 9000
As far as I understand sbt's scope delegation, case 1 is scoped as root / Docker / Zero / dockerExposedPorts, which should be more specific than case 3, root / Zero / Zero / dockerExposedPorts. What am I missing here?
Yes, Docker / dockerExposedPorts is more specific than Zero / dockerExposedPorts. But unlike what you seem to assume, it's the more specific scopes that delegate to the less specific ones, not the other way around. The manual says so:
This feature allows you to set a value once in a more general scope, allowing multiple more-specific scopes to inherit the value.
And in fact, this is the only way it could be, because you might define e.g. both Docker / dockerExposedPorts and Universal / dockerExposedPorts. Which one of these would Zero / dockerExposedPorts delegate to? There's no sensible answer to that, hence delegation goes in the other direction.
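To illustrate the direction with a minimal sketch (hypothetical values):
// set the value once in the general (Zero config) scope of the project...
dockerExposedPorts := Seq(9000)
// ...and the more specific Docker scope, having no value of its own,
// delegates upward to it:
//   > show Docker / dockerExposedPorts
//   [info] * 9000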

How to make SBT run test suites in parallel?

I have a bunch of integration tests run by sbt: N test suites with 1..M tests in each suite.
I have set fork in IntegrationTest := true, but the test suites are always executed sequentially. According to the docs, this should not be the case: test suites should be executed concurrently.
The test suites are classes like the following:
class MyTestSuite1 extends FlatSpec with Matchers {
  // ...
  it should "do A" in {}
  it should "do B" in {}
}
class MyTestSuite2 extends FlatSpec with Matchers {
  // ...
  it should "do C" in {}
  it should "do D" in {}
}
the problem
MyTestSuite1 and MyTestSuite2 are executed sequentially (in alphabetical order, to be exact)
expectation
MyTestSuite1 and MyTestSuite2 are executed concurrently
env
.sbtopts:
-J-Xms1G
-J-Xmx4G
-J-XX:MaxMetaspaceSize=512m
-J-Xss4M
note
I noticed that all tests are running on the same pool and thread, for example pool-1-thread-1 for all tests.
sbt version: 1.2.8
Scala: 2.12.8
os: MacOS 10.15, Ubuntu 19.04
Scalatest ver: 3.2.0-SNAP10
Tried sbt v. 1.3.2 - same result.
Adding
testOptions in IntegrationTest += Tests.Argument(TestFrameworks.ScalaTest, "-P4"),
does not help.
============
Update
fork in (IntegrationTest, test) := true works at the global level, but I have 2 projects and I want to make it work per project, so that each preserves its relative path.
e.g.
lazy val `p1` = Project(id = "p1", base = file("./p1"))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .settings(
    fork in (IntegrationTest, test) := true,
    ...)
lazy val `p2` = Project(id = "p2", base = file("./p2"))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .settings(
    fork in (IntegrationTest, test) := true,
    ...)
This does not run tests in parallel.
Instead, the following runs in parallel, but, obviously, the working directory is then "." rather than "./p1" or "./p2" respectively:
fork in (IntegrationTest, test) := true
lazy val `p1` = Project(id = "p1", base = file("./p1"))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
SOLUTION:
It appears there's a testForkedParallel in IntegrationTest := true option which does exactly what I needed: it spawns a new JVM per test suite.
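In context, a sketch of the per-project settings (based on the layout above):
lazy val `p1` = Project(id = "p1", base = file("./p1"))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .settings(
    fork in IntegrationTest := true,
    // fork a fresh JVM per suite, so suites run concurrently while
    // each project keeps its own working directory
    testForkedParallel in IntegrationTest := true)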
==============
Remark:
So, the only remaining problem is that it now spawns as many JVMs as there are available CPUs, and I can't throttle just the test concurrency:
OPTION 1 - throttles all sbt tasks to only 4 in parallel
concurrentRestrictions in Global := Seq(Tags.limitAll(4))
OPTION 2 - just does nothing (the tests are in a subproject)
concurrentRestrictions in Global += Tags.limit(Tags.Test, 4),
By default, tests executed in a forked JVM are executed sequentially. Refer to the following paragraph from the sbt testing docs:
The setting:
Test / fork := true
specifies that all tests will be executed in a single external JVM.
See Forking for configuring standard options for forking. By default, tests executed in a forked JVM are executed sequentially. More control over how tests are assigned to JVMs and what options to pass to those is available with the testGrouping key.
So, you have two options:
Do not fork the JVM, and your tests will run in parallel by default
If you want to fork and still run tests in parallel, go through these docs:
https://www.scala-sbt.org/1.x/docs/Testing.html#Forking+tests
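For the second option, a hedged sketch of what those docs describe: using testGrouping to fork one JVM per suite, and sbt's built-in Tags.ForkedTestGroup tag to cap only the forked test JVMs (assuming sbt 1.x and the IntegrationTest config from the question):
// one group (and thus one forked JVM) per discovered suite
testGrouping in IntegrationTest := (definedTests in IntegrationTest).value.map { suite =>
  Tests.Group(
    name = suite.name,
    tests = Seq(suite),
    runPolicy = Tests.SubProcess(ForkOptions()))
}
// cap concurrent forked test JVMs at 4 without throttling other sbt tasks
concurrentRestrictions in Global += Tags.limit(Tags.ForkedTestGroup, 4)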

Environment variables set up from build.sbt

I have been trying to set up environment variables directly from the build.sbt file, as I need to use an assembly jar name which is defined in that file. I've been trying to output the value of the defined variable using echo, and from the Scala application code via sys.props.get("SPARK_APP_JAR_PATH"), but the result is blank or None, respectively.
What can be wrong with the environment variable configuration?
This is how I defined the variable in build.sbt:
assemblyJarName in assembly := "spark_app_example_test.jar"
mainClass in assembly := Some("example.SparkAppExample")
test in assembly := {}
fork := true
val envVars = sys.props("SPARK_APP_JAR_PATH") = "/path/to/jar/file"
That's how it works for me:
lazy val dockerRepo: String = sys.props.getOrElse("DOCKER_REPO", s"bpf.docker.repo")
Or with your example:
lazy val envVars = sys.props.getOrElse("SPARK_APP_JAR_PATH", "/path/to/jar/file")
According to the documentation:
System properties can be provided either as JVM options, or as SBT arguments, in both cases as -Dprop=value. The following properties influence SBT execution.
e.g. sbt -DSPARK_APP_JAR_PATH=/path/to/jar/file
See https://www.scala-sbt.org/release/docs/Command-Line-Reference.html#Command+Line+Options
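Also note that with fork := true, a property set on the sbt JVM is not automatically visible inside the forked application. A minimal sketch of forwarding it explicitly via javaOptions (the fallback path is just a placeholder):
fork := true
// read the property passed to sbt (-DSPARK_APP_JAR_PATH=...), with a fallback
lazy val sparkAppJarPath: String =
  sys.props.getOrElse("SPARK_APP_JAR_PATH", "/path/to/jar/file")
// forward it to the forked application's JVM so that
// sys.props.get("SPARK_APP_JAR_PATH") is defined at runtime
javaOptions += s"-DSPARK_APP_JAR_PATH=$sparkAppJarPath"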

How to pass scalacOptions (Xelide-below) to sbt via command line

I am trying to call sbt assembly from the command line, passing it a scalac compiler flag for eliding (-Xelide-below 1).
I have managed to get the flag working by adding this line to build.sbt:
scalacOptions ++= Seq("-Xelide-below", "1")
And also it's working fine when I start sbt and run the following:
$> sbt
$> set scalacOptions in ThisBuild ++= Seq("-Xelide-below", "0")
But I would like to know how to pass this in when starting sbt, so that my CI jobs can use it while doing different assembly targets (i.e. dev/test/prod).
One way to pass the elide level as a command line option is to use system properties
scalacOptions ++= Seq("-Xelide-below", sys.props.getOrElse("elide.below", "0"))
and run sbt -Delide.below=20 assembly. Quick, dirty and easy.
Another more verbose way to accomplish the same thing is to define different commands for producing test/prod artifacts.
lazy val elideLevel = settingKey[Int]("elide code below this level.")
elideLevel in Global := 0
scalacOptions ++= Seq("-Xelide-below", elideLevel.value.toString)
def assemblyCommand(name: String, level: Int) =
  Command.command(s"${name}Assembly") { s =>
    s"set elideLevel in Global := $level" ::
      "assembly" ::
      s"set elideLevel in Global := 0" ::
      s
  }
commands += assemblyCommand("test", 10)
commands += assemblyCommand("prod", 1000)
commands += assemblyCommand("test", 10)
commands += assemblyCommand("prod", 1000)
and you can run sbt testAssembly prodAssembly. This buys you a cleaner command name, in combination with the fact that you don't have to exit an active sbt-shell session to call, for example, testAssembly. My sbt-shell sessions tend to live for a long time, so I personally prefer the second option.

Combine several sbt tasks into one

I'm a little confused on the Scala/SBT documentation for creating Scala tasks. Currently I can run the following from the command line:
sbt ";set target := file(\"$PWD/package/deb-upstart\"); set serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.Upstart; debian:packageBin; set target := file(\"$PWD/package/deb-systemv\"); set serverLoading in Debian := com.typesafe.sbt.packager.archtypes.ServerLoader.SystemV; debian:packageBin; set target := file(\"$PWD/package/rpm-systemd\"); rpm:packageBin"
This resets my target each time to a different directory (deb-upstart, deb-systemv and rpm-systemd) and runs an sbt-native-packager task for each of those settings. (Yes, I realize I'm compiling it three times, but sbt-native-packager doesn't seem to have a setting for the artifact directory.)
This works fine from a bash prompt, but I've been trying to put the same target into Jenkins (replacing $PWD with $WORKSPACE) and I can't seem to get the quote escaping correct. I thought it might be easier to have a task in either my build.sbt or project/Build.scala that runs all three of those tasks, changing out the target setting each time (and replacing $PWD or $TARGET with the full path of the base directory).
I've attempted the following:
lazy val packageAll = taskKey[Unit]("Creates deb-upstart, deb-systemv and rpm-systemd packages")
packageAll := {
  target := baseDirectory.value / "package" / "deb-upstart"
  serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.Upstart
  (packageBin in Debian).value
  target := baseDirectory.value / "package" / "deb-systemv"
  serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.SystemV
  (packageBin in Debian).value
  target := baseDirectory.value / "package" / "rpm-systemd"
  (packageBin in Rpm).value
}
But the trouble is the .value causes the tasks to get evaluated before my task is even run, so they don't get the new target setting (as stated in this other question: How can I call another task from my SBT task?)
So, I figured this out for you :)
As you already mentioned, combining multiple tasks into a single one, where some of the tasks depend on the same setting, doesn't work out as expected.
Instead we do the following
Create a task for each of our custom steps, e.g. packaging debian for upstart
Define an alias that executes these commands in order
Define tasks
lazy val packageDebianUpstart = taskKey[File]("creates deb-upstart package")
lazy val packageDebianSystemV = taskKey[File]("creates deb-systemv package")
lazy val packageRpmSystemD = taskKey[File]("creates rpm-systemd package")
Example task implementation
The implementation is pretty simple and the same for each of the tasks.
// don't forget this
import com.typesafe.sbt.packager.archetypes._
packageDebianUpstart := {
  // define where you want to put your stuff
  val output = baseDirectory.value / "package" / "deb-upstart"
  // run task
  val debianFile = (packageBin in Debian).value
  // place it where you want
  IO.move(debianFile, output)
  output
}
Define alias
And now compose these tasks into a single alias with
addCommandAlias("packageAll", "; clean " +
"; set serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.SystemV" +
"; packageDebianSystemV " +
"; clean " +
"; set serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.Upstart" +
"; packageDebianUpstart " +
"; packageRpmSystemD")
You can look at the complete code here.
Update
Setting the ServerLoader inside the alias seems to be the right way to solve this. A clean is unfortunately necessary between each build for the same output format.