sbt conditional if else style configuration - scala

I want to be able to switch between parallel and serial execution of Scala tests from the command line.
Here is a working example using the "test.par" system property:
val parallelTestOpt = Option(System.getProperty("test.par"))

testOptions in IntegrationTest += Tests.Argument(
  // Configure distributor's pool size
  parallelTestOpt.map(count => s"-P$count").getOrElse("-P1")
)

lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(
    Defaults.itSettings,
    // If suites are executed in parallel
    IntegrationTest / parallelExecution := parallelTestOpt.exists(_ != "1"),
    IntegrationTest / testForkedParallel := parallelTestOpt.exists(_ != "1")
  )
The "problematic" part is the parallelTestOpt.map(count =>s"-P$count").getOrElse("-P1"). I don't want to provide default value "-P1" when the "test.par" property was not specified. What is the best practice to achieve that ?
Maybe the whole concept is wrong and I should do it in a different way ?

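If the goal is simply to omit the argument when the property is absent, one option (a minimal sketch) is to append it conditionally, since mapping over the Option and converting to Seq contributes nothing when it is empty:
testOptions in IntegrationTest ++= parallelTestOpt.map(count => Tests.Argument(s"-P$count")).toSeq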
As an alternative approach, consider separating the parallelism concern into a single-argument custom command:
commands += Command.single("testpar") { (state, numOfThreads) =>
  s"""set IntegrationTest / testOptions += Tests.Argument("-P$numOfThreads")""" ::
    "set IntegrationTest / parallelExecution := true" ::
    "set IntegrationTest / testForkedParallel := true" ::
    "IntegrationTest / test" :: state
}
and execute with, say, testpar 6 to run with a pool of 6 threads.
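Note that from the operating-system shell the whole command has to be quoted so the argument reaches it, e.g.:
sbt "testpar 6"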
Addressing the comment, for compile-time safety try
commands += Command.single("testpar") { (state, numOfThreads) =>
  val extracted = Project.extract(state)
  val stateWithParallel = extracted.appendWithSession(
    Seq(
      IntegrationTest / testOptions += Tests.Argument(s"-P$numOfThreads"),
      IntegrationTest / parallelExecution := true,
      IntegrationTest / testForkedParallel := true
    ),
    state
  )
  extracted.runTask(IntegrationTest / test, stateWithParallel)
  state
}
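Note that the command returns the original state rather than the one produced by runTask, so the appended settings only apply while the command runs and the session is left unchanged afterwards.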

How to apply common test configuration to all projects?

I'm migrating an old project to Scala 3. The build.sbt is as follows:
import Dependencies._

inThisBuild(
  Seq(
    scalaVersion := "2.12.7",
    scalacOptions ++= Seq(
      "-unchecked",
      // more
    )
  )
  ++ inConfig(Test)(Seq(
    testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-o", "-e"),
    // more
  ))
)

lazy val root = (project in file("."))
  .aggregate(
    `test-util`
  )

lazy val `test-util` = project
Now I want to separate the contents of inThisBuild for legibility:
import Dependencies._

ThisBuild / scalaVersion := "3.0.1"
ThisBuild / scalacOptions ++= Seq(
  "-unchecked",
  // more
)

lazy val testSettings = inConfig(Test)(
  Seq(
    testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-o", "-e"),
    // more
  )
)

lazy val root = (project in file("."))
  .aggregate(
    `test-util`
  )
  .settings(testSettings)

lazy val `test-util` = project
As you can see, I'm having to apply the testSettings to each project. Ideally, I'd like to do something like ThisBuild / Test := testSettings, but that is not valid syntax.
Is there a way to apply the testSettings to all projects without having to explicitly set .settings(testSettings)?
Edit:
I understand I can write each line of testSettings with ThisBuild / Test prefix, but I’d rather not repeat the same prefix. I’m looking for something like what I’ve done with scalacOptions.
Is there a way to apply the testSettings to all projects without having to explicitly set .settings(testSettings)
Consider creating an auto plugin which can inject settings automatically into all the sub-projects, for example in project/CommonTestSettings.scala:
import sbt._
import Keys._

object CommonTestSettings extends sbt.AutoPlugin {
  override def requires = plugins.JvmPlugin
  override def trigger = allRequirements

  override lazy val projectSettings =
    inConfig(Test)(
      Seq(
        testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-o", "-e")
        // more
      )
    )
}
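Because trigger is allRequirements and the only requirement is JvmPlugin, the plugin is enabled automatically on every JVM sub-project, with no explicit enablePlugins call needed.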
You can verify with show testOptions, which should reveal the common settings in all the sub-projects. For example, in my project where root aggregates foo and bar I get something like:
sbt:sbt-multi-project> show testOptions
[info] foo / Test / testOptions
[info] List(Argument(Some(TestFramework(org.scalatest.tools.Framework, org.scalatest.tools.ScalaTestFramework)),List(-o, -e)))
[info] bar / Test / testOptions
[info] List(Argument(Some(TestFramework(org.scalatest.tools.Framework, org.scalatest.tools.ScalaTestFramework)),List(-o, -e)))
[info] Test / testOptions
[info] List(Argument(Some(TestFramework(org.scalatest.tools.Framework, org.scalatest.tools.ScalaTestFramework)),List(-o, -e)))
As you have done with scalaVersion and scalacOptions, you can do the same with Test. For example:
lazy val testSettings = inConfig(Test)(
  Seq(
    testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-o", "-e"),
    // more
  )
)
Can be rewritten as:
ThisBuild / Test / testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-o", "-e")
Is this what you want? Or do you want to pass a sequence directly?
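If you do want to pass a sequence directly, the same scope should also accept ++= (a sketch along the lines of the rewrite above):
ThisBuild / Test / testOptions ++= Seq(
  Tests.Argument(TestFrameworks.ScalaTest, "-o", "-e")
  // more
)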

changing settings in sbt task

I'm trying to define a custom task in sbt that will run the main class in debug mode.
lazy val root = (project in file("."))
  .settings(
    fork in run := true
  )

lazy val runDebug = inputKey[Unit]("run in debug")

runDebug := {
  javaOptions in run += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" // this doesn't work
  (run in Compile).evaluated
}
I cannot make sbt set javaOptions correctly. How do I use Def.settings with an input task to define another input task?
Tasks cannot modify settings; instead, try commands, like so:
commands += Command.command("runDebug") { state =>
  s"""set javaOptions in run += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"""" ::
    "run in Compile" :: state
}
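Keep in mind that settings changed with set stay in effect for the rest of the sbt session; session clear (or reload) restores the values from the build definition.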

How can I set javaOptions in a custom sbt command

I am trying to add a custom command to SBT (using version 1.2.8 at the moment) to start my application with different javaOptions. I managed to create 3 separate tasks to accomplish this for three of the startup possibilities, but now I want to generalize this to allow any startup parameter.
So far I managed to create the following build.sbt, but the console output shows that sbt runNode node3 still uses the original java.library.path as set by run / javaOptions ++= Seq(...) in the project settings. The run / javaOptions += s"-Djava.library.path=./target/native/$arg" inside runNodeAction is apparently ignored.
import com.typesafe.sbt.SbtMultiJvm.multiJvmSettings
import com.typesafe.sbt.SbtMultiJvm.MultiJvmKeys.MultiJvm
import Dependencies._

lazy val `akka-crdt-features` = project
  .in(file("."))
  .settings(multiJvmSettings: _*)
  .settings(
    organization := "nl.about42.akkamavericks",
    scalaVersion := "2.12.8",
    Compile / scalacOptions ++= Seq("-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls", "-Xlint"),
    Compile / javacOptions ++= Seq("-Xlint:unchecked", "-Xlint:deprecation"),
    run / javaOptions ++= Seq("-Xms128m", "-Xmx1024m", "-Djava.library.path=./target/native"),
    //run / javaOptions ++= Seq("-agentlib:hprof=heap=dump,format=b"),
    libraryDependencies ++= akkaDependencies ++ otherDependencies,
    run / fork := true,
    Compile / run / mainClass := Some("nl.about42.akkamavericks.cluster.ClusterCrdtApp"),
    // disable parallel tests
    Test / parallelExecution := false,
    licenses := Seq(("CC BY 4.0", url("https://creativecommons.org/licenses/by/4.0/"))),
    commands ++= Seq(runNodeCommand),
    Global / cancelable := true
  )
  .configs(MultiJvm)

// setup commands to run each individual node, using a separate folder for the extracted libsigar
lazy val runNode1 = taskKey[Unit]("Run node 1")
lazy val runNode2 = taskKey[Unit]("Run node 2")
lazy val runNode3 = taskKey[Unit]("Run node 3")

runNode1 / fork := true
runNode2 / fork := true
runNode3 / fork := true

runNode1 / javaOptions += "-Djava.library.path=./target/native/node1"
runNode2 / javaOptions += "-Djava.library.path=./target/native/node2"
runNode3 / javaOptions += "-Djava.library.path=./target/native/node3"

fullRunTask(runNode1, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node1")
fullRunTask(runNode2, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node2")
fullRunTask(runNode3, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node3")

// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  run / javaOptions += s"-Djava.library.path=./target/native/$arg"
  val runCommand: Exec = Exec.apply(s"run $arg", state.source)
  state.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}

val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
The output of sbt runNode node3 shows (relevant lines):
[error] no libsigar-amd64-linux.so in java.library.path: [./target/native]
[error] org.hyperic.sigar.SigarException: no libsigar-amd64-linux.so in java.library.path: [./target/native]
I expect it to mention ./target/native/node3.
My goal is to have just the sbt command definition, so I can call runNode [anyNodeName] and have sbt start my application with the appropriate java.library.path setting and startup argument.
Update:
I partially succeeded with the following:
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val runCommand: Exec = Exec.apply(s"run $arg", stateWithNewOptions.source)
  stateWithNewOptions.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}
But that leaves the java.library.path set to the value of the latest run (it does not revert back to the default).
With the help of Mario Galic (https://stackoverflow.com/a/54488121/2037054) who answered Conditional scalacSettings / settingKey, I managed to get it working:
// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val (s, _) = Project.extract(stateWithNewOptions).runInputTask(Compile / run, s" $arg", stateWithNewOptions)
  s
}

val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
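Invoked from the operating-system shell as, for example:
sbt "runNode node3"
the application should then start with -Djava.library.path=./target/native/node3 applied.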

Compile with different settings in different commands

I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands which will compile only some of the files inside this project. The testScalalib command added above, for instance, is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._

  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(_ filter (f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations, like Compile and Test, with the appropriate setting values to suit your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands; I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
This is how you can do it using sbt 0.13.2 and Build.scala (you could of course do the same in build.sbt, or in an older sbt version with different syntax):
import sbt._
import Keys._

object MyBuild extends Build {

  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile everything will get compiled, but when you call api:compile only the sources matching the filter predicate will be compiled.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or an includeFilter.
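For example, a name-based sketch with excludeFilter (as an alternative to overriding unmanagedSources; it covers only the file-name part of the predicate, not the scalalibrary/ path check):
excludeFilter in (Api, unmanagedSources) := HiddenFileFilter || "Typers.scala"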

SBT doesn't call Test.Setup on a Play2 project

Here is my SBT build:
val main = play.Project(appName, appVersion, appDependencies).settings(defaultScalaSettings: _*)
  .settings(
    scalaVersion := "2.10.0",
    resolvers += .....
  )
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .settings(
    testOptions in Test += Tests.Setup(() => println("Setup Test yoohooo")),
    testOptions in Test += Tests.Cleanup(() => println("Cleanup Test yoohoo")),
    scalaSource in Test <<= baseDirectory / "test/unit",
    parallelExecution in Test := true,
    testOptions in IntegrationTest += Tests.Setup(() => println("Setup Integration Test yoohoo")),
    testOptions in IntegrationTest += Tests.Cleanup(() => println("Cleanup Integration Test yoohoo")),
    scalaSource in IntegrationTest <<= baseDirectory / "test/integration",
    parallelExecution in IntegrationTest := false
  )
I can launch both tasks, test and it:test, but it only prints the text for IntegrationTest, not for the regular Test.
I see that Play2 has some related default settings:
testOptions in Test += Tests.Setup { loader =>
  loader.loadClass("play.api.Logger").getMethod("init", classOf[java.io.File]).invoke(null, new java.io.File("."))
},
testOptions in Test += Tests.Cleanup { loader =>
  loader.loadClass("play.api.Logger").getMethod("shutdown").invoke(null)
},
Isn't my build supposed to override these settings?
By the way, can I call an external library or a test source class in this Setup?
Maybe this is a constraint of sbt.
The official sbt documentation says:
Setup and Cleanup actions are not supported when a group is forked.
https://github.com/sbt/sbt/blob/v0.12.2/src/sphinx/Detailed-Topics/Testing.rst#forking-tests
http://www.scala-sbt.org/0.12.2/docs/Detailed-Topics/Testing.html
fork in Test := true
is the default from Play 2.1.0:
https://github.com/playframework/Play20/pull/654/files#L5L110
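So if the Setup/Cleanup hooks must run, one option (a sketch; it changes how the tests are executed) is to disable forking for the Test configuration:
fork in Test := false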