How can I set javaOptions in a custom sbt command - scala

I am trying to add a custom command to SBT (using version 1.2.8 at the moment) to start my application with different javaOptions. I managed to create 3 separate tasks to accomplish this for three of the startup possibilities, but now I want to generalize this to allow any startup parameter.
So far I have managed to create the following build.sbt, but the console output shows that sbt runNode node3 still uses the original library path as set by the project-level run / javaOptions ++= Seq(..., "-Djava.library.path=./target/native") setting.
The run / javaOptions += s"-Djava.library.path=./target/native/$arg" line inside runNodeAction is apparently ignored.
import com.typesafe.sbt.SbtMultiJvm.multiJvmSettings
import com.typesafe.sbt.SbtMultiJvm.MultiJvmKeys.MultiJvm
import Dependencies._
lazy val `akka-crdt-features` = project
  .in(file("."))
  .settings(multiJvmSettings: _*)
  .settings(
    organization := "nl.about42.akkamavericks",
    scalaVersion := "2.12.8",
    Compile / scalacOptions ++= Seq("-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls", "-Xlint"),
    Compile / javacOptions ++= Seq("-Xlint:unchecked", "-Xlint:deprecation"),
    run / javaOptions ++= Seq("-Xms128m", "-Xmx1024m", "-Djava.library.path=./target/native"),
    //run / javaOptions ++= Seq("-agentlib:hprof=heap=dump,format=b"),
    libraryDependencies ++= akkaDependencies ++ otherDependencies,
    run / fork := true,
    Compile / run / mainClass := Some("nl.about42.akkamavericks.cluster.ClusterCrdtApp"),
    // disable parallel tests
    Test / parallelExecution := false,
    licenses := Seq(("CC BY 4.0", url("https://creativecommons.org/licenses/by/4.0/"))),
    commands ++= Seq(runNodeCommand),
    Global / cancelable := true
  )
  .configs(MultiJvm)
// setup commands to run each individual node, using a separate folder for the extracted libsigar
lazy val runNode1 = taskKey[Unit]("Run node 1")
lazy val runNode2 = taskKey[Unit]("Run node 2")
lazy val runNode3 = taskKey[Unit]("Run node 3")
runNode1 / fork := true
runNode2 / fork := true
runNode3 / fork := true
runNode1 / javaOptions += "-Djava.library.path=./target/native/node1"
runNode2 / javaOptions += "-Djava.library.path=./target/native/node2"
runNode3 / javaOptions += "-Djava.library.path=./target/native/node3"
fullRunTask(runNode1, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node1")
fullRunTask(runNode2, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node2")
fullRunTask(runNode3, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node3")
// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  run / javaOptions += s"-Djava.library.path=./target/native/$arg"
  val runCommand: Exec = Exec.apply(s"run $arg", state.source)
  state.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}
val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
The output of sbt runNode node3 shows (relevant lines):
[error] no libsigar-amd64-linux.so in java.library.path: [./target/native]
[error] org.hyperic.sigar.SigarException: no libsigar-amd64-linux.so in java.library.path: [./target/native]
I expect it to mention ./target/native/node3.
My goal is to have just the sbt command definition, so I can call runNode [anyNodeName] and have SBT start my application with the appropriate java.library.path setting and startup argument.
Update:
I partially succeeded with the following:
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val runCommand: Exec = Exec.apply(s"run $arg", stateWithNewOptions.source)
  stateWithNewOptions.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}
But that leaves the java.library.path set to whatever the latest run used (it does not revert to the default).
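A possible workaround sketch (not what I ended up using below) would be to queue plain command strings instead of touching the State directly, relying on sbt's set command and the built-in session clear; note that session clear drops all session settings, not just this one:

val runNodeAction: (State, String) => State = { (state, arg) =>
  // set the library path for this session, run, then drop the session setting again
  s"""set run / javaOptions += "-Djava.library.path=./target/native/$arg"""" ::
    s"run $arg" ::
    "session clear" ::
    state
}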

With the help of Mario Galic (https://stackoverflow.com/a/54488121/2037054), who answered Conditional scalacSettings / settingKey, I managed to get it working:
// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val (s, _) = Project.extract(stateWithNewOptions).runInputTask(Compile / run, s" $arg", stateWithNewOptions)
  s
}
val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
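With that in place, invoking the command from the sbt shell looks roughly like this (the prompt name is assumed from the project id; the show call is just a quick way to inspect which javaOptions the session currently carries):

sbt:akka-crdt-features> runNode node3
sbt:akka-crdt-features> show run / javaOptions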

Related

sbt conditional if else style configuration

I want to be able to switch between parallel and serial execution of Scala tests using the command line.
A working example with the "test.par" system property:
val parallelTestOpt = Option(System.getProperty("test.par"))

testOptions in IntegrationTest += Tests.Argument(
  //Configure distributor's pool size
  parallelTestOpt.map(count => s"-P$count").getOrElse("-P1")
)

lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings,
    //If suites are executed in parallel
    IntegrationTest / parallelExecution := parallelTestOpt.exists(_ != "1"),
    IntegrationTest / testForkedParallel := parallelTestOpt.exists(_ != "1")
  )
The "problematic" part is the parallelTestOpt.map(count =>s"-P$count").getOrElse("-P1"). I don't want to provide default value "-P1" when the "test.par" property was not specified. What is the best practice to achieve that ?
Maybe the whole concept is wrong and I should do it in a different way ?
As an alternative approach, consider separating the parallelism concern into a single-argument custom command:
commands += Command.single("testpar") { (state, numOfThreads) =>
  s"""set IntegrationTest / testOptions += Tests.Argument("-P$numOfThreads")""" ::
    "set IntegrationTest / parallelExecution := true" ::
    "set IntegrationTest / testForkedParallel := true" ::
    "IntegrationTest / test" :: state
}
and execute with, say, testpar 6 to run with a pool of 6 threads.
Addressing the comment, for compile-time safety try:
commands += Command.single("testpar") { (state, numOfThreads) =>
  val extracted = Project.extract(state)
  val stateWithParallel = extracted.appendWithSession(
    Seq(
      IntegrationTest / testOptions += Tests.Argument(s"-P$numOfThreads"),
      IntegrationTest / parallelExecution := true,
      IntegrationTest / testForkedParallel := true
    ),
    state
  )
  extracted.runTask(IntegrationTest / test, stateWithParallel)
  state
}

changing settings in sbt task

I'm trying to define a custom task in sbt that will run the main class in debug mode.
lazy val root = (project in file("."))
  .settings(
    fork in run := true
  )

lazy val runDebug = inputKey[Unit]("run in debug")

runDebug := {
  javaOptions in run += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" //this doesn't work
  (run in Compile).evaluated
}
I cannot make sbt set javaOptions correctly. How can I use Def.settings with an inputTask to define another inputTask?
Tasks cannot modify settings; instead, try commands, like so:
commands += Command.command("runDebug") { state =>
  s"""set javaOptions in run += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"""" ::
    "run in Compile" :: state
}
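Along the same lines as the appendWithSession approach shown for runNode above, a type-checked variant is also possible. This is only a sketch, assuming sbt 1.x slash syntax rather than the question's older in-style scoping:

commands += Command.command("runDebug") { state =>
  val extracted = Project.extract(state)
  // append the debug agent option for this session only
  val debugState = extracted.appendWithSession(
    Seq(run / javaOptions += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"),
    state
  )
  // run the main class with the modified options and keep the resulting state
  val (s, _) = Project.extract(debugState).runInputTask(Compile / run, "", debugState)
  s
}

The difference from the string-based set version is that a typo in the setting expression fails when the build definition is compiled instead of when the command is run.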

Compile with different settings in different commands

I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands which will compile only some files inside this project. The testScalalib command added above for instance is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._
  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(_ filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations like compile and test, and have the appropriate settings values that would suit your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands; I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
This is how you can do it using sbt 0.13.2 and Build.scala (you could of course do the same in build.sbt, or in an older sbt version with slightly different syntax):
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile, everything will get compiled, but when you call api:compile, only the classes matching the filter predicate are compiled.
Btw, you may also want to look into the possibility of defining different unmanagedSourceDirectories and/or defining an includeFilter.
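For example, a minimal sketch of that filter route (sbt 0.13 syntax to match the answer above; the predicate is taken from the question, but the exact scoping shown here is my assumption):

// exclude Typers.scala and anything under a scalalibrary/ directory from compilation
excludeFilter in (Compile, unmanagedSources) :=
  (excludeFilter in (Compile, unmanagedSources)).value ||
    GlobFilter("Typers.scala") ||
    new SimpleFileFilter(_.getAbsolutePath.contains("scalalibrary/"))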

SBT: How to make one task depend on another in multi-project builds, and not run in the root project?

For my multi-project build, I'm trying to create a verify task that just results in scct:test and then scalastyle being executed in order. I would like scct:test to execute for all the subprojects, but not the top-level project. (If it executes for the top-level project, I get "timed out waiting for coverage report" from scct, since there's no source and no tests in that project.) What I had thought to do was to create verify as a task with dependencies on scct:test and scalastyle. This has turned out to be fairly baroque. Here is my Build.scala from my top-level project/ directory:
object MyBuild extends Build {
  val verifyTask = TaskKey[Unit]("verify", "Compiles, runs tests via scct:test and then runs scalastyle")

  val scctTestTask = (test in ScctPlugin.Scct).scopedKey
  val scalastyleTask = PluginKeys.scalastyleTarget.scopedKey

  lazy val root = Project("rootProject",
    file("."),
    settings = Defaults.defaultSettings ++
      ScalastylePlugin.Settings ++
      ScctPlugin.instrumentSettings ++
      ScctPlugin.mergeReportSettings ++
      Seq(
        verifyTask in Global := {},
        verifyTask <<= verifyTask.dependsOn(scctTestTask, scalastyleTask)
      )
  ) aggregate(subproject_1, subproject_2)

  lazy val subproject_1 = Project(id = "subproject_1", base = file("subproject_1"))
  lazy val subproject_2 = Project(id = "subproject_2", base = file("subproject_2"))
}
However, the verify task only seems to exist for the root project; when I run it I don't see the same task being run in the subprojects. This is exactly the opposite of what I want; I'd like to issue sbt verify and have scct:test and scalastyle run in each of the subprojects but not in the top-level project. How might I go about doing that?
solution 1: define verifyTask in subprojects
The first thing to note is that if you want some task (verify, test, etc.) to run in certain projects, you need to define it scoped to those subprojects. So in your case, the most straightforward thing to do is to define verifyTask in subproject_1 and subproject_2.
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.4"

lazy val verify = taskKey[Unit]("verify")

def verifySettings = Seq(
  skip in verify := false,
  verify := (Def.taskDyn {
    val sk = (skip in verify).value
    if (sk) Def.task { println("skipping verify...") }
    else (test in Test)
  }).value
)

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    verifySettings,
    scalaVersion in ThisBuild := "2.12.4",
    skip in verify := true
  )

lazy val sub1 = (project in file("sub1"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )

lazy val sub2 = (project in file("sub2"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )
solution 2: ScopeFilter
There was a recent Reddit thread that mentioned this question, so I'll post what I've done there.
If you want to manually aggregate on some subprojects, there's also a technique called ScopeFilter.
Note that I am using sbt 1.x here, but it should work with sbt 0.13 with some minor changes.
lazy val packageAll = taskKey[Unit]("package all the projects")
lazy val myTask = inputKey[Unit]("foo")

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    scalaVersion in ThisBuild := "2.12.4",
    packageAll := {
      (packageBin in Compile).all(nonRootsFilter).value
      ()
    },
    myTask := {
      packageAll.value
    }
  )

lazy val sub1 = (project in file("sub1"))
lazy val sub2 = (project in file("sub2"))

def nonRootsFilter = {
  import sbt.internal.inc.ReflectUtilities
  def nonRoots: List[ProjectReference] =
    allProjects filter {
      case LocalProject(p) => p != "root"
      case _ => false
    }
  def allProjects: List[ProjectReference] =
    ReflectUtilities.allVals[Project](this).values.toList map { p =>
      p: ProjectReference
    }
  ScopeFilter(inProjects(nonRoots: _*), inAnyConfiguration)
}
In the above, myTask depends on packageAll, which aggregates (packageBin in Compile) for all non-root subprojects.
sbt:root> myTask
[info] Packaging /Users/xxx/packageall/sub1/target/scala-2.12/sub1_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /Users/xxx/packageall/sub2/target/scala-2.12/sub2_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 0 s, completed Feb 2, 2018 7:23:23 PM
I may be wrong, but you are defining the verify task dependency only for the current project.
Maybe you can try:
Seq(
  verifyTask in Global := {},
  verifyTask <<= (verifyTask in Global).dependsOn(scctTestTask, scalastyleTask)
)
Or you can add the verifyTask settings to all your modules.
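For instance, a sketch of that second option in the same pre-1.x style the question uses (reusing the asker's verifyTask, scctTestTask and scalastyleTask keys; the shared verifySettings value is just an illustration, not code from the answer):

lazy val verifySettings = Seq(
  verifyTask := {},
  verifyTask <<= verifyTask.dependsOn(scctTestTask, scalastyleTask)
)

lazy val subproject_1 = Project(id = "subproject_1", base = file("subproject_1"))
  .settings(verifySettings: _*)
lazy val subproject_2 = Project(id = "subproject_2", base = file("subproject_2"))
  .settings(verifySettings: _*)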

Is it possible to re-launch and test xsbti.AppMain derived application from sbt?

I'm developing an sbt-launched application with a custom command line interface.
The problem is that every time I want to test it, I have to remove the previously published boot directory, recompile and publish the artefacts locally, and then finally run the app and test it manually. Part of this is accomplished by running external shell scripts.
How could I make sbt do the job for me? I've already made the skeleton command for it:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked"),
    commands ++= Seq(launchApp))
)

val launchApp = Command.command("launch") { state =>
  state.log.info("Re-launching app")
  state
}
Create a launcher configuration file, e.g. fqb.build.properties, in the project's main directory.
Create a script that launches the application:
#!/usr/bin/env bash
java -jar /path/to/sbt-launch.jar "$@"
Define task and command:
lazy val launcherTask = TaskKey[Unit]("launch", "Starts the application from the locally published JAR")

lazy val launchApp: Seq[Setting[_]] = Seq(
  commands += Command.command("publish-launch") { state =>
    state.log.info("Re-launching app")
    val modulesProj = modules.id
    s"$modulesProj/publishLocal" ::
      "publishLocal" ::
      launcherTask.key.label ::
      state
  },
  launcherTask := {
    "launch @fqb.build.properties" !<
  }
)
Add it as a setting to a project:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked")
  ) ++ launchApp
)
Remember to delete the old ~/.<app_name> directory when re-deploying, so the changes can take effect.
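That clean-up step could itself be wired into the build as a small task; this is only a sketch, with a hypothetical directory name (.my-app) standing in for your application's boot directory:

// hypothetical helper, not part of the original answer:
// deletes the launcher's boot directory so a re-deploy starts clean
lazy val cleanBoot = taskKey[Unit]("Deletes the previously published boot directory")

cleanBoot := {
  val bootDir = Path.userHome / ".my-app" // assumed location; adjust to your app name
  IO.delete(bootDir)
}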