changing settings in sbt task - scala

I'm trying to define a custom task in sbt that will run the main class in debug mode.

lazy val root = (project in file("."))
  .settings(
    fork in run := true
  )

lazy val runDebug = inputKey[Unit]("run in debug")

runDebug := {
  javaOptions in run += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" // this doesn't work
  (run in Compile).evaluated
}

I cannot make sbt set javaOptions correctly. How do I use Def.settings with an input task to define another input task?

Tasks cannot modify settings. Instead, try a command, like so:

commands += Command.command("runDebug") { state =>
  s"""set javaOptions in run += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"""" ::
    "run" :: state
}
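If you want to vary the debug port per invocation, the same idea extends to a command that takes an argument. This is a sketch, not part of the original answer; Command.single passes the single argument (here used as the port) to the function:

```scala
// Sketch: `runDebug 5006` enables the JDWP agent on the given port, then runs the app.
commands += Command.single("runDebug") { (state, port) =>
  s"""set javaOptions in run += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=$port"""" ::
    "run" :: state
}
```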

Related

How can I set javaOptions in a custom sbt command

I am trying to add a custom command to sbt (using version 1.2.8 at the moment) to start my application with different javaOptions. I managed to create three separate tasks to accomplish this for three of the startup possibilities, but now I want to generalize this to allow any startup parameter.
So far I managed to create the following build.sbt, but the console output shows that sbt runNode node3 still uses the original java.library.path as set on line 13 (run / javaOptions ++= Seq(...)).
The run / javaOptions += s"-Djava.library.path=./target/native/$arg" on line 46 is apparently ignored.
import com.typesafe.sbt.SbtMultiJvm.multiJvmSettings
import com.typesafe.sbt.SbtMultiJvm.MultiJvmKeys.MultiJvm
import Dependencies._

lazy val `akka-crdt-features` = project
  .in(file("."))
  .settings(multiJvmSettings: _*)
  .settings(
    organization := "nl.about42.akkamavericks",
    scalaVersion := "2.12.8",
    Compile / scalacOptions ++= Seq("-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls", "-Xlint"),
    Compile / javacOptions ++= Seq("-Xlint:unchecked", "-Xlint:deprecation"),
    run / javaOptions ++= Seq("-Xms128m", "-Xmx1024m", "-Djava.library.path=./target/native"),
    //run / javaOptions ++= Seq("-agentlib:hprof=heap=dump,format=b"),
    libraryDependencies ++= akkaDependencies ++ otherDependencies,
    run / fork := true,
    Compile / run / mainClass := Some("nl.about42.akkamavericks.cluster.ClusterCrdtApp"),
    // disable parallel tests
    Test / parallelExecution := false,
    licenses := Seq(("CC BY 4.0", url("https://creativecommons.org/licenses/by/4.0/"))),
    commands ++= Seq(runNodeCommand),
    Global / cancelable := true
  )
  .configs(MultiJvm)

// setup commands to run each individual node, using a separate folder for the extracted libsigar
lazy val runNode1 = taskKey[Unit]("Run node 1")
lazy val runNode2 = taskKey[Unit]("Run node 2")
lazy val runNode3 = taskKey[Unit]("Run node 3")

runNode1 / fork := true
runNode2 / fork := true
runNode3 / fork := true

runNode1 / javaOptions += "-Djava.library.path=./target/native/node1"
runNode2 / javaOptions += "-Djava.library.path=./target/native/node2"
runNode3 / javaOptions += "-Djava.library.path=./target/native/node3"

fullRunTask(runNode1, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node1")
fullRunTask(runNode2, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node2")
fullRunTask(runNode3, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node3")

// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  run / javaOptions += s"-Djava.library.path=./target/native/$arg"
  val runCommand: Exec = Exec.apply(s"run $arg", state.source)
  state.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}
val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
The output of sbt runNode node3 shows (relevant lines):
[error] no libsigar-amd64-linux.so in java.library.path: [./target/native]
[error] org.hyperic.sigar.SigarException: no libsigar-amd64-linux.so in java.library.path: [./target/native]
I expect it to mention ./target/native/node3.
My goal is to have just the sbt command definition, so I can call runNode [anyNodeName] and have sbt start my application with the appropriate java.library.path setting and startup argument.
Update:
I partially succeeded with the following:
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val runCommand: Exec = Exec.apply(s"run $arg", stateWithNewOptions.source)
  stateWithNewOptions.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}

But that leaves the javaOptions set to the latest run (they do not revert to the default).
With the help of Mario Galic (https://stackoverflow.com/a/54488121/2037054), who answered Conditional scalacSettings / settingKey, I managed to get it working:
// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val (s, _) = Project.extract(stateWithNewOptions).runInputTask(Compile / run, s" $arg", stateWithNewOptions)
  s
}
val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
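If the leftover session setting bothers you (the update above notes that the javaOptions stay set to the latest run), one possible variant, an untested sketch rather than part of the accepted answer, is to evaluate the run against the modified state but return the original one:

```scala
// Sketch: run with the extra option, then discard the session-level change
// by returning the original state instead of the modified one.
val runNodeAction: (State, String) => State = { (state, arg) =>
  val tempState = Project.extract(state).appendWithSession(
    Seq(run / javaOptions += s"-Djava.library.path=./target/native/$arg"),
    state
  )
  Project.extract(tempState).runInputTask(Compile / run, s" $arg", tempState)
  state // the appended javaOptions do not outlive this command
}
```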

How to create a custom sbt task that sets java options before running the app

I'd like to have a new sbt task runDev that is the equivalent of setting a system property first, then running run:
sbt '; set javaOptions += "-Dlogback.configurationFile=logback-dev.xml" ; run'
How can I do this in sbt?
lazy val runDev = taskKey[Unit]("Run with custom java options")
fork in runDev := true
javaOptions in runDev += "-Dlogback.configurationFile=logback-dev.xml"
fullRunTask(runDev, Compile, "mainClass")
One way:

lazy val helloRun = inputKey[Unit]("Run as a task")

helloRun := {
  javaOptions += "-Dlogback.configurationFile=logback-dev.xml"
  (run in Compile).evaluated
}
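Note that, as the first answer on this page points out, appending to a setting inside a task body has no effect, so the snippet above will not actually change the options. A command-based sketch in the same spirit (using the runDev name from the question; this is not part of the original answer):

```scala
// Sketch: replicate `; set javaOptions += "..." ; run` as one command.
commands += Command.command("runDev") { state =>
  """set javaOptions += "-Dlogback.configurationFile=logback-dev.xml"""" ::
    "run" :: state
}
```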

Scala integration test not picking config properties

I have a Scala project with integration tests and the following folder structure:

My Project
- app
- it
  - com.anjib.my.pkg
  - resources
    - application.regression.devl.conf

I want to override one of the properties by putting it in the application.regression.devl.conf file, but when running the integration tests it is still pulling the top-level property.
For example, somewhere in the project there is

someKey=someValue

I put

someKey=otherValue

in application.regression.devl.conf, but the integration tests are still picking up someValue.
My build.sbt looks like:

lazy val `my-project` = (project in file("."))
  ....
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  ....
)
The config is loaded as:

def apply(): Config = {
  val environment = determineEnvironment
  val defaultConfig = ConfigFactory.load()
  val envConfig: Config = ConfigFactory.load(s"application.$environment.conf")
  val regressionSuiteConfig = ConfigFactory.load(s"application.regression.$environment.conf")
  regressionSuiteConfig.withFallback(envConfig).withFallback(defaultConfig)
}

Update: If I do Ctrl + Shift + F10 in IntelliJ, it does pick up otherValue, so the issue only occurs with sbt it:test.
Try setting the javaOptions in sbt like this:

javaOptions in IntegrationTest += "-Dconfig.resource=" + System.getProperty("config.resource", "application.regression.devl.conf")

With this, sbt will set the config.resource parameter to "application.regression.devl.conf" if nothing is passed explicitly.
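One caveat: javaOptions are only passed to the test JVM when forking is enabled, so this assumes the IntegrationTest configuration forks. A one-line sketch, in case it is not already set in the elided parts of the build:

```scala
// javaOptions only take effect in a forked JVM; without this they are ignored.
fork in IntegrationTest := true
```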

SBT: How to make one task depend on another in multi-project builds, and not run in the root project?

For my multi-project build, I'm trying to create a verify task that just results in scct:test and then scalastyle being executed in order. I would like scct:test to execute for all the subprojects, but not the top-level project. (If it executes for the top-level project, I get "timed out waiting for coverage report" from scct, since there's no source and no tests in that project.) What I had thought to do was to create verify as a task with dependencies on scct:test and scalastyle. This has turned out to be fairly baroque. Here is my Build.scala from my top-level project/ directory:
object MyBuild extends Build {
  val verifyTask = TaskKey[Unit]("verify", "Compiles, runs tests via scct:test and then runs scalastyle")

  val scctTestTask = (test in ScctPlugin.Scct).scopedKey
  val scalastyleTask = PluginKeys.scalastyleTarget.scopedKey

  lazy val root = Project("rootProject",
    file("."),
    settings = Defaults.defaultSettings ++
      ScalastylePlugin.Settings ++
      ScctPlugin.instrumentSettings ++
      ScctPlugin.mergeReportSettings ++
      Seq(
        verifyTask in Global := {},
        verifyTask <<= verifyTask.dependsOn(scctTestTask, scalastyleTask)
      )
  ) aggregate(lift_webapp, selenium_tests)

  lazy val subproject_1 = Project(id = "subproject_1", base = file("subproject_1"))
  lazy val subproject_2 = Project(id = "subproject_2", base = file("subproject_2"))
}
However, the verify task only seems to exist for the root project; when I run it I don't see the same task being run in the subprojects. This is exactly the opposite of what I want; I'd like to issue sbt verify and have scct:test and scalastyle run in each of the subprojects but not in the top-level project. How might I go about doing that?
solution 1: define verifyTask in subprojects
The first thing to note is that if you want some task (verify, test, etc.) to run in certain projects, you need to define it scoped to those subprojects. So in your case, the most straightforward thing is to define verifyTask in subproject_1 and subproject_2.

lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.4"
lazy val verify = taskKey[Unit]("verify")

def verifySettings = Seq(
  skip in verify := false,
  verify := (Def.taskDyn {
    val sk = (skip in verify).value
    if (sk) Def.task { println("skipping verify...") }
    else (test in Test)
  }).value
)

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    verifySettings,
    scalaVersion in ThisBuild := "2.12.4",
    skip in verify := true
  )

lazy val sub1 = (project in file("sub1"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )

lazy val sub2 = (project in file("sub2"))
  .settings(
    verifySettings,
    libraryDependencies += scalaTest % Test
  )
solution 2: ScopeFilter
There was a recent Reddit thread that mentioned this question, so I'll post what I've done there.
If you want to manually aggregate on some subprojects, there's also a technique called ScopeFilter.
Note that I am using sbt 1.x here, but it should work with sbt 0.13 with some minor changes.
lazy val packageAll = taskKey[Unit]("package all the projects")
lazy val myTask = inputKey[Unit]("foo")

lazy val root = (project in file("."))
  .aggregate(sub1, sub2)
  .settings(
    scalaVersion in ThisBuild := "2.12.4",
    packageAll := {
      (packageBin in Compile).all(nonRootsFilter).value
      ()
    },
    myTask := {
      packageAll.value
    }
  )

lazy val sub1 = (project in file("sub1"))
lazy val sub2 = (project in file("sub2"))

def nonRootsFilter = {
  import sbt.internal.inc.ReflectUtilities
  def nonRoots: List[ProjectReference] =
    allProjects filter {
      case LocalProject(p) => p != "root"
      case _               => false
    }
  def allProjects: List[ProjectReference] =
    ReflectUtilities.allVals[Project](this).values.toList map { p =>
      p: ProjectReference
    }
  ScopeFilter(inProjects(nonRoots: _*), inAnyConfiguration)
}
In the above, myTask depends on packageAll, which aggregates (packageBin in Compile) for all non-root subprojects.
sbt:root> myTask
[info] Packaging /Users/xxx/packageall/sub1/target/scala-2.12/sub1_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /Users/xxx/packageall/sub2/target/scala-2.12/sub2_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 0 s, completed Feb 2, 2018 7:23:23 PM
I may be wrong, but you are defining the verify task dependency only for the current project.
Maybe you can try:
Seq(
  verifyTask in Global := {},
  verifyTask <<= (verifyTask in Global).dependsOn(scctTestTask, scalastyleTask)
)
Or you can add the verifyTask settings to all your modules.

Is it possible to re-launch and test xsbti.AppMain derived application from sbt?

I'm developing an sbt launched application with custom command line interface.
The problem is that every time I want to test it, I have to remove the previously published boot directory, recompile and publish the artifacts locally, and then finally run the app and test it manually. Part of this is accomplished by running external shell scripts.
How could I make sbt do the job for me? I've already made the skeleton command for it:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked"),
    commands ++= Seq(launchApp)
  )
)

val launchApp = Command.command("launch") { state =>
  state.log.info("Re-launching app")
  state
}
Create a launcher configuration file, e.g. fqb.build.properties, in the project's main directory.
Create a script that launches the application:

#!/usr/bin/env bash
java -jar /path/to/sbt-launch.jar "$@"
Define task and command:
lazy val launcherTask = TaskKey[Unit]("launch", "Starts the application from the locally published JAR")

lazy val launchApp: Seq[Setting[_]] = Seq(
  commands += Command.command("publish-launch") { state =>
    state.log.info("Re-launching app")
    val modulesProj = modules.id
    s"$modulesProj/publishLocal" ::
      "publishLocal" ::
      launcherTask.key.label ::
      state
  },
  launcherTask := {
    "launch @fqb.build.properties" !<
  }
)
Add it as a setting to a project:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked")
  ) ++ launchApp
)

Remember to delete the old ~/.<app_name> directory when re-deploying, so the changes can take effect.