Is it possible to re-launch and test an xsbti.AppMain-derived application from sbt?

I'm developing an sbt-launched application with a custom command-line interface.
The problem is that every time I want to test it, I have to remove the previously published boot directory, recompile and publish the artefacts locally, and then finally run the app and test it manually. Part of this is accomplished by running external shell scripts.
How could I make sbt do the job for me? I've already made a skeleton command for it:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked"),
    commands ++= Seq(launchApp))
)

val launchApp = Command.command("launch") { state =>
  state.log.info("Re-launching app")
  state
}

Create a launcher configuration file, e.g. fqb.build.properties, in the project's main directory.
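A hedged sketch of what such a launcher configuration might contain (the organization, name, version, main class and boot directory below are illustrative placeholders; the class is the xsbti.AppMain implementation being launched):

[scala]
  version: 2.11.8
[app]
  org: com.example
  name: app
  version: 0.1.0-SNAPSHOT
  class: com.example.Main
  cross-versioned: binary
[repositories]
  local
  maven-central
[boot]
  directory: ${user.home}/.app/boot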
Create a script that launches the application:
#!/usr/bin/env bash
java -jar /path/to/sbt-launch.jar "$@"
Define task and command:
lazy val launcherTask = TaskKey[Unit]("launch", "Starts the application from the locally published JAR")

lazy val launchApp: Seq[Setting[_]] = Seq(
  commands += Command.command("publish-launch") { state =>
    state.log.info("Re-launching app")
    val modulesProj = modules.id
    s"$modulesProj/publishLocal" ::
      "publishLocal" ::
      launcherTask.key.label ::
      state
  },
  launcherTask := {
    "launch @fqb.build.properties" !<
  }
)
Add it as a setting to a project:
lazy val root = Project(
  id = "app",
  base = file("."),
  settings = buildSettings ++ Seq(
    resolvers := rtResolvers,
    libraryDependencies ++= libs,
    scalacOptions ++= Seq("-encoding", "UTF-8", "-deprecation", "-unchecked")
  ) ++ launchApp
)
Remember to delete the old ~/.<app_name> directory when re-deploying, so the changes take effect.
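A hedged sketch of that clean-up step (the path is a placeholder for whatever [boot] directory the launcher configuration points at):

# wipe the launcher's boot cache so the freshly published artifacts are resolved again
rm -rf ~/.<app_name>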

Related

how to run scala sbt-native-packager for an appJS/appJVM cross-build project

The sbt-native-packager can make a zip file with all dependencies and a script to run:
$ sbt universal:packageBin
I have a Scala web application, using a cross-build (appJS for the front-end and appJVM for the back-end).
How do I run this packager for the appJVM?
I've tried the following, but it does not accept the command:
$ sbt appJVM/universal:packageBin
Here is the build.sbt, from https://www.scala-js.org/doc/project/cross-build.html:
...
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := "0.1-SNAPSHOT"
  ).
  jvmSettings(
    // Add JVM-specific settings here
  ).
  jsSettings(
    // Add JS-specific settings here
  )

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
How do I run this packager for the appJVM?
And how do I include the file generated by sbt appJS/fullOptJS?
And some other static files?
Update with Ivan's response:
build.sbt:
import sbtcrossproject.CrossPlugin.autoImport.{crossProject, CrossType}

val sharedSettings = Seq(
  scalaVersion := "2.12.8",
)

lazy val app =
  crossProject(JSPlatform, JVMPlatform)
    .in(file("."))
    .settings(sharedSettings)
    .jsSettings(
    )
    .jvmSettings(
      libraryDependencies ++= Seq(
        "com.typesafe.akka" %% "akka-http" % "10.1.9"
      ),
    )

lazy val backend = project
  .enablePlugins(UniversalPlugin)
  .enablePlugins(JavaAppPackaging)
  .dependsOn(app.jvm)
  .settings(
    mainClass in Compile := Some("com.example.EchoServer")
  )

lazy val frontend = project
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(app.js)

backend
  .settings(
    Seq(
      resourceGenerators in Compile += Def.task {
        Seq(
          (fullOptJS in Compile in frontend).value,
          (fastOptJS in Compile in frontend).value
        ).map { js =>
          val resource = (resourceManaged in Compile).value / "public" / "assets" / js.data.name
          IO.write(resource, IO.read(js.data))
          resource
        }
      }.taskValue
    )
  )
and run:
$ sbt backend/universal:packageBin
34: error: type mismatch;
found : Seq[sbt.Def.Setting[Seq[sbt.Task[Seq[java.io.File]]]]]
required: Int
Seq(
^
[error] Type error in expression
I used the following structure.
Define a shared project that needs to be cross-compiled for JS and Scala.
lazy val shared = CrossPlugin.autoImport
  .crossProject(JSPlatform, JVMPlatform)
  .crossType(CrossType.Pure)
  .jvmSettings(???)
  .jsSettings(???)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js
Add a project that contains the Main class.
lazy val backend = project
  .enablePlugins(UniversalPlugin)
  .enablePlugins(JavaAppPackaging)
  .dependsOn(sharedJvm)
Add a web project containing the web-related code.
lazy val web = project
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(sharedJs)
And finally, attach the resources from web, compiled into JS, to backend.
backend
  .settings(
    Seq(
      resourceGenerators in Compile += Def.task {
        Seq(
          (fullOptJS in Compile in web).value,
          (fastOptJS in Compile in web).value
        ).map { js =>
          val resource = (resourceManaged in Compile).value / "public" / "assets" / js.data.name
          IO.write(resource, IO.read(js.data))
          resource
        }
      }.taskValue
    )
  )
The main class needs to serve the compiled JS from public/assets, as configured in sbt, and any other web resources from its class path.
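A hedged sketch of such a main class, assuming the akka-http dependency from the build above (the object name follows the mainClass in the question; the port and the index.html route are illustrative):

package com.example

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer

object EchoServer extends App {
  implicit val system: ActorSystem    = ActorSystem("backend")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val route =
    pathPrefix("assets") {
      // serves the JS files that the resource generator copied under public/assets
      getFromResourceDirectory("public/assets")
    } ~
      pathEndOrSingleSlash {
        // any other static resource shipped on the class path (illustrative)
        getFromResource("public/index.html")
      }

  Http().bindAndHandle(route, "0.0.0.0", 8080)
}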

How can I set javaOptions in a custom sbt command

I am trying to add a custom command to SBT (using version 1.2.8 at the moment) to start my application with different javaOptions. I managed to create 3 separate tasks to accomplish this for three of the startup possibilities, but now I want to generalize this to allow any startup parameter.
So far I managed to create the following build.sbt, but the console output shows that sbt runNode node3 still uses the original classpath as set on line 13 (run / javaOptions ++= Seq(...)).
The run / javaOptions += s"-Djava.library.path=./target/native/$arg" on line 46 is apparently ignored.
import com.typesafe.sbt.SbtMultiJvm.multiJvmSettings
import com.typesafe.sbt.SbtMultiJvm.MultiJvmKeys.MultiJvm
import Dependencies._

lazy val `akka-crdt-features` = project
  .in(file("."))
  .settings(multiJvmSettings: _*)
  .settings(
    organization := "nl.about42.akkamavericks",
    scalaVersion := "2.12.8",
    Compile / scalacOptions ++= Seq("-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls", "-Xlint"),
    Compile / javacOptions ++= Seq("-Xlint:unchecked", "-Xlint:deprecation"),
    run / javaOptions ++= Seq("-Xms128m", "-Xmx1024m", "-Djava.library.path=./target/native"),
    //run / javaOptions ++= Seq("-agentlib:hprof=heap=dump,format=b"),
    libraryDependencies ++= akkaDependencies ++ otherDependencies,
    run / fork := true,
    Compile / run / mainClass := Some("nl.about42.akkamavericks.cluster.ClusterCrdtApp"),
    // disable parallel tests
    Test / parallelExecution := false,
    licenses := Seq(("CC BY 4.0", url("https://creativecommons.org/licenses/by/4.0/"))),
    commands ++= Seq(runNodeCommand),
    Global / cancelable := true
  )
  .configs(MultiJvm)

// setup commands to run each individual node, using a separate folder for the extracted libsigar
lazy val runNode1 = taskKey[Unit]("Run node 1")
lazy val runNode2 = taskKey[Unit]("Run node 2")
lazy val runNode3 = taskKey[Unit]("Run node 3")

runNode1 / fork := true
runNode2 / fork := true
runNode3 / fork := true

runNode1 / javaOptions += "-Djava.library.path=./target/native/node1"
runNode2 / javaOptions += "-Djava.library.path=./target/native/node2"
runNode3 / javaOptions += "-Djava.library.path=./target/native/node3"

fullRunTask(runNode1, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node1")
fullRunTask(runNode2, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node2")
fullRunTask(runNode3, Compile, "nl.about42.akkamavericks.cluster.ClusterCrdtApp", "node3")

// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  run / javaOptions += s"-Djava.library.path=./target/native/$arg"
  val runCommand: Exec = Exec.apply(s"run $arg", state.source)
  state.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}
val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
The output of sbt runNode node3 shows (relevant lines):
[error] no libsigar-amd64-linux.so in java.library.path: [./target/native]
[error] org.hyperic.sigar.SigarException: no libsigar-amd64-linux.so in java.library.path: [./target/native]
I expect it to mention ./target/native/node3.
My goal is to have just the sbt command definition, so I can call runNode [anyNodeName] and have SBT start my application with the appropriate classpath setting and startup argument.
Update:
I partially succeeded with the following:
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val runCommand: Exec = Exec.apply(s"run $arg", stateWithNewOptions.source)
  stateWithNewOptions.copy(
    remainingCommands = runCommand +: state.remainingCommands
  )
}
But that leaves the classpath set to the latest run (does not revert back to the default).
With the help of Mario Galic (https://stackoverflow.com/a/54488121/2037054), who answered "Conditional scalacSettings / settingKey", I managed to get it working:
// setup command to start a single node, using separate folder for the extracted libsigar
// assumes sane node names that can be used as folder names
val runNodeAction: (State, String) => State = { (state, arg) =>
  val stateWithNewOptions = Project.extract(state).appendWithSession(
    Seq(
      run / javaOptions += s"-Djava.library.path=./target/native/$arg"
    ),
    state
  )
  val (s, _) = Project.extract(stateWithNewOptions).runInputTask(Compile / run, s" $arg", stateWithNewOptions)
  s
}
val runNodeCommand: Command = Command.single("runNode")(runNodeAction)
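For reference, a hedged usage note: from the operating-system shell, the command and its argument have to be quoted so that sbt parses them as a single command, while at the interactive sbt prompt they can be typed directly:

# quote so "runNode node3" reaches sbt as one command
sbt "runNode node3"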

How to run the main project with all sub-projects in Scala sbt

I'm using a multi-project build in sbt to create a simple microservice architecture. I defined build.sbt as:
addCommandAlias("rebuild", ";clean; compile; package")

lazy val oc = Project(id = "oc",
  base = file("."),
  settings = commonSettings)
  .aggregate(
    persistence,
    core,
    configuration,
    integration)
  .dependsOn(
    persistence,
    core,
    configuration,
    integration)
  .configs(MultiJvm)

lazy val persistence = Project(...)
lazy val core = Project(...)
lazy val log = Project(...)
lazy val configuration = Project(...)
lazy val integration = Project(...)

lazy val commonSettings = Defaults.coreDefaultSettings ++
  basicSettings ++
  net.virtualvoid.sbt.graph.Plugin.graphSettings

lazy val basicSettings = Seq(
  version := PROJECT_VERSION,
  organization := ORGANIZATION,
  scalaVersion := SCALA_VERSION,
  scalacOptions in Compile ++= Seq("-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls", "-Xlint", "-encoding", "utf8"),
  javacOptions in Compile ++= Seq("-Xlint:unchecked", "-Xlint:deprecation"),
  javaOptions in run ++= Seq("-Xms128m", "-Xmx1024m"),
  libraryDependencies ++= Seq( ...
  ),
  fork in run := true,
  // disable parallel tests
  parallelExecution in Test := false,
  licenses := Seq(("CC0", url("http://creativecommons.org/publicdomain/zero/1.0"))),
  fork in Test := true
)
For a single sub-project I can easily build the project, but I cannot build all projects at once to check that the system works as a whole. Any help from the experts on building the whole system would be appreciated.
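As a hedged note on the build above: because oc aggregates the sub-projects, a task run on oc is also run on every aggregated project, so building everything at once should come down to:

sbt compile
# or, using the alias defined above (clean, compile, package across the aggregate)
sbt rebuild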

Compile with different settings in different commands

I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands which will compile only some of the files inside this project. The testScalalib command added above, for instance, is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._
  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(_ filter (f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations, like Compile and Test, with the appropriate settings values to suit your needs. Read Scopes in the official sbt documentation and/or "How to define another compilation scope in SBT?".
I would not create additional commands; I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
This is how you can do it using sbt 0.13.2 and Build.scala (you could of course do the same in build.sbt, or with an older sbt version and different syntax):
import sbt._
import Keys._

object MyBuild extends Build {

  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now, when you call compile everything will get compiled, but when you call api:compile only the sources matching the filter predicate are compiled.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or an includeFilter.
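For instance, a hedged sketch of that filter route (sbt 0.13 syntax; instead of overriding unmanagedSources directly, this restricts which files it collects in the Api configuration):

includeFilter in (Api, unmanagedSources) := new SimpleFileFilter(f =>
  f.getName.endsWith(".scala") &&
    !f.getAbsolutePath.contains("scalalibrary/") &&
    f.getName != "Typers.scala"
)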

include upstart scripts when running debian:packageBin

I want to start my Scala app as a background service on an Ubuntu machine. I'm currently trying to figure out how to use the JavaServerAppPackaging settings in my package task.
Any suggestion as to what I have to add, and where, to include the upstart script in my .deb file?
This is my project definition, including oneJar and Debian packaging.
lazy val root = Project(id = appName, base = file("."), settings = SbtOneJar.oneJarSettings ++ packageSettings ++ allSettings ++ Project.defaultSettings)

lazy val allSettings = Seq(
  mainClass in SbtOneJar.oneJar := Some("Kernel"),
  resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases",
  resolvers += "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
  libraryDependencies ++= dependencies)

lazy val packageSettings = JavaServerAppPackaging.settings ++ packagerSettings ++ Seq(
  version := appVersion,
  packageSummary := appName,
  packageDescription := appName,
  maintainer := appAuthor,
  mappings in Universal += {
    SbtOneJar.oneJar.value -> jarFsPath
  },
  linuxPackageMappings in Debian <+= (SbtOneJar.oneJar) map { jar: File =>
    (packageMapping(jar -> jarFsPath) withUser unixUser withGroup unixGroup withPerms "0755")
  },
  debianPackageDependencies in Debian ++= Seq("openjdk-7-jre-headless"))
The settings should include the packageArchetype.java_server value, e.g.:
lazy val packageSettings = packageArchetype.java_server ++ Seq(
  /* ... */
)
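A hedged usage sketch, assuming the java_server archetype is wired in as above:

# builds the .deb; with the java_server archetype the package also carries the
# service startup scripts (upstart on the Ubuntu releases of that era)
sbt debian:packageBin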