sbt-native-packager: defining multiple mainClasses in different modules (Scala)

I would like to package multiple Docker images, each with its own mainClass, so that each app runs on container startup.
lazy val `core` = project.in(file("core"))
  .enablePlugins(JavaServerAppPackaging, DockerPlugin)
  .settings(
    mainClass in Compile := Some("path/to/Core") // Doesn't work
  )

lazy val `benchmark` = project.in(file("benchmark"))
  .enablePlugins(JavaServerAppPackaging, DockerPlugin)
  .settings(
    mainClass in Compile := Some("path/to/Benchmark") // Doesn't work
  )
This does not work: the main classes are not found during the stage step.
When I define mainClass as a global parameter it works, but that way I can't build two auto-running Docker images.
Thanks for your help

I am not experienced with sbt-native-packager, but mainClass takes a fully qualified class name on the classpath, not a file path, so it must be defined as:
mainClass in (Compile, packageBin) := Some("com.bar.baz.Foo")
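Applied to the build from the question, a minimal sketch (the class names com.bar.core.Core and com.bar.benchmark.Benchmark are placeholders for the real fully qualified names):

lazy val `core` = project.in(file("core"))
  .enablePlugins(JavaServerAppPackaging, DockerPlugin)
  .settings(
    // fully qualified class name, not a file path
    mainClass in Compile := Some("com.bar.core.Core")
  )

lazy val `benchmark` = project.in(file("benchmark"))
  .enablePlugins(JavaServerAppPackaging, DockerPlugin)
  .settings(
    mainClass in Compile := Some("com.bar.benchmark.Benchmark")
  )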

Related

Set mainClass for sbt native packager Universal

I have a project that has the following build.sbt:
addCommandAlias("package", "dist")

lazy val actual = (project in file("."))
  .enablePlugins(UniversalPlugin, JavaServerAppPackaging)
  .settings(
    name := "DeployerPod",
    mainClass := Some("com.myself.executable.Runner"),
    Compile / mainClass := Some("com.myself.executable.Runner"),
    Compile / run / mainClass := Some("com.myself.utils.Pipeline"),
    Universal / mainClass := Some("com.myself.executable.Runner"),
    Universal / compile / mainClass := Some("com.myself.executable.Runner")
  )
We have a CI/CD pipeline which runs a Dockerfile.
One of its steps is sbt run, which executes the com.myself.utils.Pipeline class to handle the pipeline's prerequisites.
As one of the last sbt-based steps I also run sbt package, which via the alias above runs sbt dist. At this point, inside the extracted ZIP's bin folder, I see two BAT files corresponding to the two main classes. Unfortunately I only want the BAT for the Runner class, not the one for Pipeline.
For this I tried running sbt package -main com.myself.executable.Runner, but that failed with Not a valid command: -
Is there a way I can specify the mainClass only for this Universal plugin somehow? Because the way I've tried in my build.sbt doesn't seem to work.
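One thing worth trying (an assumption about sbt-native-packager's script generation, not confirmed in the original thread): JavaServerAppPackaging generates one start script per discovered main class, so clearing Compile / discoveredMainClasses should leave only the script for the explicitly set mainClass:

Compile / mainClass := Some("com.myself.executable.Runner")
// assumption: with no discovered main classes, only the explicit
// mainClass above gets a start script under Universal's bin folder
Compile / discoveredMainClasses := Seq()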

sbt-generated Docker container fails to package subproject

I have a multi-project build.sbt file, with projects like so:
lazy val utils = (project in file("utils"))
  .settings(
    Seq(
      publishArtifact := false
    )
  ).[...]

lazy val api = (project in file("api"))
  .dependsOn(utils)
  .settings(commonSettings: _*)
  .enablePlugins(JavaAppPackaging, DockerPlugin)
  .settings(publish := {})
  .settings(
    Seq(
      packageName in Docker := "my-api",
      dockerBaseImage := "java:8",
      mainClass in Compile := Some("com.path.to.Main"),
      publishArtifact := false,
      unmanagedJars in Compile += file("jars/somejars.jar")
    )
  )
The API is built on top of the Finch framework. I create a Docker image for the API using sbt api/docker:publishLocal and then run it locally. However, it seems the utils subproject's classes are not packaged into the final container, and as a result I am getting multiple
java.lang.ClassNotFoundException
errors. For a similar project that doesn't have a subproject dependency, everything runs smoothly and I have no problems.
Am I missing something in the plugin configuration? I thought .dependsOn() should take care of providing the dependent classes in the project's Docker image.
Answering my own question: it turns out this is the default behaviour of sbt-native-packager, or rather of sbt itself, when a dependent project has the publishArtifact := false setting.
A workaround that worked for me was changing that setting to publish / skip := true.
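A minimal sketch of the fix in the utils project (assuming its other settings stay as above):

lazy val utils = (project in file("utils"))
  .settings(
    // publishArtifact := false made sbt drop the subproject's classes
    // from the packaged image; skipping publishing does not
    publish / skip := true
  )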
More on this issue can be found here: https://github.com/sbt/sbt-native-packager/issues/1221

Multiple main classes with SBT assembly

I'm looking to create jars for AWS Lambda to run job tasks. Currently my build.sbt file looks something like this:
lazy val commonSettings = Seq(...)

lazy val core = project
  .settings(commonSettings: _*)

lazy val job = project
  .settings(commonSettings: _*)
  .dependsOn(core)

lazy val service = project
  .settings(commonSettings: _*)
  .settings(
    mainClass in assembly := Some("io.example.service.Lambda"),
    assemblyJarName in assembly := "lambda.jar"
  )
  .dependsOn(core)
Running sbt assembly assembles the service module into a jar for my API, and that works fine. The job module, however, will have multiple main classes (one per job), and when I run sbt assembly job the service module is also assembled (even though it's not depended on).
How can I configure my setup to assemble only the job module when needed, and specify individual main classes for separately assembled jars?
Set mainClass in assembly in the job project to define which main class to use, and run job/assembly to assemble just the job jar.
You will need to override the default main class at build time by setting the property explicitly:
sbt '; set mainClass in assembly := Some("another.MainClass"); job/assembly'
Not sure it's good practice, but alternatively you can define a sub-project for each job with the correct main class set:
lazy val job1 = project
  .settings(commonSettings: _*)
  .settings(
    mainClass in assembly := Some("io.example.service.Lambda"),
    assemblyJarName in assembly := "lambda.jar"
  )
  .dependsOn(core)

lazy val job2 = project
  .settings(commonSettings: _*)
  .settings(
    mainClass in assembly := Some("io.example.service.Lambda2"),
    assemblyJarName in assembly := "lambda2.jar"
  )
  .dependsOn(core)
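With that layout each jar is built independently: sbt job1/assembly produces lambda.jar and sbt job2/assembly produces lambda2.jar, without touching service.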

SBT: How to set a common Scala version for a multi-project build

I have a multi-project SBT build in IntelliJ IDEA. My SBT file in the root dir looks like this:
name := "PlayRoot"
version := "1.0"
lazy val shapeless_learn = project.in(file("shapeless_learn")).dependsOn(common)
lazy val scalaz_learn = project.in(file("scalaz_learn")).dependsOn(common)
lazy val common = project.in(file("common"))
lazy val root = project.in(file(".")).aggregate(common, shapeless_learn, scalaz_learn)
scalaVersion := "2.11.7"
Then I have folders for each of the projects: ./common, ./shapeless_learn, ./scalaz_learn, and each has its own build.sbt. But for some reason I have to put the line scalaVersion := "2.11.7" in each subproject's build.sbt file.
If I forget to do that, the build fails with the message:
Error:Unresolved dependencies: common#common_2.10;0.1-SNAPSHOT: not found
See complete log in ...
For some reason, if I do not specify that my Scala version is 2.11.7, sbt falls back to 2.10 and tries to find a common project built for 2.10, which I do not have.
I keep forgetting to add scalaVersion := "2.11.7" to newly created projects, and it keeps bugging me. I would also prefer sbt to generate a build.sbt with some default contents, instead of requiring me to remember to create it manually.
Is there any way I could set a single Scala version for all projects and sub-projects in one place? I figured I could add a separate lazy val commonSettings = Seq { scalaVersion := "2.11.7" } to the root definition, and append .settings(commonSettings) to each project definition. This works, but still doesn't look beautiful enough: I have to do it for every project definition. Is there a better way?
Is there any way I could create a template for newly created projects, so that when I just put the line lazy val newProject = ..., it puts an appropriate build.sbt file there with the contents I want?
Use
scalaVersion in ThisBuild := "2.11.7"
in the root build.sbt.
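Applied to the build above, a minimal sketch: set the version once in the root build.sbt and drop scalaVersion from every subproject's build.sbt:

name := "PlayRoot"

version := "1.0"

// scoped to ThisBuild: every project in this build picks up 2.11.7
// unless it explicitly overrides scalaVersion
scalaVersion in ThisBuild := "2.11.7"

lazy val common = project.in(file("common"))
lazy val shapeless_learn = project.in(file("shapeless_learn")).dependsOn(common)
lazy val scalaz_learn = project.in(file("scalaz_learn")).dependsOn(common)
lazy val root = project.in(file(".")).aggregate(common, shapeless_learn, scalaz_learn)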

How to (automatically) inherit settings/tasks from an sbt plugin?

I have an sbt plugin defining tasks that I would like to have available in a Play project, or another sbt project in general. While it might not be best practice, I'd prefer to have these tasks automatically available in the Play project, so that all I need to do is add the sbt plugin via plugins.sbt. But before I can even get that far, I'm having trouble importing tasks at all.
If the plugin's build.sbt is as follows:
name := "sbt-task-test"
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.10.3"
scalaBinaryVersion := "2.10"
organization := "com.example"
sbtPlugin := true
lazy val testTask = taskKey[Unit]("Run a test task.")
testTask := {
  println("Running test task..")
}
How can I make testTask available in another sbt project's build.sbt or Build.scala? I've tried following this example to no avail.
My end goal is to use tasks defined like in this blog post, but I'd like to at least get some simpler examples working first. In this case, I'd be adding something like registerTask("testTask", "com.example.tasks.Test", "Run a test task") to build.sbt, but I have the same problem as above.
First, you should put your task definition in the plugin's source code, not in its build.sbt. So try this:
build.sbt of the plugin (it defines only how to build the plugin itself):
name := "sbt-task-test"
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.10.3"
// scalaBinaryVersion := "2.10" // better not to play with this
organization := "com.example"
sbtPlugin := true
src/main/scala/MyPlugin.scala (in the plugin project)
import sbt._

object MyPlugin extends Plugin {
  lazy val testTask = taskKey[Unit]("Run a test task.")

  override def settings = Seq(
    testTask := { println("Running test task..") }
  )
}
Overriding settings adds the task's definition to every project's scope automatically.
Now you should build and publish the plugin (locally for example) using sbt publishLocal.
Then in the project, where you want to use this plugin:
project/plugins.sbt should contain:
addSbtPlugin("com.example" % "sbt-task-test" % "1.0.0-SNAPSHOT")
This adds the testTask key and its definition to the project's scope automatically, so that in the project's directory you can run:
sbt testTask
and it will print Running test task..
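As an aside (not part of the original answer): on sbt 0.13.5 and later the Plugin trait is superseded by AutoPlugin, which gives the same automatic behaviour through a trigger. A minimal sketch of the same task as an auto-triggered plugin:

import sbt._

object MyAutoPlugin extends AutoPlugin {
  // activate in every project as soon as the plugin is on the classpath
  override def trigger = allRequirements

  object autoImport {
    lazy val testTask = taskKey[Unit]("Run a test task.")
  }
  import autoImport._

  override def projectSettings = Seq(
    testTask := { println("Running test task..") }
  )
}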