I have an sbt project. I have defined the packageName as follows in the build.sbt
packageName in Universal := "project"
Is there a way to override packageName when running sbt dist from the command line?
Something like:
sbt 'set packageName := "newName"' publish # or
sbt 'set packageName in Universal := "newName"' publish
?
Custom commands can be used to modify the build state, like so:
commands += Command.command("distWithPackageNameOverride") { state =>
"""set packageName in Universal := "foo"""" :: "dist" :: state
}
where executing sbt distWithPackageNameOverride should output foo.zip under
yourapp/target/universal/foo.zip
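For reference, the inline set form from the question should also work, as long as the setting is scoped to Universal (an unscoped set packageName := ... will not affect the Universal package):

```shell
sbt 'set packageName in Universal := "newName"' dist
```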
I have a project that has the following build.sbt:
addCommandAlias("package", "dist")
lazy val actual = (project in file("."))
.enablePlugins(UniversalPlugin, JavaServerAppPackaging)
.settings(
name := "DeployerPod",
mainClass := Some("com.myself.executable.Runner"),
Compile / mainClass := Some("com.myself.executable.Runner"),
Compile / run / mainClass := Some("com.myself.utils.Pipeline"),
Universal / mainClass := Some("com.myself.executable.Runner"),
Universal / compile / mainClass := Some("com.myself.executable.Runner"),
)
We have a CICD which runs a Dockerfile.
There I have sbt run as one of the steps, which executes the com.myself.utils.Pipeline class to do the prerequisites for the pipeline.
As one of the last sbt-based steps I'm also running sbt package, which eventually runs sbt dist. At this point, inside the extracted ZIP's bin folder, I see two BAT files corresponding to the two main classes. Unfortunately I only want the Runner BAT, not the Pipeline BAT.
For this I tried running sbt package -main com.myself.executable.Runner but that failed saying Not a valid command: -
Is there a way I can specify the mainClass only for this Universal plugin somehow? Because the way I've tried in my build.sbt doesn't seem to work.
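One approach, not verified against this exact build but based on sbt-native-packager's behavior of generating one start script per discovered main class, is to clear discoveredMainClasses so that only the explicit mainClass gets a script:

```scala
// Sketch: keep only the Runner script in the Universal package.
// sbt-native-packager generates one start script per discovered main
// class, so discovering none leaves just the explicit mainClass.
Compile / mainClass := Some("com.myself.executable.Runner"),
Compile / discoveredMainClasses := Seq.empty,
```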
I have the following project structure:
my-project/
  build.sbt
  ...
  app/
    ...
  config/
    dev/
      file1.properties
      file2.properties
    uat/
      file1.properties
      file2.properties
    prod/
      file1.properties
      file2.properties
The module app contains some scala source code and produces a plain jar file.
The problem is with the config module. What I need to do is to create some configuration in build.sbt that will take each folder from config and put its content into a separate zip file.
The result should be as follows:
my-project-config-dev-1.1.zip ~>
  file1.properties
  file2.properties
my-project-config-uat-1.1.zip ~>
  file1.properties
  file2.properties
my-project-config-prod-1.1.zip ~>
  file1.properties
  file2.properties
1.1 is an arbitrary version of the project.
The configuration should work in such way that when I add new environments and new configuration files, more zip files will be produced. In another task all these zip files should be published to Nexus.
Any suggestions?
I managed to resolve the problem by creating a module config and then a separate sub-module for each environment, so the project structure looks exactly as described in the question. It all comes down to proper configuration in build.sbt.
Below is the general idea of what I've done to achieve what I wanted.
lazy val config = (project in file("config")).
enablePlugins(UniversalPlugin).
settings(
name := "my-project",
version := "1.1",
publish in Universal := { }, // disable publishing of config module
publishLocal in Universal := { }
).
aggregate(configDev, configUat, configProd)
lazy val environment = settingKey[String]("Target environment")
lazy val commonSettings = makeDeploymentSettings(Universal, packageBin in Universal, "zip") ++ Seq( // set package format
name := "my-project-config",
version := "1.1",
environment := baseDirectory.value.getName, // set 'environment' variable based on a configuration folder name
topLevelDirectory := None, // set top level directory for each package
packageName in Universal := s"my-project-config-${environment.value}-${version.value}", // set package name (example: my-project-config-dev-1.1)
mappings in Universal ++= contentOf(baseDirectory.value).filterNot { case (_, path) => // do not include target folder
path contains "target"
}
)
lazy val configDev = (project in file("config/dev")).enablePlugins(UniversalPlugin).settings(commonSettings: _*)
lazy val configUat = (project in file("config/uat")).enablePlugins(UniversalPlugin).settings(commonSettings: _*)
lazy val configProd = (project in file("config/prod")).enablePlugins(UniversalPlugin).settings(commonSettings: _*)
UniversalPlugin is highly configurable, although not all configuration options may be clear at first. I suggest reading its docs and looking at the source code.
To actually package the artifacts, the following command has to be issued:
sbt config/universal:packageBin
Publishing:
sbt config/universal:publish
As can be seen above, adding new environments is easy: only a new folder and one line in build.sbt need to be added.
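For example, adding a hypothetical qa environment would only require the new config/qa folder plus one more sub-project following the same pattern (and adding it to the config project's aggregate list):

```scala
lazy val configQa = (project in file("config/qa"))
  .enablePlugins(UniversalPlugin)
  .settings(commonSettings: _*)
```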
I am using TeamCity to run a bash script that is utilizing SBT Native Packager to publish an image to Docker. The sbt portion of the bash script looks something like this:
sbt -DdockerRepository=$repo -DpackageName=$packageName -D myproject/docker:publish
I want to pass on the TeamCity build number as a version number to my package. Today I specify the version number manually in settings in build.sbt:
settings(
version := "0.20",
....,
dockerBaseImage := "example.com:5000/linux/java8:latest",
dockerRepository in Docker := Some("example.com/myoldrepo"),
dockerUpdateLatest := true
)
I want to be able to do it like this:
activator -Dversion=0.21 -DpackageName=myproject -D myproject/docker:publish
but this does not seem to work. Yet overriding the dockerRepository like I do above is working.
How can I pass my desired version number into SBT from the command line/TeamCity?
You could set version before publish:
sbt 'set version := "1.0"' docker:publish
Try something like this:
val myVersion = util.Properties.propOrNone("version").getOrElse("0.20")
val myDockerBaseImage = util.Properties.propOrNone("dockerBaseImage")
  .getOrElse("example.com:5000/linux/java8:latest")
lazy val myProject = Project("myProject",file("path")).settings(
version := myVersion,
dockerBaseImage := myDockerBaseImage,
....,
dockerRepository in Docker := Some("example.com/myoldrepo"),
dockerUpdateLatest := true
)
And then call it (depends on your sbt installation):
SBT_OPTS="-Dversion=0.21" sbt
sbt -Dversion=0.21
activator -Dversion=0.21
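The lookup used above can be checked in plain Scala; a minimal demo (VersionDemo is a name made up for illustration):

```scala
// Minimal demo of the lookup used above: util.Properties.propOrNone
// returns Some(value) when the JVM system property is set, None otherwise.
object VersionDemo {
  def resolveVersion(default: String): String =
    util.Properties.propOrNone("version").getOrElse(default)

  def main(args: Array[String]): Unit = {
    System.clearProperty("version")
    println(resolveVersion("0.20")) // property absent: falls back to default
    System.setProperty("version", "0.21")
    println(resolveVersion("0.20")) // property set: command-line value wins
  }
}
```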
I have a multi-project Build.scala. Is there a way to place all jars generated by sbt-assembly in the root target directory?
For example, consider the following:
lazy val root = Project("root", file(".")).aggregate(hello)
lazy val hello = Project(id = "hello", base = file("hello"))
.settings(assemblySettings: _*)
As is, if I run sbt assembly, hello.jar is placed in hello/target/<scala-version>/. Is it possible instead to place it in target/<scala-version>/ at the root?
I know it's possible to specify the outputPath I want by adding the following setting:
target in assembly := file("target/scala-2.11/")
Is there any way to make this more generic? For example, so it is not necessary to manually specify the scala version?
assemblyOutputPath in assembly := file("yourpath")
A small improvement on this answer: if you need to retain the file name generated by the assembly plugin, do it as below:
assembly / assemblyOutputPath := file(s"/path/to/jar/${(assembly/assemblyJarName).value}")
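To avoid hardcoding the Scala version, as the question asks, the path can be derived from the build's own settings instead of a literal; a sketch, assuming the goal is the root build's target directory:

```scala
// In each aggregated subproject's settings: write the assembly jar into
// the root build's target/scala-<binary version> directory.
assembly / assemblyOutputPath :=
  (ThisBuild / baseDirectory).value / "target" /
    s"scala-${scalaBinaryVersion.value}" / (assembly / assemblyJarName).value
```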
You can set assemblyOutputPath via cmd:
sbt 'set assemblyOutputPath in assembly := new File("/path/to/package.jar")' assembly
In case you need to set multiple options, just separate the set commands with spaces:
sbt 'set test in assembly := {}' 'set assemblyOutputPath in assembly := new File("/path/to/package.jar")' assembly
I have an sbt plugin defining tasks that I would like to have available in a Play project, or another sbt project in general. While it might not be best practice, I'd prefer to have these tasks automatically available in the Play project, so that all I need to do is add the sbt plugin via plugins.sbt. But before I can even get that far, I'm having trouble importing tasks at all.
If the plugin's build.sbt is as follows:
name := "sbt-task-test"
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.10.3"
scalaBinaryVersion := "2.10"
organization := "com.example"
sbtPlugin := true
lazy val testTask = taskKey[Unit]("Run a test task.")
testTask := {
println("Running test task..")
}
How can I make testTask available in another sbt project's build.sbt or Build.scala? I've tried following this example to no avail.
My end goal is to use tasks defined like in this blog post, but I'd like to at least get some simpler examples working first. In this case, I'd be adding something like registerTask("testTask", "com.example.tasks.Test", "Run a test task") to build.sbt, however I have the same problem as above.
First, you should put your task definition in the source of the plugin, not the build.sbt. So try this:
build.sbt of the plugin (it defines only how to build the plugin):
name := "sbt-task-test"
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.10.3"
// scalaBinaryVersion := "2.10" // better not to play with this
organization := "com.example"
sbtPlugin := true
src/main/scala/MyPlugin.scala (in the plugin project)
import sbt._
object MyPlugin extends Plugin {
lazy val testTask = taskKey[Unit]("Run a test task.")
override def settings = Seq(
testTask := { println("Running test task..") }
)
}
Overriding settings adds the task definition to the scope of every project that uses the plugin.
Now you should build and publish the plugin (locally for example) using sbt publishLocal.
Then in the project, where you want to use this plugin:
project/plugins.sbt should contain:
addSbtPlugin("com.example" % "sbt-task-test" % "1.0.0-SNAPSHOT")
This will add the testTask key and its definition to the scope automatically, so that in the project's directory you can run:
sbt testTask
and it will print Running test task..
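Note that sbt.Plugin is the old, sbt 0.13-era extension point; on current sbt versions the same plugin would be written as an AutoPlugin, roughly like this (a sketch, not tested against a specific sbt release):

```scala
import sbt._

object MyPlugin extends AutoPlugin {
  // allRequirements makes the settings apply to every project
  // automatically, so adding the plugin in plugins.sbt is enough.
  override def trigger = allRequirements

  object autoImport {
    lazy val testTask = taskKey[Unit]("Run a test task.")
  }
  import autoImport._

  override def projectSettings = Seq(
    testTask := { println("Running test task..") }
  )
}
```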