I have the following project structure:
my-project/
  build.sbt
  ...
  app/
    ...
  config/
    dev/
      file1.properties
      file2.properties
    uat/
      file1.properties
      file2.properties
    prod/
      file1.properties
      file2.properties
The module app contains some Scala source code and produces a plain jar file.
The problem is with the config module. What I need is some configuration in build.sbt that will take each folder under config and put its content into a separate zip file.
The result should be as follows:
my-project-config-dev-1.1.zip ~>
  file1.properties
  file2.properties
my-project-config-uat-1.1.zip ~>
  file1.properties
  file2.properties
my-project-config-prod-1.1.zip ~>
  file1.properties
  file2.properties
1.1 is an arbitrary version of the project.
The configuration should work in such way that when I add new environments and new configuration files, more zip files will be produced. In another task all these zip files should be published to Nexus.
Any suggestions?
I managed to resolve the problem by creating a module config and then a separate sub-module for each environment, so the project structure looks exactly as described in the question. It all comes down now to proper configuration in build.sbt.
Below is the general idea of what I've done to achieve what I wanted.
lazy val config = (project in file("config")).
  enablePlugins(UniversalPlugin).
  settings(
    name := "my-project",
    version := "1.1",
    publish in Universal := { }, // disable publishing of config module
    publishLocal in Universal := { }
  ).
  aggregate(configDev, configUat, configProd)
lazy val environment = settingKey[String]("Target environment")

lazy val commonSettings = makeDeploymentSettings(Universal, packageBin in Universal, "zip") ++ Seq( // set package format
  name := "my-project-config",
  version := "1.1",
  environment := baseDirectory.value.getName, // set 'environment' variable based on a configuration folder name
  topLevelDirectory := None, // set top level directory for each package
  packageName in Universal := s"my-project-config-${environment.value}-${version.value}", // set package name (example: my-project-config-dev-1.1)
  mappings in Universal ++= contentOf(baseDirectory.value).filterNot { case (_, path) => // do not include target folder
    path contains "target"
  }
)

lazy val configDev = (project in file("config/dev")).enablePlugins(UniversalPlugin).settings(commonSettings: _*)
lazy val configUat = (project in file("config/uat")).enablePlugins(UniversalPlugin).settings(commonSettings: _*)
lazy val configProd = (project in file("config/prod")).enablePlugins(UniversalPlugin).settings(commonSettings: _*)
UniversalPlugin is highly configurable, although not all configuration options may be clear at first. I suggest reading its docs and looking at the source code.
To actually package the artifacts, the following command has to be issued:
sbt config/universal:packageBin
Publishing:
sbt config/universal:publish
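For the Nexus part, publishing needs a publishTo destination configured on the config subprojects; a minimal sketch, assuming a hypothetical Nexus URL and a credentials file (both placeholders to adapt):

```scala
// hypothetical Nexus destination; the URL and credentials path are placeholders
publishTo := Some("Nexus Releases" at "https://nexus.example.com/repository/maven-releases/")
credentials += Credentials(Path.userHome / ".sbt" / ".credentials")
```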
As can be seen above, adding new environments is very easy - only a new folder and one line in build.sbt need to be added.
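For example, a hypothetical staging environment would look like this (the config/staging folder must exist, and configStaging also has to be listed in the aggregate(...) call of the config project):

```scala
// hypothetical new environment, reusing the commonSettings defined above
lazy val configStaging = (project in file("config/staging")).enablePlugins(UniversalPlugin).settings(commonSettings: _*)
```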
I'm trying to create a relatively simple sbt plugin to wrap the grpc-swagger artifact.
Therefore, I've created a project with the following structure:
projectDir/
  build.sbt
  lib/grpc-swagger.jar  <- the artifact I've downloaded
  src/...
where build.sbt looks like the following:
ThisBuild / version := "0.0.1-SNAPSHOT"
ThisBuild / organization := "org.testPlugin"
ThisBuild / organizationName := "testPlugin"
lazy val root = (project in file("."))
  .enablePlugins(SbtPlugin)
  .settings(name := "grpc-swagger-test-plugin")
According to the sbt docs, that's all I have to do in order to include an unmanaged dependency, that is:
create a lib folder;
store the artifact in there;
However, when I execute sbt compile publishLocal, the published plugin lacks that external artifact.
So far I've tried to:
set the exportJars := true flag
add Compile / unmanagedJars += file("lib/grpc-swagger.jar") (also with variations of the path)
manually fiddle with libraryDependencies using the from file("lib/grpc-swagger.jar") specifier
but none of these has worked so far.
So how am I supposed to add an external artifact to an sbt plugin?
The proper solution to this problem is to publish the grpc-swagger library. If for whatever reason this can't be done from that library's build system, you can do it with sbt. Just add a simple subproject whose only job it is to publish that jar. It should work like so:
...
lazy val `grpc-swagger` = (project in file("."))
  .settings(
    name := "grpc-swagger",
    Compile / packageBin := baseDirectory.value / "lib" / "grpc-swagger.jar",
    // maybe other settings, such as grpc-swagger's libraryDependencies
  )
lazy val root = (project in file("."))
  .enablePlugins(SbtPlugin)
  .settings(name := "grpc-swagger-test-plugin")
  .dependsOn(`grpc-swagger`)
...
The pom file generated for the root project should now specify a dependency on grpc-swagger, and running the publish task in the grpc-swagger project will publish that jar along with a pom file.
That should be enough to make things work, but honestly, it's still a hack. The proper solution is to fix grpc-swagger's build system so you can publish an artifact from there and then just use it via libraryDependencies.
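Once grpc-swagger has been published for real, the consuming side reduces to an ordinary managed dependency; the coordinates below are purely illustrative assumptions and must match whatever the fixed build actually publishes:

```scala
// build.sbt of a consuming project - organization and version are illustrative
libraryDependencies += "io.grpc-swagger" % "grpc-swagger" % "1.0.0"
```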
Using sbt 0.13.5, when opening the project in IntelliJ, there is a warning message
~\myproject\project\Build.scala:5: trait Build in package sbt is
deprecated: Use .sbt format instead
The content of the Build.scala is
import sbt._

object MyBuild extends Build {
  lazy val root = Project("MyProject", file("."))
    .configs(Configs.all: _*)
    .settings(Testing.settings ++ Docs.settings: _*)
}
The "Appendix: .scala build definition" section of the sbt documentation is rather overwhelming.
How do I merge my existing Build.scala into build.sbt? I would appreciate any pointers to docs/tutorials/examples.
Rename Build.scala to build.sbt and move it up one directory level, so it's at the top rather than inside the project directory.
Then strip out the beginning and end, leaving:
lazy val root = Project("MyProject", file("."))
  .configs(Configs.all: _*)
  .settings(Testing.settings ++ Docs.settings: _*)
That's the basics.
Then if you want to add more settings, for example:
lazy val root = Project("MyProject", file("."))
  .configs(Configs.all: _*)
  .settings(
    Testing.settings,
    Docs.settings,
    name := "MyApp",
    scalaVersion := "2.11.8"
  )
You don't need the : _* suffix on sequences of settings anymore as of sbt 0.13.13; older versions required it.
The migration guide in the official doc is here: http://www.scala-sbt.org/0.13/docs/Migrating-from-sbt-012x.html#Migrating+from+the+Build+trait
I'm developing a library that includes an sbt plugin. Naturally, I'm using sbt to build this (multi-project) library. My (simplified) project looks as follows:
myProject/                 # Top level of library
-> models                  # One project in the multi-project sbt build.
   -> src/main/scala/...   # Defines common models for both sbt-plugin and framework
-> sbt-plugin              # The sbt plugin build
   -> src/main/scala/...
-> framework               # The framework. Ideally, the sbt plugin is run as part of
   -> src/main/scala/...   # compiling this directory.
-> project/                # Multi-project build configuration
Is there a way to have the sbt-plugin defined in myProject/sbt-plugin be hooked into the build for myProject/framework all in a unified build?
Note: similar (but simpler) question: How to develop sbt plugin in multi-project build with projects that use it?
Is there a way to have the sbt-plugin defined in myProject/sbt-plugin be hooked into the build for myProject/framework all in a unified build?
I have a working example on Github eed3si9n/plugin-bootstrap. It's not super pretty, but it kind of works. We can take advantage of the fact that sbt is recursive.
The project directory is another build inside your build, which knows how to build your build. To distinguish the builds, we sometimes use the term proper build to refer to your build, and meta-build to refer to the build in project. The projects inside the metabuild can do anything any other project can do. Your build definition is an sbt project.
By extension, we can think of the sbt plugins to be library- or inter-project dependencies to the root project of your metabuild.
meta build definition (project/plugins.sbt)
In this example, think of the metabuild as a parallel universe or shadow world that has the same multi-project structure as the proper build (root, model, sbt-plugin).
To reuse the source code from the model and sbt-plugin subprojects in the proper build, we can re-create the multi-project build in the metabuild. This way we don't need to get into a circular dependency.
addSbtPlugin("com.eed3si9n" % "sbt-doge" % "0.1.5")

lazy val metaroot = (project in file(".")).
  dependsOn(metaSbtSomething)

lazy val metaModel = (project in file("model")).
  settings(
    sbtPlugin := true,
    scalaVersion := "2.10.6",
    unmanagedSourceDirectories in Compile :=
      mirrorScalaSource((baseDirectory in ThisBuild).value.getParentFile / "model")
  )

lazy val metaSbtSomething = (project in file("sbt-plugin")).
  dependsOn(metaModel).
  settings(
    sbtPlugin := true,
    scalaVersion := "2.10.6",
    unmanagedSourceDirectories in Compile :=
      mirrorScalaSource((baseDirectory in ThisBuild).value.getParentFile / "sbt-plugin")
  )

def mirrorScalaSource(baseDirectory: File): Seq[File] = {
  val scalaSourceDir = baseDirectory / "src" / "main" / "scala"
  if (scalaSourceDir.exists) scalaSourceDir :: Nil
  else sys.error(s"Missing source directory: $scalaSourceDir")
}
When sbt loads up, it will build metaModel and metaSbtSomething first, and use metaSbtSomething as a plugin in your proper build.
If there are any other plugins you need, you can just add them to project/plugins.sbt normally, as I've added sbt-doge.
proper build (build.sbt)
The proper build looks like a normal multi-project build.
As you can see, the framework subproject uses SomethingPlugin. The important thing is that they share the source code, but the target directories are completely separate, so there is no interference once the proper build is loaded and you are changing code around.
import Dependencies._

lazy val root = (project in file(".")).
  aggregate(model, framework, sbtSomething).
  settings(inThisBuild(List(
      scalaVersion := scala210,
      organization := "com.example"
    )),
    name := "Something Root"
  )

// Defines common models for both sbt-plugin and framework
lazy val model = (project in file("model")).
  settings(
    name := "Something Model",
    crossScalaVersions := Seq(scala211, scala210)
  )

// The framework. Ideally, the sbt plugin is run as part of building this.
lazy val framework = (project in file("framework")).
  enablePlugins(SomethingPlugin).
  dependsOn(model).
  settings(
    name := "Something Framework",
    crossScalaVersions := Seq(scala211, scala210),
    // using sbt-something
    somethingX := "a"
  )

lazy val sbtSomething = (project in file("sbt-plugin")).
  dependsOn(model).
  settings(
    sbtPlugin := true,
    name := "sbt-something",
    crossScalaVersions := Seq(scala210)
  )
demo
In the SomethingPlugin example, I'm defining a something task that uses foo.Model.x.
package foo

import sbt._

object SomethingPlugin extends AutoPlugin {
  override def requires = sbt.plugins.JvmPlugin

  object autoImport {
    lazy val something = taskKey[Unit]("")
    lazy val somethingX = settingKey[String]("")
  }

  import autoImport._

  override def projectSettings = Seq(
    something := { println(s"something! ${Model.x}") }
  )
}
Here's how we can invoke something task from the build:
Something Root> framework/something
something! 1
[success] Total time: 0 s, completed May 29, 2016 3:01:07 PM
1 comes from foo.Model.x, so this demonstrates that we are using the sbt-something plugin in framework subproject, and that the plugin is using metaModel.
I'm searching for the best way to set up my logging/config in production within my deb file using sbt-native-packager.
a.) I want to copy my reference.conf and logback.xml from my code repository to /etc/my-app/reference.conf and /etc/my-app/logback.xml. I guess it's somehow possible with linuxPackageMappings, but I couldn't find an example yet, and I'm still struggling to understand how sbt and the plugins work together.
b.) I need to tell my JVM that it should use this config and this logback config when started via the generated upstart script - how do I pass parameters from the build.scala to the JVM run script?
This is my current project val:
lazy val root = Project(id = appName, base = file("."), settings = JavaServerAppPackaging.settings ++ packageSettings ++ allSettings ++ Project.defaultSettings)

lazy val allSettings = Seq(
  resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases",
  resolvers += "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
  libraryDependencies ++= dependencies)

lazy val packageSettings = packageArchetype.java_server ++ Seq(
  bashScriptExtraDefines := Seq("aha"),
  version := appVersion,
  packageSummary := appName,
  packageDescription := appName,
  maintainer := appAuthor,
  debianPackageDependencies in Debian ++= Seq("openjdk-7-jre-headless"))
thanks
a) For logging output see this question. Configuration input can be done easily with
mappings in Universal <+= (packageBin in Compile, baseDirectory) map { (_, base) =>
  val conf = base / "conf" / "reference.conf"
  conf -> "conf/application.conf"
}
By convention the Universal packaging defines config files in the conf folder. For Debian this is automatically mapped to /etc/your-app/filename.
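The <+= operator used above was removed in later sbt versions; an equivalent mapping with the current syntax might look like this (same paths as above, an untested sketch):

```scala
// same mapping expressed with modern sbt syntax
mappings in Universal += {
  val _ = (packageBin in Compile).value // keep the dependency on packaging, as in the original
  val conf = baseDirectory.value / "conf" / "reference.conf"
  conf -> "conf/application.conf"
}
```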
b) Passing parameters to the script is also done via a config file. Use 0.7.0-M3, follow the instructions here, and take a look at the etc-default template.
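An alternative sketch for the b) part, using bashScriptExtraDefines (the addJava helper is provided by the generated bash script of the Java archetypes; the /etc paths come from the question, so treat this as an assumption to verify):

```scala
// prepend -D options to the JVM via the generated start script
bashScriptExtraDefines += """addJava "-Dconfig.file=/etc/my-app/reference.conf""""
bashScriptExtraDefines += """addJava "-Dlogger.file=/etc/my-app/logback.xml""""
```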
Lots of questions mixed in here...
a) You can install your conf and xml files by including them in your Debian package. Building Debian packages is not built into sbt out of the box. You could try https://github.com/sbt/sbt-native-packager, but you may be better off dropping out of sbt and just using one of the many normal ways to create Debian packages.
Note that you should not be logging to /etc on a Linux box. Logs should go under /var.
b) You can install an init script that passes -D parameters to tell Play where to find its conf and logback.xml files:
$JAVA_HOME/bin/java -Dconfig.file=/etc/foo.conf -Dlogger.file=/etc/logger.xml
c) You should be logging to some directory under /var.
You can create directories in the postinst script that is part of the Debian package. Puppet (or something similar) can be a better way to manage config files on deployed boxes, though.
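If the directory under /var should nevertheless ship with the Debian package itself, sbt-native-packager can declare it; a sketch using its packageTemplateMapping helper (user, group and permissions are assumptions to adapt):

```scala
// ship an empty /var/log/<app> directory owned by the daemon user
linuxPackageMappings += packageTemplateMapping(s"/var/log/${name.value}")()
  .withUser(name.value)
  .withGroup(name.value)
  .withPerms("755")
```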
I have a multi-project SBT build, which looks like the example in the SBT docs:
import sbt._
import Keys._

object HelloBuild extends Build {
  lazy val root = Project(id = "hello",
                          base = file(".")) aggregate(foo, bar)

  lazy val foo = Project(id = "hello-foo",
                         base = file("foo"))

  lazy val bar = Project(id = "hello-bar",
                         base = file("bar"))
}
Because root is just a virtual project to aggregate both subprojects, I would like to avoid package generation (and artifact publication), but still generate package (and publish) for both subprojects.
Is there an easy way to achieve this?
Instead of playing whack-a-mole by listing specific tasks to disable (publish, publish-local, publish-signed, etc.), another option is to turn off artifact publishing at the source.
publishArtifact := false
Even though there's no publishing happening, I also found I needed to supply a publishTo value to make sbt-pgp's publish-signed task happy. It needs this value, even if it never uses it.
publishTo := Some(Resolver.file("Unused transient repository", file("target/unusedrepo")))
Actually, it is pretty easy. Just override the setting for publish in the root project:
base = file(".")) settings (publish := { }) aggregate(foo, bar)
The following worked for me (this can also be used in other sub projects):
lazy val root = Project(
  id = "root",
  base = file("."),
  aggregate = Seq(foo, bar),
  settings = Project.defaultSettings ++ Seq(
    publishLocal := {},
    publish := {}
  )
)
(sbt 0.12.2)
Recent versions of SBT include a setting to skip the publishing phase, as detailed here.
publish / skip := true
It is better to use the publishArtifact setting. It works for every possible way of publishing because they all depend on this setting.
If you need to switch off publishing in a certain project you can do it by providing the project name:
publishArtifact in root := false
Here root is the project definition from the original question.
You can put this line anywhere in your build.sbt after defining projects.
To disable the package-related tasks, add
settings(
  packageBin := { new File("") },
  packageSrc := { new File("") },
  packageDoc := { new File("") }
)
to the corresponding Project (root or not). The "weirdness" of this approach is due to packageBin, &c., being of type TaskKey[File]. I use this technique successfully (at root level and in an intermediate aggregation).
For sbt 1.x setting publish := { } did not work for me. Instead, you can use publish / skip:
base = file(".")) settings (publish / skip := true) aggregate(foo, bar)
See https://github.com/sbt/sbt/issues/3136
I am using sbt 1.3 and I tried the different solutions already proposed. My current setup is a standard multi-module sbt project:
lazy val root = (project in file("."))
  .settings(
    CustomSettings ++ Seq(
      Keys.`package` := { new File("") }
    )
  )
  .aggregate(blabla)
And this is the only solution that doesn't generate an empty (root) jar when calling "sbt package". Please note that this only partially answers the question (which also asked about the publishing task). If you, like me, need only the packaging, this should be good enough.
Add
publish := {}
to the subproject (root project) build.sbt to avoid publishing.