config directory when using sbt-native-packager - scala

I'd like to ask about the reasoning behind the fact that the sbt-native-packager plugin creates a symlink /etc/<app-name> -> /usr/share/<app-name>/conf (instead of really putting the files there and somehow telling the app where to look for them)?
In particular, how does it influence the update/uninstall+install process? Are the configs somehow preserved (for example, for Debian with the java_server archetype)?

I'd like to ask about the reasoning behind the fact that the sbt-native-packager plugin creates a symlink /etc/<app-name> -> /usr/share/<app-name>/conf
Keeping everything in one place. You have your application directory, which contains everything, and then you just link from the OS-specific folders to the corresponding directories in your application folder.
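On a Debian install with the java_server archetype, the resulting layout looks roughly like this (a sketch; the exact paths depend on your package name):
/usr/share/<app-name>/ - the application home containing bin/, lib/, conf/ and so on
/etc/<app-name> -> /usr/share/<app-name>/conf - the symlinked config directory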
Are the configs somehow preserved
Yes indeed. You can try it out with a simple Play application. Add this to your build.sbt:
mappings in Universal <+= (packageBin in Compile, baseDirectory) map { (_, base) =>
  val conf = base / "conf" / "application.conf"
  conf -> "conf/application.conf"
}
This will map your application.conf in the conf folder. When you build a debian package with
debian:packageBin
you can see in target/<app-name>-<version>/DEBIAN/conffiles an entry
/usr/share/<app-name>/conf/application.conf
An apt-get remove your-app won't remove this file; only a purge will.

Related

SBT native packager does not copy file to Docker image

I have an Akka SBT project. It consists of multiple SBT submodules:
- common, which contains resources/logback-prod.xml
- app, which depends on the two other submodules
- postgres
I build a Docker image of the project. The only line which does not affect the image is:
...
dockerPackageMappings in Docker +=
((resourceDirectory in Compile).value / "logback-prod.xml") -> "/opt/docker/conf/logback-prod.xml"
...
This line of code does not copy logback-prod.xml from the common subproject to the /opt/docker/conf/ path in the Docker image.
Instead it creates a directory named "logback-prod.xml" at the path I mentioned above.
What am I doing wrong?
Usually when you get an empty directory instead of the file in Docker, it means that the source file doesn't exist, i.e. in your case it could mean that your source path is wrong. Try copying a file with an absolute path and see if it gets copied in correctly.
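As a quick sanity check (a sketch, not from the original answer; the absolute path is a hypothetical placeholder), you can map a file that definitely exists and see whether it shows up in the image:
// Sketch: map an existing file by absolute path to rule out a wrong source path
dockerPackageMappings in Docker +=
  file("/tmp/logback-prod.xml") -> "/opt/docker/conf/logback-prod.xml"
If that file gets copied, the mapping mechanism works and the original source path is the problem.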
The problem is solved by this line of code:
mappings in Docker += (resourceDirectory in common in Compile).value / "logback-prod.xml" -> "/opt/docker/conf/logback-prod.xml",

Scala Standalone JAR with a conf Folder

I'm using the sbt assembly jar plugin to create a standalone jar file. My project folder structure would look like this:
MyProject
  - src
    - main
      - scala
        - mypackages and source files
      - conf // contains application.conf, application.test.conf and so on
    - test
  - project // contains all the build related files
  - README.md
I now want to be able to run the fat jar that I produce against a version of the application.conf that I specify as a System property!
So here is what I do in my unit test!
System.setProperty("environment", "test")
And this is how I load the config in one of the files in my src folder:
val someEnv = Option(System.getProperty("environment", "")).filter(_.nonEmpty) // gives me some(test)
val name = s"application.${someEnv.get}.conf"
I can see that the environment variable is set and I get the environment passed in. But later on I load the application.test.conf as below:
ConfigFactory.load(name).resolve()
It however loads just the default application.conf and not the one that I specify!
What is wrong in my case? Where should I put the conf folder? I'm trying to run it against my unit test which is inside the test folder!
I believe you need to specify the full name of the configuration file. The .conf is optional. Try
ConfigFactory.load(s"application.${someEnv.get}").resolve()
The docs for ConfigFactory.load(String) indicate you need to supply
name (optionally without extension) of a resource on classpath
OK! Here is what I had to do: change the name of the folder where the config file is located. I originally had it as conf, and I had to rename it to resources, and bang, it worked!
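That works because src/main/resources is on the classpath by default. As an alternative sketch (not from the original answer, and assuming the config files sit in src/main/conf), you could keep the conf name and put that folder on the classpath yourself:
// Assumption: the config files live in src/main/conf; adjust the path to your layout
unmanagedResourceDirectories in Compile += baseDirectory.value / "src" / "main" / "conf"
unmanagedResourceDirectories in Test += baseDirectory.value / "src" / "main" / "conf"
Then ConfigFactory.load(s"application.${someEnv.get}") can find the file without renaming anything.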

Add specific directory and its content to Universal target

I am switching from Maven to sbt for a Scala project I am working on. I used to work with the Maven assembly plugin, where you can map any directory in the workspace to a target directory in the assembly. I didn't find any equivalent in sbt-native-packager; it would be worth providing this feature for the Universal kind.
I understood that everything that is present in the universal subdirectory is copied to the package as such, and it works like a charm, but I lack something like the following snippet.
mappings in Universal += {
  directory("my/local/dir") -> "static/dirInPackage"
}
I would like to know if there is already a way to do that; if so, I would be happy to know how, and I offer my help to contribute documentation for that part if you want.
If there is no way to do this kind of customization, I will be happy to propose a patch for it after having discussed the specifications.
By the way, great job, your packager is working very well, thanks!
After having discussed with the sbt-native-packager team and a first "rejected" pull request, here is the way to do this directory mapping in the build.sbt file (see pull request https://github.com/sbt/sbt-native-packager/pull/160, which provides more detailed documentation):
mappings in Universal <++= (packageBin in Compile, target) map { (_, target) =>
  val dir = target / "scala-2.10" / "api"
  (dir.***) pair relativeTo(dir.getParentFile)
}
To reduce verbosity of the above snippet, there is an issue (https://github.com/sbt/sbt-native-packager/issues/161) to propose a more human readable way to express this directory mapping:
mappings in Universal ++= allFilesRelativeTo(file(target / "scala-2.10" / "api"))
From https://github.com/sbt/sbt-native-packager
If you'd like to add additional files to the installation dir, simply add them to the universal mappings:
import com.typesafe.sbt.SbtNativePackager.Universal
mappings in Universal += {
  file("my/local/conffile") -> "conf/my.conf"
}
You could use a simple map on top of the directory method result.
==> directory method documentation: MappingsHelper.directory
For example:
// To package the content of src/main/resources under conf, add the following:
mappings in Universal ++= directory("src/main/resources").map(t => (t._1, t._2.replace("resources", "conf")))
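Note that directory comes from the plugin's MappingsHelper; depending on the plugin version you may need an explicit import (a sketch, verify against the version you use):
import com.typesafe.sbt.packager.MappingsHelper.directory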
This one seems to be the simplest example that worked for me. It takes all files in res/scripts/ and puts them in the bin/ directory when unzipped.
// In build.sbt
mappings in Universal <++= (packageBin in Compile) map { jar =>
  val scriptsDir = new java.io.File("res/scripts/")
  scriptsDir.listFiles.toSeq.map { f =>
    f -> ("bin/" + f.getName)
  }
}
If you map into a target folder that doesn't exist yet, it will be created for you; for example, assets/ will make a new assets folder with the files. If you want files nested inside it using this approach, you'll have to make a new Seq, at least that's what I did. Here's my example:
assets/
├── scripts
│   └── install_dependencies.sh
└── urbangrizzly.database
and the appropriate build.sbt section:
mappings in Universal <++= (packageBin in Compile) map { jar =>
  val assetsDir = new java.io.File("assets/")
  val scriptsDir = new java.io.File("assets/scripts")
  assetsDir.listFiles.toSeq.map { files =>
    files -> ("assets/" + files.getName)
  } ++ scriptsDir.listFiles.toSeq.map { files =>
    files -> ("assets/scripts/" + files.getName)
  }
}
If you need more, just keep using the ++ operator to concatenate the lists

Modifying install directory for rpm with sbt native-packager

I am trying to build an rpm package with sbt-native-packager that installs into a custom directory, e.g. /opt/myapp instead of /usr, due to in-house policy requirements.
I have a build.sbt that will build a standard rpm, but I'm stumped when it comes to modifying the directory. My apologies, I'm quite new to Scala, sbt and the native packager.
I am using mapGenericFilesToLinux and would like to keep its structure - just modifying the first part of the destination directory.
I found this code fragment in a GitHub issue (https://github.com/sbt/sbt-native-packager/issues/4#issuecomment-6731183):
linuxPackageMappings <+= target map { target =>
  val src = target / "webapp"
  val dest = "/opt/app"
  LinuxPackageMapping(
    for {
      path <- (src ***).get
      if !path.isDirectory
    } yield path -> path.toString.replaceFirst(src.toString, dest)
  )
}
I believe I want to do something similar except to
linuxPackageMappings in Rpm <++= <SOMETHING HERE> {
// for loop that steps through the source and destination and modifies the directory
}
thanks in advance for any help
bye
Pam
So, in sbt 0.12 you need to make sure you specify all the dependent keys you want to use before declaring the value you want. So, let's pretend two things:
linuxPackageMappings has all your mappings for the packaging.
linuxPackageMappings in Rpm has nothing added to it directly.
We're going to take the value in linuxPackageMappings and alter the directory for linuxPackageMappings in Rpm:
linuxPackageMappings in Rpm <<= (linuxPackageMappings) map { mappings =>
  // Let's loop through the mappings and alter their on-disc location....
  for (LinuxPackageMapping(filesAndNames, meta, zipped) <- mappings) yield {
    val newFilesAndNames = for {
      (file, installPath) <- filesAndNames
    } yield file -> installPath.replaceFirst("/usr/share/app", "/opt/app")
    LinuxPackageMapping(newFilesAndNames, meta, zipped)
  }
}
What this does is rip out the linux package mappings (which include whether or not to gzip files, and the user/group owners/permissions) and modify the install path of each file.
Hope that helps! In sbt-native-packager.NEXT (not released) you can configure the default install location.
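For reference, in later plugin releases there is a defaultLinuxInstallLocation setting you can override for this (a sketch, not part of the original answer; check your plugin version):
// Assumption: available in sbt-native-packager releases that ship this setting key
defaultLinuxInstallLocation in Rpm := "/opt"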

Custom deployment with sbt

I have a Scala application and I want to set up a deployment process similar to the one the Akka sbt plugin gives, but it requires the project to be a microkernel. sbt-assembly is a very cool plugin, but I want to have free access to my config files; basically I want to have the following app structure:
/app/bin - start script bash file
/config - all my application .conf files
/deploy - my project .jar with classes
/lib - all my dependencies .jar files
/logs - log files
I checked Typesafe's sbt-native-packager; it's better, but I'm not sure how to make it place my .conf files and logs outside of the jars. All those settings in SBT look confusing to me; what can I do with this?
Actually it is not hard to update the akka-sbt-plugin to make it work in general: add this file to your project folder and add the following settings somewhere in your build:
.settings(Distribution.distSettings: _*)
.settings(mappings in Compile in packageBin ~= { _.filter(!_._1.getName.endsWith(".conf")) })
The first line adds all the dist settings to your project and the second one excludes all .conf files from your .jar so they are read from the config folder instead.
Now you have two commands: dist, which creates a folder with the structure you've described, and zipDist, which packs it all into a .zip file. Later you can add this to your publish settings:
addArtifact(Artifact(someName, "zip", "zip"), zipDist)