I have an Akka SBT project. It consists of multiple SBT submodules:
- common, which contains resources/logback-prod.xml
- app, which depends on the two other submodules
- postgres
I build a Docker image of the project. The only piece of the build configuration that has no effect on the image is:
...
dockerPackageMappings in Docker +=
((resourceDirectory in Compile).value / "logback-prod.xml") -> "/opt/docker/conf/logback-prod.xml"
...
This line of code does not copy logback-prod.xml from the common subproject to the image's /opt/docker/conf/ path.
Instead it creates a directory named "logback-prod.xml" at the path I mentioned above.
What am I doing wrong?
Usually, when you get an empty directory instead of a file in Docker, it means that the source file doesn't exist, i.e. in your case the source path is probably wrong. Try copying a file given as an absolute path and see whether it gets copied in correctly.
The problem is solved by this line of code:
mappings in Docker += (resourceDirectory in common in Compile).value / "logback-prod.xml" -> "/opt/docker/conf/logback-prod.xml"
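For reference, on sbt 1.x the same fix can be written in slash syntax; this is a sketch, assuming common is the Project reference for the common submodule:
Docker / mappings += (common / Compile / resourceDirectory).value / "logback-prod.xml" -> "/opt/docker/conf/logback-prod.xml"
The key point in both spellings is scoping resourceDirectory to the common project, so the mapping's source file actually exists.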
Related
I'm new to Scala and SBT, and I'm building a project that reads a model file (.h5) using the org.nd4j.linalg.io.ClassPathResource class. The .h5 file is located under the resources folder, and it works well when running the project locally using IntelliJ.
However, when the app runs in Docker I get the following error:
java.io.FileNotFoundException: class path resource [scala/Product.class] cannot be opened because it does not exist
When I print the path the app is looking for, I see that it is the root of my jar when running locally, and the root folder of the machine in the Docker environment.
I understand that whatever runs my app in Docker can't access the resources inside my jar.
My question is: how do I use resources without changing the code before deployment?
I have tried adding the following to the build.sbt file:
mappings.in(Universal) += {((baseDirectory.value / "main" / "resources" / "my_model.h5"), "my_model.h5")}
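For what it's worth, a mapping like this normally needs the full path from the project base directory; assuming the default sbt layout, where the file lives under src/main/resources, a sketch of the corrected mapping would be:
mappings in Universal += (baseDirectory.value / "src" / "main" / "resources" / "my_model.h5") -> "my_model.h5"
Note that files under src/main/resources are packaged into the jar by default, so a plain classpath lookup may not need an extra mapping at all.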
I'm using the sbt-assembly plugin to create a standalone jar file. My project folder structure looks like this:
MyProject
- src
  - main
    - scala
      - my packages and source files
    - conf // contains application.conf, application.test.conf and so on
  - test
- project // contains all the build related files
- README.md
I now want to be able to run the fat jar that I produce against a version of application.conf that I specify as a system property!
So here is what I do in my unit test:
System.setProperty("environment", "test")
And this is how I read it and build the config name in one of the files in my src folder:
val someEnv = Option(System.getProperty("environment", "")).filter(_.nonEmpty) // gives me Some(test)
val name = s"application.${someEnv.get}.conf"
I can see that the system property is set and the environment value is passed in. But when I later load application.test.conf as below:
ConfigFactory.load(name).resolve()
However, it loads just the default application.conf and not the one that I specified!
What is wrong in my case? Where should I put the conf folder? I'm trying to run this from my unit test, which is inside the test folder!
I believe you need to specify the full base name of the configuration file; the .conf extension is optional (load() will add it). Try
ConfigFactory.load(s"application.${someEnv.get}").resolve()
The docs for ConfigFactory.load(String) indicate you need to supply
name (optionally without extension) of a resource on classpath
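Putting it together, a minimal sketch of the selection logic, reusing the names from the question and falling back to the default application.conf when the property is unset:
import com.typesafe.config.{Config, ConfigFactory}

val someEnv: Option[String] = sys.props.get("environment").filter(_.nonEmpty)
val config: Config = someEnv
  .map(env => ConfigFactory.load(s"application.$env")) // resolves application.<env>.conf on the classpath
  .getOrElse(ConfigFactory.load())                     // default application.conf
  .resolve()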
OK! Here is what I had to do: change the name of the folder where the config files are located. I originally had it as conf, and I had to rename it to resources, and bang, it worked!
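An alternative to renaming, sketched here on the assumption of a standard sbt build, is to declare the conf folder as an extra resource directory so its files end up on the classpath for both compile and test:
unmanagedResourceDirectories in Compile += baseDirectory.value / "conf"
unmanagedResourceDirectories in Test += baseDirectory.value / "conf"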
I'd like to ask about the reasoning behind the fact that the sbt-native-packager plugin creates a symlink /etc/<app-name> -> /usr/share/<app-name>/conf (instead of really putting the files there and somehow telling the app where to look for them).
In particular, how does it influence the update/uninstall+install process? Are the configs somehow preserved (for example, for Debian with the java_server archetype setting)?
I'd like to ask about the reasoning behind the fact that the sbt-native-packager plugin creates a symlink /etc/<app-name> -> /usr/share/<app-name>/conf
Keeping everything in one place. You have your application directory, which contains everything, and you just link from the OS-specific folders to the corresponding directories in your application folder.
Are the configs somehow preserved
Yes, indeed. You can try it out with a simple Play application. Add this to your build.sbt:
mappings in Universal += {
  (packageBin in Compile).value // make sure the jar is packaged first
  val conf = baseDirectory.value / "conf" / "application.conf"
  conf -> "conf/application.conf"
}
This will map the application.conf from your conf folder into the package. When you build a Debian package with
debian:packageBin
you can see in target/<app-name>-<version>/DEBIAN/conffiles an entry
/usr/share/<app-name>/conf/application.conf
An apt-get remove your-app won't remove this file; only a purge will.
I have a Scala application and I want to set up a deployment process similar to the one the Akka sbt plugin gives, but that plugin requires the project to be a microkernel. sbt-assembly is a very cool plugin, but I want to have free access to my config files; basically, I want the following app structure:
/app
  /bin - bash start script
  /config - all my application .conf files
  /deploy - my project .jar with classes
  /lib - all my dependencies' .jar files
  /logs - log files
I've checked Typesafe's sbt-native-packager; it's better, since it can place my .conf files and logs outside of the jars, but all those settings in SBT look confusing to me. What can I do with this?
Actually, it is not hard to update the akka-sbt-plugin to make this work in general. Add this file to your project folder, and add the following settings somewhere in your build:
.settings(Distribution.distSettings: _*)
.settings(mappings in Compile in packageBin ~= { _.filter(!_._1.getName.endsWith(".conf")) })
The first line adds all the dist settings to your project, and the second one excludes all .conf files from your .jar so that they are read from the config folder instead.
Now you have two commands: dist, which creates a folder with the structure you've described, and zipDist, which packs it all into a .zip file. Later on you can add this to your publish settings:
addArtifact(Artifact(someName, "zip", "zip"), zipDist)
I recently decided to use SBT to build an existing project.
In this project I have some .glsl files within the Scala packages which I need to copy during the compilation phase.
The project is structured like this:
- myapp.opengl
  - Shader.scala
- myapp.opengl.shaders
  - vertex_shader.glsl
  - fragment_shader.glsl
Is this file structure correct for SBT, or do I need to put the .glsl files into another directory? And do you know a clean way to copy these files into the target folder?
I would prefer not to put these files into the resources directory, since they are (non-compiled) source files.
Thanks
I would not recommend putting those files into src/main/scala, as they do not belong there. If you want to keep them separate from your resource files, you can put them in a custom path, e.g. src/main/glsl, and add the following lines to your project definition to have them copied into the output directory:
val shaderSourcePath = "src" / "main" / "glsl"
// use shaderSourcePath as root path, so directory structure is
// correctly preserved (relative to the source path)
def shaderSources = (shaderSourcePath ##) ** "*.glsl"
override def mainResources = super.mainResources +++ shaderSources
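The snippet above uses the old sbt 0.7-style project definition. For newer sbt versions, a rough, untested sketch of the equivalent in build.sbt is to register the directory as an additional resource root:
// copies everything under src/main/glsl to the classpath output,
// preserving the directory structure relative to src/main/glsl
unmanagedResourceDirectories in Compile += baseDirectory.value / "src" / "main" / "glsl"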