I have a Scala application and I want to set up a deployment process similar to the one the Akka sbt plugin provides, but it requires the project to be a microkernel. sbt-assembly is a very cool plugin, but I want free access to my config files; basically, I want the following app structure:
/app/bin - start script bash file
/config - all my application .conf files
/deploy - my project .jar with classes
/lib - all my dependencies .jar files
/logs - log files
I checked Typesafe's sbt-native-packager; it's better, and it can place my .conf files and logs outside the jars, but all those sbt settings look confusing to me. What can I do about this?
Actually it is not hard to update akka-sbt-plugin to make it work in general: add this file to your project folder and add the following settings somewhere in your build:
.settings(Distribution.distSettings: _*)
.settings(mappings in Compile in packageBin ~= { _.filter(!_._1.getName.endsWith(".conf")) })
The first line adds all the dist settings to your project, and the second one excludes all .conf files from your .jar so that they are read from the config folder instead.
Now you have two commands: dist, which creates a folder with the structure you've described, and zipDist, which packs it all into a .zip file. Later you can add this to your publish settings:
addArtifact(Artifact(someName, "zip", "zip"), zipDist)
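Put together, a build definition using these settings might look like the following sketch (sbt 0.13-era syntax, matching akka-sbt-plugin's time; the project name "my-app" is hypothetical, while Distribution.distSettings, dist and zipDist come from the file copied into your project folder):

```scala
// Sketch of a project definition wiring the dist settings together.
// "my-app" is an illustrative name, not from the original question.
lazy val myApp = Project("my-app", file("."))
  // add all dist/zipDist settings from the copied Distribution file
  .settings(Distribution.distSettings: _*)
  // keep .conf files out of the jar so they are read from the config folder
  .settings(mappings in Compile in packageBin ~= { _.filter(!_._1.getName.endsWith(".conf")) })
  // publish the zipped distribution as an additional artifact
  .settings(addArtifact(Artifact("my-app", "zip", "zip"), zipDist).settings: _*)
```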
Related
With my sbt Play/Scala application, I have been using sbt run while developing.
My project is almost finished, and now I want to use sbt dist for production purposes.
(Correct me if this is a bad idea.)
Here is my question.
With my sbt run, I had the access to unmanaged resources by adding
unmanagedResourceDirectories in Assets += baseDirectory.value / "works"
to my build.sbt
However, after sbt dist the same URL no longer works and returns a 404 Not Found error.
Not Found For request 'GET /assets/RAW/abc.png'
This "works" folder includes files that will be generated during the service, which is separate directory from the usual "public" folder.
And this is my routes file, FYI.
GET /assets/*file controllers.Assets.at(path="/public", file)
GET /works/*file controllers.Assets.at(path="/works/", file)
Does sbt dist require any additional code in build.sbt, or is there something else I should fix?
Additional asset directories specified via unmanagedResourceDirectories
unmanagedResourceDirectories in Assets += baseDirectory.value / "works"
will also be served from public, according to the docs:
A nuance with sbt-web is that all assets are served from the public
folder... note that the files there will be aggregated into the target public folder
This means you need to change the GET /works route from
GET /works/*file controllers.Assets.at(path="/works/", file)
to
GET /works/*file controllers.Assets.at(path="/public", file)
Additional assets should now be accessible at both
http://example.com/assets/RAW/abc.png
http://example.com/works/RAW/abc.png
You can confirm that the additional assets end up under public after sbt dist by unzipping the generated package at .../target/universal and then listing the contents of the jar under the lib directory whose name ends with -assets.jar, for example:
jar tf target/universal/play-scala-starter-example-1.0-SNAPSHOT/lib/play-scala-starter-example.play-scala-starter-example-1.0-SNAPSHOT-assets.jar
OK, so here is how I solved my problem.
After hours (probably days) of research, I was able to get access to external files by adding a new action to my controller: Ok.sendFile(new java.io.File(filename), inline = true)
I also had to set up the routes. In my case:
GET /download/:id/:name controllers.DownloadTask.download(id: String, name: String)
After adding this URL/route info to the client, it worked well.
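For context, a download action like this usually wants to validate the requested path before handing the file to Ok.sendFile. Here is a minimal pure-Scala sketch of such a helper (DownloadPaths and the base directory are hypothetical names, not from the original code):

```scala
import java.nio.file.Path

// Hypothetical helper for a download action: resolve a requested id/name
// against a base directory and reject path-traversal attempts before the
// controller serves the file with Ok.sendFile(..., inline = true).
object DownloadPaths {
  def resolve(base: Path, id: String, name: String): Option[Path] = {
    val candidate = base.resolve(id).resolve(name).normalize()
    // only serve files that stay inside the base directory
    if (candidate.startsWith(base.normalize())) Some(candidate) else None
  }
}
```

The normalize-then-startsWith check is what prevents a request like `../../etc/passwd` from escaping the served directory.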
I'm using the sbt-assembly plugin to create a standalone jar file. My project folder structure looks like this:
MyProject
- src
  - main
    - scala
      - my packages and source files
    - conf  // contains application.conf, application.test.conf and so on
  - test
- project  // contains all the build related files
- README.md
I now want to be able to run the fat jar that I produce against a version of application.conf that I specify via a system property.
So here is what I do in my unit test:
System.setProperty("environment", "test")
And this is how I load the config in one of the files in my src folder:
val someEnv = Option(System.getProperty("environment", "")).filter(_.nonEmpty) // gives me Some("test")
val name = s"application.${someEnv.get}.conf"
I can see that the environment variable is set and the environment is passed in. But later on, I load application.test.conf as below:
ConfigFactory.load(name).resolve()
However, it loads just the default application.conf and not the one that I specified!
What is wrong in my case? Where should I put the conf folder? I'm trying to run it against my unit test which is inside the test folder!
I believe you need to pass the name of the configuration resource, optionally without the extension. Try
ConfigFactory.load(s"application.${someEnv.get}").resolve()
The docs for ConfigFactory.load(String) indicate you need to supply
name (optionally without extension) of a resource on classpath
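The naming logic can be sketched as a small pure function (ConfigName is a hypothetical helper, not part of Typesafe Config), which makes the "no extension" rule explicit:

```scala
// Hypothetical helper: turn an optional environment name into the resource
// name that ConfigFactory.load expects -- note the absent ".conf" extension.
object ConfigName {
  def resolve(env: Option[String]): String =
    env.filter(_.nonEmpty)
      .map(e => s"application.$e")  // e.g. Some("test") -> "application.test"
      .getOrElse("application")     // fall back to the default config
}
```

The call would then be `ConfigFactory.load(ConfigName.resolve(someEnv)).resolve()`.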
OK! Here is what I had to do: change the name of the folder where the config file is located. I originally had it as conf, and I had to rename it to resources, and bang, it worked!
When I run
sbt docker:publishLocal # or stage
I want to add a file or folder to the output folder that it pushes the code to, i.e.:
/target/docker/
How can I do this?
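One approach worth trying (a sketch, assuming sbt-native-packager is in use; the extra file path is hypothetical) is to add the file to the Universal mappings, which the Docker packaging inherits:

```scala
// build.sbt sketch: files added to the Universal mappings are included in
// the staged Docker output under target/docker (and end up in the image,
// under /opt/docker by default).
mappings in Universal += {
  (baseDirectory.value / "extra" / "settings.json") -> "extra/settings.json"
}
```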
I'd like to ask about the reasoning behind the fact that the sbt-native-packager plugin creates a symlink /etc/<app-name> -> /usr/share/<app-name>/conf (instead of really putting the files there and somehow specifying in the app where to look for them).
In particular, how does it influence the update/uninstall+install process? Are the configs somehow preserved (for example, for a Debian package with the java_server archetype setting)?
I'd like to ask about the reasoning behind the fact, that the sbt-native-packager plugin creates a symlink /etc/<app-name> -> /usr/share/<app-name>/conf
Keeping everything in one place. You have your application directory, which contains everything, and then you just link from the OS-specific folders to the corresponding directories in your application folder.
Are the configs somehow preserved
Yes, indeed. You can try it out with a simple Play application. Add this to your build.sbt:
mappings in Universal += {
  // depend on packaging so the application jar is built first
  val _ = (packageBin in Compile).value
  val conf = baseDirectory.value / "conf" / "application.conf"
  conf -> "conf/application.conf"
}
This maps the application.conf from your conf folder into the package. When you build a Debian package with
debian:packageBin
you can see in target/<name>-<version>/DEBIAN/conffiles an entry
/usr/share/<app-name>/conf/application.conf
An apt-get remove your-app won't remove this file; only a purge will.
I recently decided to use SBT to build an existing project.
In this project I have some .glsl files within the Scala packages, which I need to copy during the compilation phase.
The project is structured like this :
- myapp.opengl
  - Shader.scala
- myapp.opengl.shaders
  - vertex_shader.glsl
  - fragment_shader.glsl
Is this file structure correct for sbt, or do I need to put the .glsl files into another directory? And do you know a clean way to copy these files into the target folder?
I would prefer not to put these files into the resources directory, since they are (non-compiled) source files.
Thanks
I would not recommend putting those files into src/main/scala, as they do not belong there. If you want to keep them separate from your resource files, you can put them in a custom path, e.g. src/main/glsl, and add the following lines to your project definition to have them copied into the output directory:
val shaderSourcePath = "src"/"main"/"glsl"
// use shaderSourcePath as root path, so directory structure is
// correctly preserved (relative to the source path)
def shaderSources = (shaderSourcePath ##) ** "*.glsl"
override def mainResources = super.mainResources +++ shaderSources
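For reference, the snippet above uses an older project-definition style; in a current build.sbt a rough equivalent (a sketch, assuming the same src/main/glsl layout) is to register the directory as an extra unmanaged resource directory:

```scala
// build.sbt sketch: treat src/main/glsl as an additional resource directory
// so *.glsl files are copied to the output directory, with their structure
// preserved relative to src/main/glsl.
Compile / unmanagedResourceDirectories += baseDirectory.value / "src" / "main" / "glsl"
```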