I'm using sbt-native-packager 1.0.0-M5 to create my docker image. I need to add a file that's not a source file or in the resource folder. My docker commands are as follows:
dockerCommands := Seq(
Cmd("FROM", "myrepo/myImage:1.0.0"),
Cmd("COPY", "test.txt keys/"), // <-- The failing part
Cmd("WORKDIR", "/opt/docker"),
Cmd("RUN", "[\"chown\", \"-R\", \"daemon\", \".\"]"),
Cmd("USER", "daemon"),
ExecCmd("CMD", "echo", "Hello, World from Docker")
)
It fails with: msg="test.txt: no such file or directory"
So after digging around a bit it seems I need to have test.txt in target/docker/stage. Then it works. But how do I get it there automatically? The file is actually in the root folder of the project.
I managed to get it to work by adding the file to mappings in Universal. So for you, you would need something like this:
mappings in Universal += file("test.txt") -> "keys/test.txt"
You won't need the COPY command if you do this, by the way.
Now, I'm not sure whether this mapping will also be picked up by other sbt-native-packager plugins. I hope a commenter can tell me whether or not this is true, but my intuition is that it will, which might be a dealbreaker for you. Still, any workaround is better than none, right? If you use Build.scala, you could perhaps use a VM argument to tell sbt whether or not to add this mapping.
You can place any additional files that need to be included in the container image into the src/universal folder. The content of that folder is automatically copied into the container image (by default under /opt/docker); no additional configuration is needed. See "Getting started with Universal Packaging" for more info.
Files located in src/universal will be available in the runtime working directory of the Scala app inside the Docker container. This means that if your app has src/universal/example.txt, it can be accessed with scala.io.Source.fromFile("./example.txt").
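For instance, assuming a file src/universal/example.txt (a hypothetical name), reading it at runtime could look like this sketch:

```scala
import scala.io.Source

// Hypothetical file placed at src/universal/example.txt in the project;
// inside the container it ends up next to the app, in the working directory.
val source = Source.fromFile("./example.txt")
try println(source.mkString)
finally source.close()
```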
I was able to get this working using dockerPackageMappings:
dockerPackageMappings in Docker += (baseDirectory.value / "docker" / "ssh_config") -> "ssh_config"
dockerCommands := (dockerCommands.value match {
  case Seq(from @ Cmd("FROM", _), rest @ _*) =>
    Seq(
      from,
      Cmd("ADD", "ssh_config", "/sbin/.ssh/config")
    ) ++ rest
})
This is for the sbt-docker plugin, not sbt-native-packager.
I was able to add files this way. For example, to add a file located in src/main/resources/docker/some-file.ext:
dockerfile in docker := {
val targetPath = "/usr/app"
// map of (relativeName -> File) of all files in resources/docker dir, for convenience
val dockerFiles = {
val resources = (unmanagedResources in Runtime).value
val dockerFilesDir = resources.find(_.getPath.endsWith("/docker")).get
resources.filter(_.getPath.contains("/docker/")).map(r => dockerFilesDir.toURI.relativize(r.toURI).getPath -> r).toMap
}
new Dockerfile {
from(s"$namespace/$baseImageName:$baseImageTag")
...
add(dockerFiles("some-file.ext"), s"$targetPath/some-file.ext")
...
}
}
You can add an entire directory to a Docker image's file system by first making it available using dockerPackageMappings, and then COPYing it as an additional Docker command.
import NativePackagerHelper._
dockerPackageMappings in Docker ++= directory(baseDirectory.value / ".." / "frontend" )
dockerCommands ++= Seq(
  Cmd("COPY", "frontend /opt/frontend")
)
Related
I'm trying to merge 2 config files (or create a config file based on a single reference file) using:
lazy val finalConfig =
Option(System.getProperty("user.resource"))
.map(ConfigFactory.load)
.map(_.withFallback(ConfigFactory.load(System.getProperty("config.resource"))).resolve())
.getOrElse(ConfigFactory.load(System.getProperty("config.resource")))
I'm defining my Java properties in Spark using spark-submit ....... --conf spark.driver.extraJavaOptions=-Dconfig.resource=./reference.conf,-Duser.resource=./user.conf ...
My goal is to be able to point to a file that is not inside my jar, to be used by System.getProperty("..") in my code. I changed the folder for testing (cd ..) and keep getting the same error, so I guess Spark doesn't care about my Java arguments?
Is there a way to point to a file (or even 2 files in my case) so that they can be merged?
I also tried including the reference.conf file but not the user.conf file: it recognizes reference.conf but not the user.conf that I passed with --conf spark.driver.extraJavaOptions=-Duser.resource=./user.conf.
Is there a way to do that? Thanks if you can help.
I don't see you calling ConfigFactory.parseFile to load a file containing properties.
Typesafe Config automatically reads any .properties file on the class path and all -D parameters passed to the JVM, then merges them.
I am reading an external property file which is not part of the jar as following. The file "application.conf" is placed on the same directory where the jar is kept.
val applicationRootPath = System.getProperty("user.dir")
val config = Try {
ConfigFactory.parseFile(new File(applicationRootPath + "/" + "application.conf"))
}.getOrElse(ConfigFactory.empty())
val appConfig = config.withFallback(ConfigFactory.load()).resolve
ConfigFactory.load() already contains all the properties present in the .properties files on the class path and the -D parameters. I am giving priority to my external "application.conf" and falling back on the default values. For matching keys, "application.conf" takes precedence over the other sources.
I have an application that loads configuration from application.conf using ConfigFactory: lazy val myConfig = ConfigFactory.load(pathToConfig)
The application.conf is initially located in src/main/resources
When I deploy my application I want it to load the config from APP_HOME/conf/application.conf
To do so, I excluded application.conf from the resource folder when building the jar, and I added my APP_HOME/conf to the class path.
jar {
exclude '*.conf'
}
and
startScripts {
classpath += files('src/main/resources')
doLast {
def windowsScriptFile = file getWindowsScript()
def unixScriptFile = file getUnixScript()
println('unix script is ' + unixScriptFile.text)
windowsScriptFile.text = windowsScriptFile.text.replace('%APP_HOME%\\lib\\resources', '%APP_HOME%\\conf')
unixScriptFile.text = unixScriptFile.text.replace('\$APP_HOME/lib/resources', '\$APP_HOME/conf')
println('after unix script is ' + unixScriptFile.text)
}
}
The odd thing is that when I modify $APP_HOME/conf/application.conf and restart the app, the changes are not picked up, i.e. the old configuration is still being used.
Any idea what might cause this, or how I can print where the config is being loaded from, would be helpful.
After many attempts, I got it to work by calling lazy val myConfig = ConfigFactory.load() without specifying the conf file name or path.
Although it solved my issue, I still don't understand why calling load with the file name or file path didn't work.
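A possible explanation (an assumption on my side, not verified against this build): ConfigFactory.load(name) resolves name as a classpath resource, not as a path on disk, so a file under APP_HOME/conf is only found once that directory is actually on the class path, and then only under the default name. A sketch of the difference, with hypothetical paths:

```scala
import java.io.File
import com.typesafe.config.ConfigFactory

// load() follows the default lookup: -Dconfig.file / -Dconfig.resource,
// then "application.conf" anywhere on the class path.
val fromClasspath = ConfigFactory.load()

// parseFile reads an explicit location on disk, independent of the class path.
val appHome = sys.env.getOrElse("APP_HOME", ".") // hypothetical env variable
val fromDisk = ConfigFactory.parseFile(new File(appHome, "conf/application.conf"))

// The external file wins; classpath defaults fill the gaps.
val merged = fromDisk.withFallback(fromClasspath).resolve()
```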
I have a multi-project build with a particularly messy module which contain several mainClasses. I would like to create several distribution packages for this messy module, each distribution package employing distinct file sets and employing different formats. Ideas?
This is the answer from the sbt-nativer-packager issue tracker where the same question was posted.
I'm adding this from the gitter chat as well:
I'm just arriving in this chat room and my knowledge of sbt-native-packager is virtually zero... but anyway... looks to me that JavaAppPackaging and other archetypes should actually be configurations extended from Universal. In this scenario, I would just create my own configuration extended from JavaAppPackaging and tweak the necessary bits according to my needs. And, finally, if the plugin just picks mappings in ThisScope... it would pick my own scope, and not JavaAppPackaging... and not Universal.
So, let's go through this one by one.
The sbt-native-packager plugin always picks mappings in Universal. This is not ideal. It should conceptually pick mappings in ThisScope.
SBT native packager provides two categories of AutoPlugins: FormatPlugins and ArchetypePlugins. FormatPlugins provide a new package format, e.g. UniversalPlugin (zip, tarball) or DebianPlugin (.deb). These plugins form a hierarchy, as they are built on top of each other:
SbtNativePackager
  └─ Universal
       ├─ Docker
       ├─ Linux
       │    ├─ Debian
       │    └─ RPM
       └─ Windows
mappings, which define a file -> targetpath relation, are inherited with this pattern
mappings in ParentFormatPluginScope := (mappings in FormatPluginScope).value
So for docker it looks like this
mappings in Docker := (mappings in Universal).value
The linux format plugins use specialized mappings to preserve file permissions, but are basically the same.
Since the sbt-native-packager plugin always picks mappings in Universal, I have to redefine mappings in Universal in each of my configurations
Yes. If you want to define your own scope and inherit the mappings and change them you have to do this, like all other packaging plugins, too. I recommend putting this code into custom AutoPlugins in your project folder.
For example (not tested, imports may be missing )
import sbt._
import sbt.Keys._
import com.typesafe.sbt.SbtNativePackager.Universal
import com.typesafe.sbt.packager.archetypes.JavaAppPackaging

object BuilderRSPlugin extends AutoPlugin {
  override def requires = JavaAppPackaging

  object autoImport {
    val BuilderRS = config("builderrs") extend Universal
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    mappings in BuilderRS := (mappings in Universal).value
  )
}
looks to me that JavaAppPackaging and other archetypes should actually be configurations extended from Universal
JavaAppPackaging is an archetype, which means this plugin doesn't bring any new packaging formats, thus no new scopes. It configures all the packaging formats it can and enables them.
You package stuff by specifying the scope:
universal:packageBin
debian:packageBin
windows:packageBin
So if you need to customize your output format, you do that in the respective scope.
mappings in Docker := (mappings in Docker).value.filter( /* what ever you want to filter */)
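As a concrete (hypothetical) example of such a customization, dropping the Windows launcher scripts from the Docker image might look like:

```scala
// Hypothetical filter: exclude .bat launcher scripts from the Docker package.
mappings in Docker := (mappings in Docker).value.filter {
  case (_, targetPath) => !targetPath.endsWith(".bat")
}
```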
See: https://github.com/sbt/sbt-native-packager/issues/746
IMPORTANT: This is an "answer in progress". IT DOES NOT WORK YET!
This is an example of how one could achieve this.
The basic idea is that we add configurations for different packages to be generated. Each configuration tells which files will be present in the package. This does not work as expected. See my comments after the code.
lazy val BuilderRS = sbt.config("BuilderRS").extend(Compile,Universal)
lazy val BuilderRV = sbt.config("BuilderRV").extend(Compile,Universal)
addCommandAlias("buildRS", "MessyModule/BuilderRS:packageZipTarball")
addCommandAlias("buildRV", "MessyModule/BuilderRV:packageBin") // ideally should be named packageZip
lazy val Star5FunctionalTestSupport =
project
.in(file("MessyModule"))
.enablePlugins(JavaAppPackaging)
.settings((buildSettings): _*)
.configs(Universal,BuilderRS,BuilderRV)
.settings(inConfig(BuilderRS)(
Defaults.configSettings ++ JavaAppPackaging.projectSettings ++
Seq(
executableScriptName := "rs",
mappings in Universal :=
(mappings in Universal).value
.filter {
case (file, name) => ! file.getAbsolutePath.endsWith("/bin/rv")
},
topLevelDirectory in Universal :=
Some(
"ftreports-" +
new java.text.SimpleDateFormat("yyyyMMdd_HHmmss")
.format(new java.util.Date())),
mainClass in ThisScope := Option(mainClassRS))): _*)
//TODO: SEE COMMENTS BELOW ===============================================
// .settings(inConfig(BuilderRV)(
// Defaults.configSettings ++ JavaAppPackaging.projectSettings ++
// Seq(
// packageBin <<= packageBin in Universal,
// executableScriptName := "rv",
// mappings in ThisScope :=
// (mappings in Universal).value
// .filter {
// case (file, name) => ! file.getAbsolutePath.endsWith("/bin/rs")
// },
// topLevelDirectory in Universal :=
// Some(
// "ftviewer-" +
// new java.text.SimpleDateFormat("yyyyMMdd_HHmmss")
// .format(new java.util.Date())),
// mainClass in ThisScope := Option(mainClassRV))): _*)
Now observe configuration BuilderRV, which is in comments.
It is basically the same thing as configuration BuilderRS, except that we are now deploying a different shell script in the bin folder. There are some other small differences, but they are not relevant to this argument. There are two problems:
The sbt-native-packager plugin always picks mappings in Universal. This is not ideal. It should conceptually pick mappings in ThisScope.
Since the sbt-native-packager plugin always picks mappings in Universal, I have to redefine mappings in Universal in each of my configurations. And this is a problem because mappings in Universal is defined as a function of itself in all configurations: the result is that we end up chaining logic onto mappings in Universal each time we redefine it in a configuration. This causes trouble in this example in particular because the configuration BuilderRV (the second one) will apply not only its own filter, but also the filter defined in BuilderRS (the first one), which is not what I want.
I have a project that uses the xsbt-web-plugin but I want to be able to customise the jetty settings. Is there a way I can specify my own jetty.xml file? I found the
PluginKeys.configurationFiles
setting and set that to the desired file but it had no effect
You can specify a jetty.xml config file using the config argument to jetty():
jetty(config = "etc/jetty.xml")
If you need to specify multiple config files, you can pass them as individual --config arguments:
jetty(args = Seq(
  "--config", "jetty.xml",
  "--config", "jetty-https.xml",
  "--config", "jetty-ssl.xml"
))
Check out the docs for more info.
I use xsbt-web-plugin 2.0.2 (the latest version at the time of writing). You can do it in build.sbt as follows:
enablePlugins(JettyPlugin)
containerConfigFile in Compile := Some(file("./src/main/webapp/WEB-INF/jetty-env.xml"))
I hope this helps.
I'm trying to use the configuration library typesafehub/config.
I'm using this code:
val conf = ConfigFactory.load()
val url = conf.getString("add.prefix") + id + "/?" + conf.getString("add.token")
The location of the property file is src/main/resources/application.conf
But for some reason I'm receiving:
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'add'
File content
add {
token = "access_token=6235uhC9kG05ulDtG8DJDA"
prefix = "https://graph.facebook.com/v2.2/"
limit = "&limit=250"
comments="?pretty=0&limit=250&access_token=69kG05ulDtG8DJDA&filter=stream"
feed="/feed?limit=200&access_token=623501EuhC9kG05ulDtG8DJDA&pretty=0"
}
Everything looks configured correctly, doesn't it? Did I miss something?
thanks,
miki
The error message is telling you that whatever configuration got read, it didn't include a top level setting named add. The ConfigFactory.load function will attempt to load the configuration from a variety of places. By default it will look for a file named application with a suffix of .conf or .json. It looks for that file as a Java resource on your class path. However, various system properties will override this default behavior.
So, it is likely that what you missed is one of these:
Is it possible that src/main/resources is not on your class path?
Are the config.file, config.resource or config.url properties set?
Is your application.conf file empty?
Do you have an application.conf that would be found earlier in your class path?
Is the key: add defined in the application.conf?
Are you using an IDE or sbt?
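To check several of these points at once, you can print what actually got loaded and where it came from. A sketch using the Typesafe Config API:

```scala
import com.typesafe.config.{ConfigFactory, ConfigRenderOptions}

val conf = ConfigFactory.load()
// Shows the origin (resource/file) the top-level config was merged from.
println(conf.origin().description())
// Renders the fully resolved tree, so you can see whether "add" is present.
println(conf.root().render(ConfigRenderOptions.concise()))
```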
I had a similar problem while using Eclipse. It simply did not find the application.conf file at first and later on failed to notice edits.
However, once I ran my program via sbt, all worked just fine, including Eclipse. So, I added 'main/resources' to the libraries (Project -> Properties -> Java Build Path -> Libraries", "add class folder"). That might help you as well.
Place your application.conf in the src folder and it should work
I ran into this issue inside a Specs2 test that was driven by SBT. It turned out that the issue was caused by https://github.com/etorreborre/specs2/issues/556. In that case, the Thread's contextClassLoader wasn't using the correct classloader. If you run into a similar error, there are other versions of ConfigFactory.load() that allow you to pass the current class's ClassLoader instead. If you're using Specs2 and you're seeing this issue, use a version <= 3.8.6 or >= 4.0.1.
Check your path. In my case I got the same issue, having application.conf placed in src/main/resources/configuration/common/application.conf
Incorrect:
val conf = ConfigFactory.load(s"/configuration/common/application.conf")
Correct
val conf = ConfigFactory.load(s"configuration/common/application.conf")
It turned out to be a silly mistake I made.
Apart from that, it does not matter whether you use ":" or "=" in the .conf file.
Getting the value from the following example:
server{
proc {
max = "600"
}
}
conf.getString("server.proc.max")
You can even have the following conf (duplicate objects are merged):
proc {
max = "600"
}
proc {
main = "60000"
}
conf.getString("proc.max") // prints 600
conf.getString("proc.main") // prints 60000
I ran into this doing a getString on an integer in my configuration file.
I ran into exactly the same problem and the solution was to replace = with : in the application.conf. Try with the following content in your application.conf:
add {
token: "access_token=6235uhC9kG05ulDtG8DJDA"
prefix: "https://graph.facebook.com/v2.2/"
limit: "&limit=250"
comments: "?pretty=0&limit=250&access_token=69kG05ulDtG8DJDA&filter=stream"
feed: "/feed?limit=200&access_token=623501EuhC9kG05ulDtG8DJDA&pretty=0"
}
Strangely, IntelliJ doesn't detect any formatting or syntax error when using = for me.
In my case it was a silly mistake:
I changed the file name from "application.config" to "application.conf" and it works.
If the application.conf is not getting discovered, you could add this to build.sbt:
unmanagedSourceDirectories in Compile += baseDirectory.value / "main/resources"
Please don't use this to include any custom path. Follow the guidelines and best practices.
As mentioned by others, make sure the application.conf is placed in src/main/resources.
Once I placed the file there, the error went away.
Looking at these examples helped me as well:
https://github.com/lightbend/config/tree/main/examples/scala
Use ConfigFactory.parseFile for other locations
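As a sketch of that approach (the file path is hypothetical):

```scala
import java.io.File
import com.typesafe.config.ConfigFactory

// Parse a config file from an arbitrary location on disk and fall back
// to whatever load() finds on the class path.
val external = ConfigFactory.parseFile(new File("/etc/myapp/app.conf"))
val config = external.withFallback(ConfigFactory.load()).resolve()
```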