SBT - How can I add/modify values in an application.conf file based on an external source - scala

I read that SBT has functionality to generate source code and resource files.
In my case I want to add/modify a field in an application.conf file during compilation/packaging of the project (leaving the other fields in place).
For instance my application.conf file has something like:
A {
  B = "Some Value"
  C = "Some value to be modified"
}
I would like SBT to read an external file and change or add the value of A.B or A.C.
Is it possible to do something along the lines of:
build.sbt
lazy val myProject = project.in(file("myproject"))
  // pseudo code - how do I do this?
  .settings(sourceGenerators in Compile += "Read file /path/to/external/file and add or replace the value of application.conf A.B = some external value")

You can replace the values with environment variable values provided while compiling / building your project. For that you'd have to add optional substitutions to your application.conf:
A {
  B = "Some Value"
  B = ${?B_ENV}
  C = "Some value to be modified"
  C = ${?C_ENV}
}
where B_ENV and C_ENV are environment variables you set in your shell, either before the build or inline in the build command:
$ B_ENV=1 C_ENV=2 sbt run
Source: https://www.playframework.com/documentation/2.6.x/ProductionConfiguration#using-environment-variables
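For completeness, here is a minimal sketch (my illustration, not part of the original answer) of how the resolved values look from application code, assuming the A.B / A.C keys above and the Typesafe Config library:

import com.typesafe.config.ConfigFactory

object ShowConfig extends App {
  // ConfigFactory.load() picks up application.conf from the classpath and
  // resolves ${?B_ENV} / ${?C_ENV} against the process environment.
  val conf = ConfigFactory.load()
  println(conf.getString("A.B")) // "1" when launched as: B_ENV=1 sbt run
  println(conf.getString("A.C")) // the hardcoded value if C_ENV is not set
}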

In this case you can do without sbt, and this approach would also work with Maven or Gradle.
The *.conf support originates from Typesafe Config (https://github.com/lightbend/config).
It has a feature for pulling environment variables into the configuration, which should be a good fit for this problem.
There are two approaches I would suggest:
1.) Fail on missing configuration
If configuring this value is important and you want to prevent deployment of a misconfigured application, startup should fail on missing environment variables.
in application.conf
key=${TEST} // expects "TEST" to be set, fails otherwise
2.) Hardcoded value with override
If there is a sensible default behaviour that should only be changed in some circumstances.
in application.conf
key="test" // hardcoded key
key=${?TEST} // override "key" with the env "$TEST" value, when it is given
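As a rough sketch (my own, not from the answer), the difference between the two approaches shows up at resolution time; parseString is used here only to keep the example self-contained:

import com.typesafe.config.{ConfigException, ConfigFactory}

object ResolutionDemo extends App {
  // Approach 1: no fallback, so resolve() throws ConfigException.UnresolvedSubstitution
  // when the TEST environment variable is not set.
  val failFast = ConfigFactory.parseString("key = ${TEST}")
  try failFast.resolve()
  catch { case e: ConfigException.UnresolvedSubstitution => println("startup would fail: " + e.getMessage) }

  // Approach 2: a hardcoded default that the optional ${?TEST} override replaces when present.
  val withDefault = ConfigFactory.parseString("key = \"test\"\nkey = ${?TEST}")
  println(withDefault.resolve().getString("key")) // "test" unless TEST is exported
}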

Related

Typesafe Config SBT Multi Module Error When Resolving Placeholder

I have an SBT multi-module project where some of the sub-modules use a placeholder to resolve certain configurations. The project structure looks like this:
core
  src
    main
      resources
        application.conf
mod1
  src
    main
      resources
        application.conf
mod2
  src
    main
      resources
        application.conf
In mod2, my application.conf has the following:
# The environment representation of the configurations
# ~~~~~
app {
  environment = "test"
  name = ${NAME}-split # TODO: Get the default name from application.conf which should also be located here
}
As can be seen, NAME is a placeholder that I would like to inherit from the default application.conf that I include, or to pass in via a command line argument. As expected, I get an error like this:
[error] at java.base/java.lang.Thread.run(Thread.java:829)
[error] Caused by: com.typesafe.config.ConfigException$UnresolvedSubstitution: application.test.conf # file:/home/runner/work/housing-price-prediction-data-preparation/housing-price-prediction-data-preparation/split/target/scala-2.12/test-classes/application.test.conf: 7: Could not resolve substitution to a value: ${NAME}
[error] at com.typesafe.config.impl.ConfigReference.resolveSubstitutions(ConfigReference.java:108)
Normally you would put a reference.conf in each module and an application.conf at the top application level.
Also note that the module-level configs are resolved first (in order: first the lower modules, then the ones that depend on them), then application.conf, and finally overrides from the command line.
The order is important: you can't depend on things not yet set, and if a value is set to an interim value and then overridden, the dependent config will be based on whatever was effective when it was resolved.
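A small sketch of the layering this answer describes (the keys and values are made up for illustration): each module's reference.conf supplies defaults, and the application-level config or command-line overrides win when the stack is resolved.

import com.typesafe.config.ConfigFactory

object LayeringDemo extends App {
  // stands in for mod2/src/main/resources/reference.conf
  val moduleDefaults = ConfigFactory.parseString("""app { name = "default-name" }""")
  // stands in for the top-level application.conf (or a -DNAME=... override)
  val appOverrides = ConfigFactory.parseString("""app { name = "split" }""")

  val resolved = appOverrides.withFallback(moduleDefaults).resolve()
  println(resolved.getString("app.name")) // "split": the higher layer wins over the module default
}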

Merging configurations for spark using typesafe library and extraJavaOptions

I'm trying to merge two config files (or create a config based on a single reference file) using:
lazy val finalConfig =
  Option(System.getProperty("user.resource"))
    .map(ConfigFactory.load)
    .map(_.withFallback(ConfigFactory.load(System.getProperty("config.resource"))).resolve())
    .getOrElse(ConfigFactory.load(System.getProperty("config.resource")))
I'm defining my Java variables in Spark using spark-submit ....... --conf spark.driver.extraJavaOptions=-Dconfig.resource=./reference.conf,-Duser.resource=./user.conf ...
My goal is to be able to point to a file that is not inside my jar and use it via System.getProperty("..") in my code. I changed the folder for testing (cd ..) and keep getting the same error, so I guess Spark doesn't care about my Java arguments?
Is there a way to point to a file (or even two files in my case) so that they can be merged?
I also tried to include the reference.conf file but not the user.conf file: it recognizes the reference.conf but not the user.conf that I gave with --conf spark.driver.extraJavaOptions=-Duser.resource=./user.conf.
Is there a way to do that? Thanks if you can help.
I don't see you using ConfigFactory.parseFile to load a file containing properties.
Typesafe Config automatically reads any properties files on the class path and all -D parameters passed to the JVM, and then merges them.
I read an external property file which is not part of the jar as follows. The file "application.conf" is placed in the same directory where the jar is kept.
import java.io.File
import scala.util.Try
import com.typesafe.config.ConfigFactory

val applicationRootPath = System.getProperty("user.dir")
val config = Try {
  ConfigFactory.parseFile(new File(applicationRootPath + "/" + "application.conf"))
}.getOrElse(ConfigFactory.empty())
val appConfig = config.withFallback(ConfigFactory.load()).resolve()
ConfigFactory.load() already contains all the properties present in the properties files on the class path and the -D parameters. I am giving priority to my external "application.conf" and falling back on default values. For matching keys, the external "application.conf" takes precedence over the other sources.

Play framework overriding `application.conf` values based on environment

Play 2.6.x Scala
I have a default application.conf within the folder {project}/conf/ but I'd like to override some values depending on the environment by passing in the respective file as command-line arguments (as detailed in the docs):
sbt run -Dconfig.file=/conf/qa.conf or sbt run -Dconfig.resource=qa.conf
But I'm not able to get Play to pick up the overrides. Here's my file directory:
application
|- playApp1
|- playApp2
   |-- conf
       |-- application.conf
       |-- qa.conf
My build.sbt makes playApp2 the default project on load. And I have confirmed that the default application.conf is working -- just the override is not.
Thanks for any ideas!
--
Update
Here are the HOCON files Play uses. application.conf:
platform {
  scheme = "http"
  host = "localhost:8080"
}
and the overrides as provided in qa.conf
include "application.conf"
platform {
  scheme = "https"
  host = "ea311.34.com"
}
Your question is about HOCON, in case you did not realize it.
Without seeing your application.conf I can only provide a generic answer. Here is an example of providing a default value for akka.log-config-on-start, which will be overridden by a Java system property or an environment variable called CONFIG_DUMP, if defined:
akka {
  log-config-on-start = false
  log-config-on-start = ${?CONFIG_DUMP}
}
This feature of HOCON is documented here.
This works if you provide the command-line argument before the task:
sbt -Dconfig.resource=qa.conf run

No configuration setting found for key typesafe config

I'm trying to use the configuration library typesafehub/config.
I'm using this code:
val conf = ConfigFactory.load()
val url = conf.getString("add.prefix") + id + "/?" + conf.getString("add.token")
The location of the property file is /src/main/resources/application.conf.
But for some reason I'm receiving:
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'add'
File content
add {
  token = "access_token=6235uhC9kG05ulDtG8DJDA"
  prefix = "https://graph.facebook.com/v2.2/"
  limit = "&limit=250"
  comments = "?pretty=0&limit=250&access_token=69kG05ulDtG8DJDA&filter=stream"
  feed = "/feed?limit=200&access_token=623501EuhC9kG05ulDtG8DJDA&pretty=0"
}
Everything looks configured correctly?? Did I miss something?
thanks,
miki
The error message is telling you that whatever configuration got read, it didn't include a top level setting named add. The ConfigFactory.load function will attempt to load the configuration from a variety of places. By default it will look for a file named application with a suffix of .conf or .json. It looks for that file as a Java resource on your class path. However, various system properties will override this default behavior.
So, it is likely that what you missed is one of these:
Is it possible that src/main/resources is not on your class path?
Are the config.file, config.resource or config.url properties set?
Is your application.conf file empty?
Do you have an application.conf that would be found earlier in your class path?
Is the key: add defined in the application.conf?
Are you using an IDE or sbt?
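One quick way to narrow those questions down (a hedged sketch of my own, not from the answer) is to render what load() actually resolved and check whether the add block is present at all:

import com.typesafe.config.{ConfigFactory, ConfigRenderOptions}

object ConfigDebug extends App {
  val conf = ConfigFactory.load()
  // Origin comments show which file/resource each value came from,
  // which usually reveals whether your application.conf was picked up at all.
  println(conf.root().render(ConfigRenderOptions.concise().setOriginComments(true)))
  println(conf.hasPath("add")) // false means the 'add' block was never loaded
}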
I had a similar problem while using Eclipse. It simply did not find the application.conf file at first and later on failed to notice edits.
However, once I ran my program via sbt, all worked just fine, including Eclipse. So, I added 'main/resources' to the libraries (Project -> Properties -> Java Build Path -> Libraries, "add class folder"). That might help you as well.
Place your application.conf in the src folder and it should work
I ran into this issue inside a Specs2 test that was driven by SBT. It turned out that the issue was caused by https://github.com/etorreborre/specs2/issues/556. In that case, the Thread's contextClassLoader wasn't using the correct classloader. If you run into a similar error, there are other versions of ConfigFactory.load() that allow you to pass the current class's ClassLoader instead. If you're using Specs2 and you're seeing this issue, use a version <= 3.8.6 or >= 4.0.1.
Check your path. In my case I got the same issue, with application.conf placed in src/main/resources/configuration/common/application.conf.
Incorrect:
val conf = ConfigFactory.load(s"/configuration/common/application.conf")
Correct:
val conf = ConfigFactory.load(s"configuration/common/application.conf")
It turned out to be a silly mistake I made.
Following on from that, it does not matter whether you use ":" or "=" in the .conf file.
Getting the value, for example:
server {
  proc {
    max = "600"
  }
}
conf.getString("server.proc.max")
You can even have the following conf:
proc {
  max = "600"
}
proc {
  min = "60000"
}
conf.getString("proc.max") // prints 600
conf.getString("proc.min") // prints 60000
I ran into this doing a getString on an integer in my configuration file.
I ran into exactly the same problem and the solution was to replace = with : in the application.conf. Try with the following content in your application.conf:
add {
  token: "access_token=6235uhC9kG05ulDtG8DJDA"
  prefix: "https://graph.facebook.com/v2.2/"
  limit: "&limit=250"
  comments: "?pretty=0&limit=250&access_token=69kG05ulDtG8DJDA&filter=stream"
  feed: "/feed?limit=200&access_token=623501EuhC9kG05ulDtG8DJDA&pretty=0"
}
Strangely, IntelliJ doesn't detect any formatting or syntax error when using = for me.
In my case it was a silly mistake:
I had to change the file name from "application.config" to "application.conf", and then it works.
If the application.conf is not getting discovered, you could add this to build.sbt:
unmanagedSourceDirectories in Compile += baseDirectory.value / "main/resources"
Please don't use this to include arbitrary custom paths; follow the guidelines and best practices.
As mentioned by others, make sure the application.conf is placed in src/main/resources.
Once I placed the file there, the error went away.
Looking at these examples helped me as well:
https://github.com/lightbend/config/tree/main/examples/scala
Use ConfigFactory.parseFile for other locations
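For that last suggestion, a minimal sketch (the path below is just an example) of loading a config file from outside the classpath and falling back to the bundled defaults:

import java.io.File
import com.typesafe.config.ConfigFactory

// parseFile reads an arbitrary location; withFallback keeps the classpath defaults.
val external = ConfigFactory.parseFile(new File("/etc/myapp/application.conf"))
val conf = external.withFallback(ConfigFactory.load()).resolve()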

How to change a task's behavior based on how a user calls it, without requiring them to change code?

I'm using sbt-s3 to upload my uberjar artifacts to S3 for deployment. What I need now is for a developer to be able to choose whether they're pushing to staging or production.
My sbt-s3 setup looks something like this:
mappings in upload <<= (name, version, scalaBinaryVersion) map { (name, version, scalaBinaryVersion) =>
  Seq(new java.io.File("target/scala-%s/%s.jar" format (scalaBinaryVersion, name)) -> ("%s.jar" format name))
}
Say I wanted to prefix %s.jar with a string like "staging" or "production" depending on how a user called SBT (it seems silly to have them edit Build.scala every time they want to push). For example: sbt s3Upload goes to staging and sbt production:s3Upload or sbt production s3Upload would go to production.
I'm having trouble understanding scopes and how I can use them to solve this problem. I can't just make a setting and hardcode S3Prefix in Production because then they can't push to staging. I didn't have any luck making a production task that overrides a default of "staging" to be "production" for following tasks, either.
Thoughts?
One possible solution might be to use input tasks. This is a simple input task definition from the sbt documentation:
import complete.DefaultParsers._

val demo = inputKey[Unit]("A demo input task.")

demo := {
  // get the result of parsing
  val args: Seq[String] = spaceDelimited("<arg>").parsed
  // Here, we also use the value of the `scalaVersion` setting
  println("The current Scala version is " + scalaVersion.value)
  println("The arguments to demo were:")
  args foreach println
}
Depending on the argument, you can then push to staging or production.
I think there could be a more elegant solution with configurations, but I haven't done that before.
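Building on that, here is a rough sketch (my own, with a made-up deploy key; it only logs instead of calling sbt-s3's s3Upload) of how the parsed argument could choose the S3 key prefix in build.sbt:

import complete.DefaultParsers._

val deploy = inputKey[Unit]("Copy the uberjar under a staging/ or production/ S3 prefix.")

deploy := {
  // default to staging so a plain `sbt deploy` never hits production by accident
  val target = spaceDelimited("<staging|production>").parsed.headOption.getOrElse("staging")
  val jar    = (packageBin in Compile).value // stand-in for the real uberjar task
  val key    = s"$target/${jar.getName}"     // e.g. staging/myapp_2.12-1.0.jar
  streams.value.log.info(s"would upload ${jar.getAbsolutePath} to S3 key $key")
  // a real build would feed `key` into the sbt-s3 mappings instead of logging it
}

Invoked as sbt deploy for staging, or sbt "deploy production" for production.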