How to read a system property into a conf file in Play Framework - Scala

I can read an environment variable like this:
my.key = ${?MY_KEY_ENV}
But how do I read a system property that is passed in via
-Dmysystem.var=XXX
It is not being resolved in my conf file.

Assuming your project is managed via sbt, make sure you have the following set in the build file:
javaOptions in Global += "-Dmysystem.var=XXX"
and that your application.conf file has the following:
my_key=${mysystem.var}
Now you should be able to refer to my_key using the code below:
configuration.getString("my_key")
I tested this in my Play app and it works as expected.
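For completeness, here is a minimal sketch of reading that key from an injected Configuration in a Play 2.4/2.5-style controller (the controller and action names are made up for illustration):

import javax.inject.Inject
import play.api.Configuration
import play.api.mvc._

// Hypothetical controller showing how the resolved value can be read
class InfoController @Inject()(configuration: Configuration) extends Controller {
  def show = Action {
    // getString returns Option[String]; fall back if the property was not supplied
    Ok("my_key = " + configuration.getString("my_key").getOrElse("not set"))
  }
}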

Related

build.sbt and application.conf are not reading environment variables set by .env

I'm attempting to build a container for a Play Framework application. For this I'm using the sbt-native-packager plugin and the command 'sbt clean docker:publishLocal'.
To simplify future pipelines and the dev environments, I enabled the sbt-dotenv plugin to use a .env file where I can define my variables. The variables are used in build.sbt and application.conf.
build.sbt
name := sys.env("APP_NAME")
organization := sys.env("CLIENT_NAME")
version := sys.env("APP_VSN")
application.conf uses them as follows, to allow for defaults
default.username = username
default.username = ${?DB_USERNAME}
default.password = password
default.password = ${?DB_PASSWD}
When running sbt clean docker:publishLocal, I can see the plugin loading the variables, but the configuration files can never find them and always fall back to the defaults. Since my understanding of Docker is a bit limited, I thought that maybe I needed to pass the envVars to Docker as follows:
dockerEnvVars := Map("APP_NAME" -> sys.env.get("APP_NAME").....
But this is not working either: the variables are present in the final container, but I need them during the build process so the configuration files end up in the distributable with the right values.
I don't really know what else to try. I have tried different methods of getting the environment variables, different versions, and updating my plugins, and I can't seem to find anything like this on the internet.
sbt-dotenv sets environment variables at build time, while the Lightbend config library resolves them at runtime. If you want the variables in application.conf to be substituted at compile time, you can:
write an sbt task that loads application.conf, resolves the variables, and writes a new config file containing the actual values
customize the mappings task so that the original application.conf is not included in the final image and your new, generated config file is used instead (see the sketch below)
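A rough sketch of those two steps, assuming the default Play layout (conf/application.conf), sbt-native-packager's Universal mappings, and that the com.typesafe config artifact is available to the build definition (file names and the config version are illustrative, not a drop-in solution):

// project/plugins.sbt may need: libraryDependencies += "com.typesafe" % "config" % "1.4.2"
// build.sbt
import com.typesafe.config.{ConfigFactory, ConfigRenderOptions}

lazy val resolveConf = taskKey[File]("Resolve application.conf substitutions at build time")

resolveConf := {
  val src = baseDirectory.value / "conf" / "application.conf"
  // resolve() substitutes ${?VARS} from the environment of the sbt process,
  // which is where sbt-dotenv has loaded the .env values
  val resolved = ConfigFactory.parseFile(src).resolve()
  val out = target.value / "application.resolved.conf"
  IO.write(out, resolved.root().render(ConfigRenderOptions.concise().setFormatted(true).setJson(false)))
  out
}

// swap the original application.conf for the resolved one in the packaged image
mappings in Universal := (mappings in Universal).value.filterNot {
  case (_, path) => path.endsWith("conf/application.conf")
} :+ (resolveConf.value -> "conf/application.conf")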
But frankly, this whole thing just smells like a bad idea. Why don't you just use the --env flag for docker run and set the environment variables there?
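For example (image name and values are purely illustrative):
docker run --env DB_USERNAME=produser --env DB_PASSWD=changeme my-play-app:latest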

Google Spreadsheet Spark library

I am using the https://github.com/potix2/spark-google-spreadsheets library for reading a spreadsheet file in Spark. It works perfectly when I run it locally.
val df = sqlContext.read.
  format("com.github.potix2.spark.google.spreadsheets").
  option("serviceAccountId", "xxxxxx@developer.gserviceaccount.com").
  option("credentialPath", "/path/to/credential.p12").
  load("<spreadsheetId>/worksheet1")
I created a new assembly jar that includes all the credentials and used that jar for reading the file. But I am facing an issue with reading the credentialPath file. I tried using
getClass.getResourceAsStream("/resources/Aircraft/allAircraft.txt")
But the library only supports an absolute path. Please help me resolve this issue.
You can use the --files argument of spark-submit or SparkContext.addFile() to distribute a credential file. If you want the local path of the credential file on a worker node, call SparkFiles.get() with the credential file name.
import org.apache.spark.SparkFiles

// you can also use `spark-submit --files=credential.p12`
sqlContext.sparkContext.addFile("credential.p12")
val credentialPath = SparkFiles.get("credential.p12")

val df = sqlContext.read.
  format("com.github.potix2.spark.google.spreadsheets").
  option("serviceAccountId", "xxxxxx@developer.gserviceaccount.com").
  option("credentialPath", credentialPath).
  load("<spreadsheetId>/worksheet1")
Use sbt and try the Typesafe Config library.
A simple but complete approach is to read the needed information from a config file placed in the resources folder.
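For illustration, a minimal sketch of that approach (the key name and value are made up; put them in src/main/resources/application.conf):

import com.typesafe.config.ConfigFactory

object AppConfig {
  // loads application.conf from the classpath (e.g. src/main/resources/application.conf)
  private val config = ConfigFactory.load()

  // hypothetical key: google.serviceAccountId = "xxxxxx@developer.gserviceaccount.com"
  val serviceAccountId: String = config.getString("google.serviceAccountId")
}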
Then you can assemble a jar file using the sbt-assembly plugin.
If you're working in the Databricks environment, you can upload the credentials file.
Setting the GOOGLE_APPLICATION_CREDENTIALS environment variable does not get you around this requirement, because it only points to the file path rather than containing the actual credentials. See the library's documentation for more details about getting the right credentials and using the library.

sbt run - how to specify working directory? [duplicate]

I would like to be able to run a Java program in a specific directory. I think it is quite convenient to parametrize the working directory, because it makes it easy to manage configurations.
For example, one folder could hold the configuration for tests, while another holds the resources needed for production. You might think there is the option of manipulating the classpath to include/exclude resources, but such a solution only works if you are interested in resources stored on the classpath and referenced via ClassLoader.getResource(r). But what if you have some external configuration and you want to access it with a simple instruction like File file = new File("app.properties");?
Let's look at an ordinary example.
Your application uses an app.properties file, where you store credentials for an external service. The application looks for this file in the working directory, because you use the aforementioned File file = new File("app.properties"); instruction to access it. In your tests you want to use an app.properties specific to your tests. In your integration tests you want to use an app.properties specific to another environment. And finally, when you build and release the application, you want to provide yet another app.properties file. You want to access all these resources in the same way, just by typing File file = new File("app.properties");, instead of (pseudocode):
if (configTest)
    file = new File("testWorkDir/app.properties");
else if (config2)
    file = new File("config2WorkDir/app.properties");
else
    file = new File("app.properties");
or instead of using resources on the classpath:
this.getClass().getClassLoader().getResource("app.properties");
Of course you are a clever programmer and you use a build tool such as Maven, Gradle, or sbt :)
Enough preamble. At last, the question:
Is there a way to set the working directory in Java, and if so, how do you configure it in build tools (especially in sbt)?
Additional info:
Changing the 'user.dir' system property does not work (I've tried changing it programmatically).
In sbt, changing the 'working directory' via the baseDirectory setting for tests changes baseDirectory, which is not the working directory as I understand it, and it is not equal to new java.io.File(".").getAbsolutePath.
Providing an environment variable like YOUR_APP_HOME and referencing resources from this path is feasible, but it requires remembering to do this in your code.
In sbt, changing the 'working directory' via the baseDirectory setting for tests changes baseDirectory, which is not the working directory as I understand it, and it is not equal to new java.io.File(".").getAbsolutePath.
I'm not sure what the above statement means, but with sbt you need to fork to change your working directory during the run or test. This is documented in Enable forking and Change working directory.
If you fork, you can control everything, including the working directory.
http://www.scala-sbt.org/0.13.5/docs/Detailed-Topics/Forking.html
Example code:
fork in run := true
baseDirectory in run := file("/path/to/working/directory/")
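If what you need is a test-specific working directory, the analogous settings should look roughly like this (an unverified sketch; check the Forking docs above for the exact scoping):
fork in Test := true
baseDirectory in Test := file("/path/to/test/working/directory/")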

How do I specify a config file with Play 2.4 and Activator

I am building a Scala Play 2.4 application which uses Typesafe Activator.
I would like to run my tests twice, with a different configuration file for each run.
How can I specify alternative config files, or override the config settings?
I currently run tests with the command "./activator test"
You can create different configuration files for different environments/purposes. For example, I have three configuration files for local testing, alpha deployment, and production deployment as in this project https://github.com/luongbalinh/play-mongo
You can specify the configuration for running as follows:
activator run -Dconfig.resource=application.conf
where application.conf is the configuration you want to use.
You can create different configuration files for different environments. To specify which configuration to use with activator run, use the following command:
activator "run -Dconfig.resource=application.conf"
where application.conf is the desired configuration. Without the quotes it did not work for me. This uses the same configuration parameters as when going into production mode, as described here:
https://www.playframework.com/documentation/2.5.x/ProductionConfiguration#Specifying-an-alternate-configuration-file
It is also important to know that config.resource tries to locate the configuration within the conf/ folder, so there is no need to include that prefix. For full paths outside the resources, use config.file. Further reading is in the link above.
The quotes are needed because you do not want to send the -D to activator itself, but to the run command. With the quotes, the activator JVM gets no -D argument; instead it interprets "run -Dconfig.resource=application.conf" and sets the config.resource property accordingly, also in the activator JVM.
This was already discussed here: Activator : Play Framework 2.3.x : run vs. start
Since all of the above are partially incorrect, here is my hard-won knowledge from last weekend.
Use include "application.conf" not include "application" (which Akka does)
Configs must be named .conf or Play will discard them silently
You probably want -Dconfig.file=<file>.conf so you're not classpath dependent
Make sure you provide the full file path (e.g. /opt/configs/prod.conf)
Example
Here is an example of this that we run:
#prod.conf
include "application"
akka.remote.hostname = "prod.blah.com"
# Example of passing in S3 keys
s3.awsAccessKeyId="YOUR_KEY"
s3.awsSecretAccessKey="YOUR_SECRET_KEY"
And just pass it in like so:
activator -Dconfig.file=/var/lib/jenkins/jenkins.conf test
or if you fancy sbt:
sbt -Dconfig.file=/var/lib/jenkins/jenkins.conf test
Dev Environment
Also note that it's easy to make a developer.conf file as well, to keep all your passwords and local ports, and then add it to .gitignore so devs don't accidentally check it in.
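For instance, a git-ignored developer.conf might look like this (keys and values are purely illustrative):
# developer.conf (local-only overrides, listed in .gitignore)
include "application"
db.default.username = "dev"
db.default.password = "changeme"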
The command below works with Play 2.5:
$ activator -Dconfig.resource=jenkins.conf run
https://www.playframework.com/documentation/2.5.x/ProductionConfiguration

Environment specific config in Play framework application

I've had a look around but it's not very clear to me how I can configure a set of environment-specific variables for my Play Framework application.
As an example, I would like to use an in-memory database like H2 for local development, but when I move to production or my pre-production environment I would like to connect to a Postgres database.
How do I configure my app so that it will use the variables relevant to the environment it is being deployed to? This is a Scala Play app.
One option (as documented in the excellent Play docs) is to specify conf files at app startup.
Using -Dconfig.resource will search for an alternative configuration file on the application classpath (you usually put these alternative configuration files in your application's conf/ directory before packaging). Play will look in conf/, so you don't have to prefix the name with conf/.
$ /path/to/bin/<project-name> -Dconfig.resource=prod.conf
Using -Dconfig.file you can specify an environment specific configuration file not packaged into the application artifacts:
$ start -Dconfig.file=/opt/conf/prod.conf
Using -Dconfig.url you can also specify a configuration file to be loaded from any URL:
$ start -Dconfig.url=http://conf.mycompany.com/conf/prod.conf
Note that you can always reference the original configuration file in a new prod.conf file using the include directive, such as:
include "application.conf"
key.to.override=blah
You can have Puppet or a similar tool generate the needed parameters in an environment.conf file and place it in a dedicated directory.
Then in application.conf, at the end of the file, have this:
include "file:///[your directory...]/environment.conf"
to override any testing or local values (e.g. DB parameters) listed above.
You can use different configuration files by overriding the onLoadConfig method of the Global object, like this:
import java.io.File
import com.typesafe.config.ConfigFactory
import play.api.{Configuration, GlobalSettings, Mode}

object Global extends GlobalSettings {
  override def onLoadConfig(config: Configuration, path: File, classloader: ClassLoader, mode: Mode.Mode): Configuration = {
    val fileName = s"application.${mode.toString.toLowerCase}.conf"
    config ++ Configuration(ConfigFactory.load(fileName))
  }
}
This way you have 'application.test.conf' for test mode and 'application.dev.conf' for dev mode, while you can use another config file in production via the '-Dconfig.file' parameter.