sbt run - how to specify working directory? [duplicate] - scala

I would like to be able to run a Java program in a specific working directory. Being able to parametrize the working directory is quite convenient, because it makes it easy to manage configurations.
For example, in one folder you could keep the configuration for tests, and in another the resources needed for production. You may think there is the option of manipulating the classpath to include/exclude resources, but that only works if you are interested in resources stored on the classpath and referenced via Classloader.getResource(r). But what if you have some external configuration and you want to access it with a simple instruction like File file = new File("app.properties");?
Let's look at an ordinary example.
Your application uses an app.properties file, where you store credentials for an external service. The application looks for this file in the working directory, because you use the aforementioned File file = new File("app.properties"); instruction to access it. In your tests you want to use an app.properties specific to your tests. In your integration tests you want an app.properties specific to another environment. And finally, when you build and release the application, you want to provide yet another app.properties file. You want to access all of these resources in the same way, just by typing File file = new File("app.properties"); instead of (pseudo code):
if (configTest)
    file = File("testWorkDir/app.properties");
else if (config2)
    file = File("config2WorkDir/app.properties");
else
    file = File("app.properties");
or instead of using resources on the classpath:
this.getClass.getClassLoader.getResource("app.properties");
Of course you are a clever programmer and you use a build tool such as Maven, Gradle, or sbt :)
Enough of the introduction. At last, the question:
Is there a way to set the working directory in Java, and if so, how can it be configured in build tools (especially sbt)?
Additional info:
Changing the 'user.dir' system property does not work (I've tried to change it programmatically).
In sbt, changing the 'working directory' via the baseDirectory setting for test changes baseDirectory, which is not the base dir in my understanding and is not equal to new java.io.File(".").getAbsolutePath.
Providing an environment variable like YOUR_APP_HOME and referencing resources relative to that path is feasible, but it requires remembering about it everywhere in your code.
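A rough sketch of that approach (the "." fallback is an assumption, not part of the question):
// every file access has to remember to resolve paths against the env var
val appHome = sys.env.getOrElse("YOUR_APP_HOME", ".")
val file = new java.io.File(appHome, "app.properties")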

In sbt, changing the 'working directory' via the baseDirectory setting for test changes baseDirectory, which is not the base dir in my understanding and is not equal to new java.io.File(".").getAbsolutePath.
I'm not sure what the above statement means, but with sbt you need to fork to change your working directory during the run or test. This is documented in Enable forking and Change working directory.

If you fork, you can control everything, including the working directory.
http://www.scala-sbt.org/0.13.5/docs/Detailed-Topics/Forking.html
Example code:
fork in run := true
baseDirectory in run := file("/path/to/working/directory/")
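The same settings can be scoped to the Test configuration for forked tests, per the forking docs linked above; a minimal sketch (path hypothetical):
fork in Test := true
baseDirectory in Test := file("/path/to/test/working/directory/")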

Related

build.sbt and application.conf are not reading environment variables set by .env

I'm attempting to build a container of a Play Framework application. For this I'm using the sbt-native-packager plugin and the command 'sbt clean docker:publishLocal'.
To simplify future pipelines and the dev environments, I enabled the sbt-dotenv plugin to use a .env file where I can define my variables. The variables are used in build.sbt and application.conf.
build.sbt
// values supplied by sbt-dotenv from the .env file
name := sys.env("APP_NAME")
organization := sys.env("CLIENT_NAME")
version := sys.env("APP_VSN")
application.conf uses them as follows, to allow for defaults
default.username = username
default.username = ${?DB_USERNAME}
default.password = password
default.password = ${?DB_PASSWD}
When running sbt clean docker:publishLocal, I can see the plugin loading the variables, but the configuration files never find them and always fall back to the defaults. Since my understanding of Docker is a bit limited, I thought that maybe I needed to pass the env vars to Docker as follows:
dockerEnvVars := Map("APP_NAME" -> sys.env.get("APP_NAME").....
But this is not working either: the variables are present in the final container, but I need them during the build process so the configuration files can be baked into the distributable with the right values.
I don't really know what else to try. I have tried different methods of getting the env vars, tried different versions, updated my plugins, and can't seem to find anything like this on the internet.
sbt-dotenv sets environment variables at build time, while the Lightbend config library resolves them at runtime. If you want the variables in application.conf to be substituted at compile time, you can:
write an sbt task that loads application.conf, resolves the variables, and creates a new config file containing the actual values (see the sketch below)
customize the mappings task so that the original application.conf is not included in the final image and your new, generated config file is used instead
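A rough sketch of both steps in build.sbt (task name, output file name and paths are hypothetical, and it assumes com.typesafe:config is available on the build classpath; add it under project/ if it is not):
import com.typesafe.config.{ConfigFactory, ConfigRenderOptions}

// hypothetical task: resolve the ${?VAR} substitutions in conf/application.conf at build time
val resolveConf = taskKey[File]("Write application.conf with all substitutions resolved")

resolveConf := {
  val source   = baseDirectory.value / "conf" / "application.conf"
  val resolved = ConfigFactory.parseFile(source).resolve()   // falls back to environment variables
  val out      = target.value / "application.resolved.conf"
  IO.write(out, resolved.root.render(ConfigRenderOptions.defaults()))
  out
}

// ship the resolved file instead of the original application.conf
Universal / mappings := {
  val resolvedFile = resolveConf.value
  (Universal / mappings).value.filterNot { case (_, path) => path == "conf/application.conf" } :+
    (resolvedFile -> "conf/application.conf")
}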
But frankly, this whole thing just smells like a bad idea. Why don't you just use the --env flag for docker run and set the environment variables there?
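For example (image name hypothetical), the variables from the application.conf above can simply be supplied when the container is started:
docker run --env DB_USERNAME=myuser --env DB_PASSWD=mysecret my-play-app:latest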

How to add a folder and its content to the standard paths of Playframework and Heroku?

I have a Scala Play Framework 2.7.x application which I deploy on Heroku. I use Lucene to index the web app, and since there is no JdbcDirectory in Lucene I have to use its FSDirectory instead. That leads to issues with Heroku, because I can't generate the index files under $APP_HOME/lucene-index/* on Heroku; they would be wiped out each time. This leads me to two possible solutions, and this is the simpler one:
1. Generate the $APP_HOME/lucene-index locally before deployment and save it in Git; this folder will be at the same level as $APP_HOME/app and $APP_HOME/public.
2. Integrate the new, nonstandard Play folder $APP_HOME/lucene-index so that it gets copied by Heroku (the purpose of this question).
3. Upon startup the application checks for this folder; if it doesn't exist (local case) it gets generated, otherwise it is opened (Heroku case).
Do I need to do something special for #2 so that Heroku recognizes $APP_HOME/lucene-index/ as a folder that needs to be packaged together with the application? E.g. I would not like to have to put $APP_HOME/lucene-index/ under $APP_HOME/conf/ for this to work.
Here is the Anatomy of a Play 2.7.x application, but there is no word on how to add extra folders to it.
The solution I was after was to include the ./lucene-index folder as part of the Play dist. This is accomplished by adding the following to the build.sbt file:
//********************************************************
// Add lucene-index to the dist
//********************************************************
import com.typesafe.sbt.packager.MappingsHelper._
mappings in Universal ++= directory(baseDirectory.value / "lucene-index")
Now it deploys to Heroku and it all works nicely.
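The startup check from point 3 of the question can then be a simple existence test on the packaged index; a rough sketch with Lucene (branch bodies left to the application):
import java.nio.file.Paths
import org.apache.lucene.index.DirectoryReader
import org.apache.lucene.store.FSDirectory

val dir = FSDirectory.open(Paths.get("lucene-index"))
if (DirectoryReader.indexExists(dir)) {
  // Heroku case: the index was shipped inside the dist, just open it
} else {
  // local case: generate the index into `dir` before first use
}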

How do I specify a config file with play 2.4 and activator

I am building a Scala Play 2.4 application which uses the typesafe activator.
I would like to run my tests 2 times with a different configuration file for each run.
How can I specify alternative config files, or override the config settings?
I currently run tests with the command "./activator test"
You can create different configuration files for different environments/purposes. For example, I have three configuration files for local testing, alpha deployment, and production deployment as in this project https://github.com/luongbalinh/play-mongo
You can specify the configuration for running as follows:
activator run -Dconfig.resource=application.conf
where application.conf is the configuration you want to use.
You can create different configuration files for different environments. To specify the configuration to use with activator run, use the following command:
activator "run -Dconfig.resource=application.conf"
where application.conf is the desired configuration. Without the quotes it did not work for me. This uses the same configuration parameters as when going into production mode, as described here:
https://www.playframework.com/documentation/2.5.x/ProductionConfiguration#Specifying-an-alternate-configuration-file
It is also important to know that config.resource locates the configuration within the conf/ folder, so there is no need to specify that part of the path. For full paths outside the resources, use config.file. Further reading is in the link above.
The quotes are needed because you do not want to send the -D option to activator itself but to the run command. With the quotes, the activator JVM gets no -D argument; instead it interprets "run -Dconfig.file=application.conf" and sets the config.file property accordingly, also in the activator's JVM.
This was already discussed here: Activator : Play Framework 2.3.x : run vs. start
Since all the above are partially incorrect, here is my hard-won knowledge from last weekend.
Use include "application.conf" not include "application" (which Akka does)
Config files must be named *.conf or Play will silently discard them
You probably want -Dconfig.file=<file>.conf so you're not classpath dependent
Make sure you provide the full file path (e.g. /opt/configs/prod.conf)
Example
Here is an example of one we run:
#prod.conf
include "application"
akka.remote.hostname = "prod.blah.com"
# Example of passing in S3 keys
s3.awsAccessKeyId="YOUR_KEY"
s3.awsSecretAccessKey="YOUR_SECRET_KEY"
And just pass it in like so:
activator -Dconfig.file=/var/lib/jenkins/jenkins.conf test
or if you fancy sbt:
sbt -Dconfig.file=/var/lib/jenkins/jenkins.conf test
Dev Environment
Also note that it's easy to make a developer.conf file as well, to keep all your passwords/local ports, and then add it to .gitignore so devs don't accidentally check it in.
The below command works with Play 2.5
$ activator -Dconfig.resource=jenkins.conf run
https://www.playframework.com/documentation/2.5.x/ProductionConfiguration

environmental variable substitution when including another configuration file in Play 2.2

Is it possible to use environmental variable substitution when including another configuration file?
I would like to have something like that:
include "${HOME}/.foo/credentials.conf"
The configuration documentation mentions locating resources and substitution, but not the two together.
This works:
include "/home/me/.foo/credentials.conf"
and my HOME is correctly set.
But all my attempts to make include "${HOME}/.foo/credentials.conf" work have failed so far.
Background:
I deliberately want to keep credentials and other sensitive data out of our code base, but have them available in local dev environments for testing. I am aware of more sophisticated solutions using external storage, as hinted at here: Playframework 2 - Storing your credentials, and we use something similar for live and preview environments, but these are not suitable for a local dev setup.
An alternative is to include the credentials file in the code base after all and use .gitignore to prevent pushing it, but that is a fragile solution, and the risk is that someone will eventually push it and compromise the credentials.
TBH I'm not even able to include a file with an absolute path like /home/me... Anyway, the approach which will work for you is just using an alternative conf file, as described in the same doc:
In the file /home/me/.foo/credentials.conf you need to include application.conf; Play will fall back to the file on the classpath (the one under VCS):
include "application.conf"
myCredentials.user="Espinosa"
myCredentials.password="fooBar123"
then run/start your app with this config file locally:
play -Dconfig.file=${HOME}/.foo/credentials.conf ~run
and that's it.
Note: of course it's easier to set this up in your IDE (e.g. IntelliJ: Run > Edit Configurations) or to write a shell script containing this command.

Play Framework - How to maintain configuration files for different environments?

For my Play 2.2/Scala application (built with SBT), I would like to deploy different configuration files depending on the environment I'm deploying to (e.g. to couple a deployment with a particular database server). How does one create different variants of the application's configuration file (conf/application.conf) for different deployment targets? Hopefully variants can be generated from a base version?
What I'm used to from .NET is to have a base configuration file (Web.config), which undergoes a certain transformation depending on the profile one is deploying (e.g. Production). Does one use a similar technique in the Play/Scala world?
Alternative configuration files are covered in Play's documentation quite well in section Specifying alternative configuration file.
In short: in application.conf you place the default configuration of your app, and additionally you create extra files for your environment(s), e.g. live.conf, dev.conf, etc. In these files you first include application.conf (which reads the whole default configuration) and then overwrite only the parts which have to be changed, e.g. the DB credentials; it could be dev.conf:
include "application.conf"
db.default.driver=org.h2.Driver
db.default.url="jdbc:h2:mem:alternative-database-for-dev-testing"
db.default.user=developer
db.default.password="developerpass"
So finally you start your application (after dist) as
./start -Dconfig.resource=dev.conf
or with the Play console
play -Dconfig.resource=dev.conf run
Several tips:
It's a good idea not to place your 'live' DB credentials in the default application.conf file; if some dev forgets to include his dev.conf he won't damage the production DB. Put them in prod.conf instead.
Also, these additional configs shouldn't be placed in any VCS (e.g. git) repository; creating them directly on the target machine (and ignoring them in the repository) makes sure that people who shouldn't know the live database credentials won't see them.
It's also possible to use a remote alternative config file, which can be useful e.g. when you're deploying several instances of the same app on several hosts in the cloud.
Each dev can have his own config file, e.g. dev_aknuds1.conf, dev_biesior.conf, etc., so you can ignore them all with the single pattern dev_*.conf in the repo.
Finally, you can just create a shell script (Unix) or bat file (Windows) that starts the app with the chosen config file, like start_dev.sh, run_dev.sh, etc., so you won't need to write -Dconfig.resource=... each time.
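For instance, start_dev.sh could be as small as this (using the start script from the dist shown above):
#!/bin/sh
# start the dist with the developer configuration so nobody has to retype the flag
./start -Dconfig.resource=dev.conf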