I update a key's value in application.conf by referencing an environment variable:
play.http.secret.key=${?MY_SECRET_KEY}
But it still loads the previous value.
After a system reboot the change takes effect.
Is there a way to refresh the config file without rebooting?
Try the following:
Given a file called sample1.conf:
a {
  b {
    c = 30
    d = ["Red", "Orange", "Green", "Blue"]
  }
}
If you wish to change a property, set it as a system property first, then invalidate the caches and load the config again. This same mechanism is what allows you to override values on the command line.
import com.typesafe.config.ConfigFactory

// System properties take precedence over file values when the config is loaded
System.setProperty("a.b.c", "100")
ConfigFactory.invalidateCaches() // drop the cached config so the new value is picked up
val config = ConfigFactory.load("sample1")
config.getDouble("a.b.c") should be (100.0) // ScalaTest matcher
I don't know whether this works in all scenarios, so it may or may not suit your application.
AFAIK, there is no out-of-the-box mechanism to do such a thing.
As it turns out, when you start Play through "sbt run", it starts in Dev mode and creates two ClassLoaders: one for your code, and one for the immutable code, such as libraries, dependencies, and the framework itself. This is done to provide the hot-deploy feature, which kills and reloads the first ClassLoader.
Because application.conf is loaded by Play on start-up, I would think it is loaded within the fixed ClassLoader, hence no reload is possible.
I have an issue with vardeps, where I make a task depend on some variables.
I have created some new variables, e.g. NEW_VARIABLE, and added them to BB_ENV_EXTRAWHITE. In some recipes I wrote my own implementation of certain tasks that depend on these new variables, and to make this dependency work I added e.g. do_install[vardeps] = "NEW_VARIABLE". I now expect that every time I change NEW_VARIABLE and run e.g. bitbake recipename, the do_install task runs. I checked the task signature and I can see NEW_VARIABLE there.
Let's assume I have two possible values for this variable. When I set the variable for the first time to "value1", i.e. on the first build, everything works and there is no problem. When I change it to the other value, "value2", not used before, and build the recipe again, do_install also runs and no problem occurs. The problem, however, is that if I set the variable back to the old value "value1" and execute bitbake recipename again, do_install is not re-triggered, which leaves wrong/old data in the work directory and in the produced image.
I tried setting BB_DONT_CACHE, as I understood from an old question that the problem might be that the recipe needs to be re-parsed; however, this did not work at all.
I do not want the tasks to run on every build, i.e. do_install[nostamp] = "1", so that solution cannot be considered. I just want do_install to run again every time I change NEW_VARIABLE.
Is what I am expecting normal behavior, or does Yocto not work this way?
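For reference, a minimal sketch of the setup described above (the recipe name, the environment export, and the install step are illustrative, not taken from a real build):

```
# Shell environment, before invoking bitbake:
#   export NEW_VARIABLE="value1"
#   export BB_ENV_EXTRAWHITE="$BB_ENV_EXTRAWHITE NEW_VARIABLE"

# recipename.bb
do_install[vardeps] += "NEW_VARIABLE"

do_install() {
    # The installed output depends on NEW_VARIABLE
    echo "${NEW_VARIABLE}" > ${D}/my-setting
}
```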
I faced the same issue all day, and also tried vardeps, BB_DONT_CACHE, etc., then switched to using SRC_URI, but still got the same behavior. Like you, I build a separate recipe ("bitbake name") and look in the work directory for the recipe output, expecting it to be updated just as it is when I make genuinely new changes. But once I change an input back to something old, the state cache kicks in and the work directory is left alone, fooling me into thinking it does not work properly. When I build the final image-base, however, it is properly populated. I guess one needs to force the build, or bypass the cache, to get fresh output in the work directory.
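If the work directory is what you need refreshed, forcing the task past its stamp is one way around the cache (a sketch; recipename stands for the recipe from the question):

```
# Re-run do_install even though its signature matches an existing stamp
bitbake -f -c install recipename

# then rebuild the recipe or image as usual
bitbake recipename
```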
I have this in a .conf file and I want to overwrite the value at index 0 in the array field1:
database {
  master {
    field1: ["a", "b", "c"]
  }
}
and I run the application via sbt like this:
sbt -Ddatabase.master.field1.0="11.111.11.111:3306" package
then I look inside the jar at the .conf file and nothing has changed.
This guide suggests changing each array element by index instead of the whole array (which I also tried, to no avail): https://salsa.debian.org/java-team/typesafe-config/blob/master/HOCON.md#array-and-object-concatenation
How do you overwrite array elements in HOCON?
I think the problem is that your HOCON file is part of what you are packaging, while -D passes the parameter to sbt's own JVM. Why would the config of sbt's JVM have any influence on the .jar you package?
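To make the distinction concrete: the .conf inside the jar is never rewritten; a -D override only takes effect in the JVM that actually loads the configuration at runtime. A sketch (app.jar and com.example.Main are placeholder names):

```
# Packaging leaves the bundled .conf untouched:
sbt package

# Pass the override to the JVM that runs the application, not to sbt:
java -Ddatabase.master.field1.0="11.111.11.111:3306" -cp app.jar com.example.Main
```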
Edit
Adrian taught me that this is actually possible. Still, the solution below is what I would prefer: it is explicit and easy to understand. Threading parameters through the sbt call does not seem nice and clean to me.
I guess you want an environment-specific database config.
You could start the application with your config override, as you tried with sbt, or put the configs for the different systems in different HOCON files and load the right one depending on the system you start, selected via a program parameter.
Look at the docs to see how to load additional files.
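A minimal sketch of that second approach, assuming the Typesafe Config library and a hypothetical environment name passed as the first program argument:

```scala
import com.typesafe.config.ConfigFactory

object Main extends App {
  // e.g. "dev" or "prod" selects application-dev.conf, application-prod.conf, ...
  val env = args.headOption.getOrElse("dev")
  val config = ConfigFactory.load(s"application-$env")
  println(config.getStringList("database.master.field1"))
}
```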
I added a custom configuration to my plugin:
Configuration customCompile = project.configurations.create("customCompile")
.setVisible(false).setTransitive(true)
I want to do something like:
configuration.compile.addExtendsFrom(customCompile)
So that in my plugin, I can isolate certain dependencies to add to the classpath of something I'm running (via `project.configurations.customCompile`). I want them to remain on the regular compile path as well.
What I did was this:
Configuration compile = project.configurations.getByName('compile')
Set updated = WrapUtil.asSet(compile.getExtendsFrom()) // getExtendsFrom() returns an immutable set
updated.add(customCompile)
compile.setExtendsFrom(updated)
It works, but it feels a little convoluted; extendsFrom seems to have the opposite meaning of the inheritance I'm used to with Java classes. Is there a better way to do this?
a.extendsFrom(b) is analogous to "a inherits from b", and you can simply do configurations.compile.extendsFrom(customCompile). (Not addExtendsFrom or getExtendsFrom.)
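Put together, the whole setup from the question then shrinks to something like this (a sketch in the Gradle Groovy DSL):

```groovy
Configuration customCompile = project.configurations.create('customCompile')
        .setVisible(false)
        .setTransitive(true)

// compile now also resolves everything declared in customCompile
project.configurations.getByName('compile').extendsFrom(customCompile)
```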
In my project build definition the SettingKey useProguard in the Android scope is set to true. This is what I want by default. When I execute one particular task, however, I want useProguard to be false. Everything in the Android scope comes from the sbt-android-plugin.
I'm not sure how best to solve this problem. From what I read it seems like a command can get the job done, since it can execute a task with a different state than what your current session sees. I tried to create such a command like so:
def buildWithoutProguard = Command.command("build-without-proguard") { state =>
  val extracted = Project.extract(state)
  import extracted._

  val transformed = session.mergeSettings :+ (useProguard in Android := false)
  val newStructure = Load.reapply(transformed, structure)
  val newState = Project.setProject(session, newStructure, state)
  Project.evaluateTask(buildAndRun, newState)
  state
}
I'm appending the command to my project settings and running the 'build-without-proguard' command executes the buildAndRun task as desired. However, useProguard is still true instead of false as I would expect.
First, this whole approach feels heavy handed to me. Assuming changing sbt-android-plugin isn't an option here then how else would I solve this problem?
Second, why doesn't this approach work as is?
From what I understand from your question, you want the setting to be different for a dependency depending on what is depending on it. This doesn't make sense -- a dependency either is satisfied or it isn't, and what depends on it doesn't come into the equation.
Your solution seems satisfactory to me. An alternative would be to create two projects pointing to the same source but with different proguard settings and different target directories, so that one builds with proguard and the other without, and both keep their state. You'd then just switch between the projects.
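The two-project alternative could be sketched roughly like this in the build definition (project names and paths are hypothetical; useProguard in Android comes from sbt-android-plugin as in the question):

```scala
// Two projects sharing the same sources, differing only in the proguard setting
lazy val app = Project("app", file("app"))
  .settings(useProguard in Android := true)

lazy val appNoProguard = Project("app-no-proguard", file("app-no-proguard"))
  .settings(
    sourceDirectory := (sourceDirectory in app).value, // reuse app's sources
    useProguard in Android := false
  )
```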
This seems like a simple thing, but I can't find an answer in the existing questions:
How do you add a global argument to all of your existing and future run or debug configurations? In my case I need a VM argument, but I can see this being useful for command-line arguments as well.
Basically, every time I create a unit test I need to create a configuration (or run it, which creates one) and then manually edit each one with the same VM argument. This seems silly for such a good tool.
This is not necessary: you can add the VM arguments to the JRE definition, which is exactly what it is for. I use it myself so that assertions are enabled and the heap is 1024 MB on every run, even future ones.
Ouch: a 7-year-old bug asking for a run-configuration template, precisely for that kind of reason.
This thread proposes an interesting workaround, based on duplicating a fake configuration that uses string substitution:
You can define variables in Window->Preferences->Run/Debug->String Substitution. For example you can define a projectName_log4j variable with the correct -Dlog4j.configuration=... value.
In a run configuration you can use ${projectName_log4j} and you don't have to remember the real value.
You can define a project-specific "empty" run configuration.
Set the project and the argument fields in this configuration, but not the main class. When you have to create a new run configuration for this project, select this one and use 'Duplicate' from its popup menu to copy it.
Then you only have to set the main class and the program arguments.
Also, you can combine both solutions: use a variable and define an "empty" run configuration which uses this variable. The great advantage in this case is that when you begin to use a different log4j config file, you only have to change the variable declaration.
Not ideal, but it may ease your workflow.