I'm running SBT with a Scala compiler I built myself. This compiler does a lot of work at runtime, with many function calls that can be deeply recursive.
So when I run SBT with my compiler, the stack overflows after a long time. I tried setting -J-Xss when starting SBT, but that doesn't work.
I ran into a similar problem with SBT heap size before, and many posts say that setting -J-Xmx when starting SBT won't change the JVM heap size because it is overridden by SBT's default memory options.
How to set heap size for sbt?
Now I wonder whether -J-Xss can be overridden by SBT's default options in the same way -J-Xmx is, or whether I should simply try setting a larger -J-Xss?
There are a number of ways to do this, but it depends on what you are trying to achieve. If you want a larger heap for running tests, for instance, look at the testing approach described below.
SBT_OPTS
First, you can simply set the environment variable SBT_OPTS, which the SBT launcher reads natively when it starts up; the options you put there let you override the defaults.
export SBT_OPTS="-Xmx1G -Xms256m ..."
Custom launcher
The other way to achieve the same is to basically create a custom SBT launching script. Have a look at the example here.
For testing
If you want to modify the options used for testing, use javaOptions in ThisBuild ++= Seq("-Xmx1g", ...). For these options to take effect at all, you also need fork in Test := true, which makes the tests run in a forked JVM. Without forking, the specified options will not be honoured.
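For the stack-size case from the question, the same mechanism applies. A minimal build.sbt sketch (the -Xss8m value is only an illustrative assumption, adjust it to your workload):

fork in Test := true                    // required, otherwise the javaOptions below are ignored
javaOptions in Test ++= Seq(
  "-Xss8m",                             // larger thread stack for the forked test JVM
  "-Xmx1g"                              // heap for the forked test JVM
)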
Related
Every time I use sbt the first thing I do is set the log level to Error:
$ sbt
// ... sbt loads
[my_project] $ error
[my_project] $
Several places on SO and elsewhere recommend adding this to either your build.sbt or your sbt.boot.properties:
set logLevel in run := Level.Error
But I'm working on a shared project with many developers, and I don't want to change the log level for everyone, just me! I currently use SBT_OPTS to tailor sbt's memory usage on my machine, so it may be an option here too, but I can't find any guidance on what can be passed via SBT_OPTS beyond Java-level things like -Dkey=val properties handed straight to the java runtime and memory parameters like -Xmx8G.
sbt --help indicates that .sbtopts may also be a potential option:
.sbtopts if this file exists in the current directory, its contents
are prepended to the runner args
But as far as I can tell there is no command-line "runner arg" that sets the log level to Error, only one that sets it to debug via --debug.
I'm a little stumped. I've identified at least two potential avenues (SBT_OPTS and .sbtopts) for passing machine-specific customization to sbt, but does either of these support setting the log level to Error? Or is there a third avenue I'm missing, maybe some elusive ~/.sbt file, that I could use to set my machine's sbt log level to Error?
Put the following in $HOME/.sbt/1.0/global.sbt
logLevel := Level.Error
All available log level options are:
Error
Warn
Info
Debug
Thanks @maxkar for leading me to this solution.
To speed up our development workflow we split the tests and run each part on multiple agents in parallel. However, compiling the test sources seems to take up most of the time in the testing steps.
To avoid this, we pre-compile the tests using sbt test:compile and build a docker image with the compiled targets.
Later, this image is used on each agent to run the tests. However, sbt recompiles the test and application sources even though the compiled classes exist.
Is there a way to make sbt use existing compiled targets?
Update: To give more context
The question strictly relates to scala and sbt (hence the sbt tag).
Our CI process is broken down into multiple phases. It's roughly like this:
Stage 1: We use SBT to compile the Scala project into Java bytecode with sbt compile, and compile the test sources in the same step with sbt test:compile. The targets are bundled into a docker image and pushed to the remote repository.
Stage 2: We use multiple agents to split the tests and run them in parallel. The tests run from the built docker image, so the environment is the same. However, running sbt test causes the project to recompile even though the compiled bytecode exists.
To make this clear: I basically want to compile on one machine and run the compiled test sources on another without re-compiling.
Update
I don't think https://stackoverflow.com/a/37440714/8261 is the same problem because, unlike that case, I don't mount volumes or build on the host machine. Everything is compiled and run within docker, just in two build stages, so the file modification times and paths remain the same.
The debug output has something like this
Initial source changes:
removed:Set()
added: Set()
modified: Set()
Invalidated products: Set(/app/target/scala-2.12/classes/Class1.class, /app/target/scala-2.12/classes/graph/Class2.class, ...)
External API changes: API Changes: Set()
Modified binary dependencies: Set()
Initial directly invalidated classes: Set()
Sources indirectly invalidated by:
product: Set(/app/Class4.scala, /app/Class5.scala, ...)
binary dep: Set()
external source: Set()
All initially invalidated classes: Set()
All initially invalidated sources:Set(/app/Class4.scala, /app/Class5.scala, ...)
Recompiling all 304 sources: invalidated sources (266) exceeded 50.0% of all sources
Compiling 302 Scala sources and 2 Java sources to /app/target/scala-2.12/classes ...
It has no Initial source changes, but products are invalidated.
Update: Minimal project to reproduce
I created a minimal sbt project to reproduce the issue.
https://github.com/pulasthibandara/sbt-docker-recomplile
As you can see, nothing changes between the build stages, other than the second stage running in a new step (a new container).
While https://stackoverflow.com/a/37440714/8261 pointed in the right direction, the underlying issue and its solution turned out to be different.
Issue
SBT recompiles everything when it is run in a different stage of a docker build. This is because docker compresses the images created in each stage, which strips the millisecond portion of the lastModifiedDate from the sources.
SBT relies on lastModifiedDate to determine whether sources have changed, and since it differs (in the milliseconds part), the build triggers a full recompilation.
Solution
Java 8:
Set -Dsbt.io.jdktimestamps=true when running SBT, as recommended in https://github.com/sbt/sbt/issues/4168#issuecomment-417655678, to work around this issue.
Newer:
Follow the recommendation in https://github.com/sbt/sbt/issues/4168#issuecomment-417658294
I solved the issue by setting the SBT_OPTS environment variable in the Dockerfile like this:
ENV SBT_OPTS="${SBT_OPTS} -Dsbt.io.jdktimestamps=true"
The test project has been updated with this workaround.
Using SBT:
I think there is already an answer to this here: https://stackoverflow.com/a/37440714/8261
It looks tricky to get exactly right. Good luck!
Avoiding SBT:
If the above approach is too difficult (i.e. getting sbt test to accept that your test classes do not need re-compiling), you could avoid sbt altogether and run your test suite using java directly.
If you can get sbt to log the java command that it is using to run your test suite (e.g. using debug logging), then you could run that command on your test runner agents directly, which would completely preclude sbt re-compiling things.
(You might need to write the java command into a script file, if the classpath is too long to pass as a command-line argument in your shell. I have previously had to do that for a large project.)
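As a small aid for that (a sketch only; the task name is invented here), you could add a task to build.sbt that prints the resolved test classpath, ready to paste into your own java command or script:

lazy val printTestClasspath = taskKey[Unit]("print the fully resolved test classpath")

printTestClasspath := {
  // join every classpath entry with the platform separator, ready for `java -cp`
  val cp = (fullClasspath in Test).value
    .map(_.data.getAbsolutePath)
    .mkString(java.io.File.pathSeparator)
  println(cp)
}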
This would be a much hackier approach than the one above, but it might be quicker to get working.
A possible solution might be to define your own sbt task without dependencies, or to try to change the test task. For example, you could create a task that runs a JUnit runner, if that is your testing framework. To define a task, see the documentation on Implementing Tasks.
You could even go as far as compiling, shipping the code, and running it on the remote agents from the same task, since a task body is just whatever Scala code you want. From the sbt reference manual:
You could be defining your own task, or you could be planning to redefine an existing task. Either way looks the same; use := to associate some code with the task key
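For example, a bare-bones sketch along those lines (untested; the task name, the JUnit 4 console runner and the test class are assumptions to replace with your own setup):

lazy val runPrecompiledTests = taskKey[Unit]("run already-compiled tests without sbt's test task")

runPrecompiledTests := {
  val cp = (fullClasspath in Test).value
    .map(_.data.getAbsolutePath)
    .mkString(java.io.File.pathSeparator)
  // shell out to the JUnit 4 console runner instead of sbt's own test machinery
  val exit = scala.sys.process.Process(
    Seq("java", "-cp", cp, "org.junit.runner.JUnitCore", "com.example.MyTestSuite")
  ).!
  if (exit != 0) sys.error(s"tests failed with exit code $exit")
}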
The problem is really simple, and I hope the solution will be as well.
Basically I have two configuration files, application.conf and dev.conf. I'm passing the config file from the command line like this: sbt -Dconfig.file=dev.conf.
The problem is that when I use ConfigFactory.load from the main object (the one that extends App), it loads the config I passed via the command line (in this case dev.conf), but when I load the config from a different object it loads the default application.conf.
Can I somehow load the config passed via command-line arguments from any object?
When you run your application with the runMain SBT task, by default SBT won't create a separate JVM for your code. This has several consequences for the application lifecycle, and of course for system properties as well.
In general, your approach should work, as long as your build configuration does not enable forking. However, I think the better approach would be to actually rely on forking and specify the system property explicitly. This is guaranteed to work. To do this, you need to set the fork setting in the run task to true, and then add a JVM command line option:
Compile / run / fork := true,
Compile / run / javaOptions += "-Dconfig.file=dev.conf",
Don't forget to restart SBT after that. You won't need to pass the config.file property to SBT with this approach; rather, it is controlled by the javaOptions setting, as in the example above.
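With forking in place, any object in the application can then read the same config, because Typesafe Config resolves the config.file system property inside the forked JVM. A minimal sketch (the object and key names are made up for illustration):

import com.typesafe.config.{Config, ConfigFactory}

object DatabaseSettings {
  // picks up -Dconfig.file=dev.conf from the forked JVM, falling back to application.conf
  val config: Config = ConfigFactory.load()
  val url: String = config.getString("db.url")
}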
I have a custom task defined in build.sbt:
val doSmth = taskKey[Unit]("smth")
doSmth := {
version := "1.0-SNAPSHOT"
}
But it does not change the version. What I actually want is a custom sbt publish task that always publishes the same version to the repo I added. Besides that, the normal sbt assembly process uses an incremental versioning scheme. So I wanted to make a task that sets the version, executes sbt publish, and then restores the version to its previous value. I've done everything except changing the version, which is where I'm blocked. Is there any way to change the value of an sbt setting (since they are vals) from build.sbt?
Is this possible at all? I guess I could also copy the code from the sbt publish command (as someone mentioned in another topic), but that's the worst solution in my opinion.
Thanks
I found one possible solution, changing the version just for the sbt publish task, but SBT's behaviour here is really strange and unintuitive. For example, I tried
version := sys.env.getOrElse("BUILD_NUMBER", "1.0")
version in publish := "SNAPSHOT-1.0"
I also tried defining different versions in the Test and Compile configurations with:
version in Compile := sys.env.getOrElse("BUILD_NUMBER", "1.0")
version in Test := "SNAPSHOT-1.0"
but I could not get it to work. SBT behaves really strangely. Is it possible at all to give a setting a different value in one task than its value in all other tasks?
version is a Setting in sbt. The distinction between settings and tasks is probably the number one thing you need to understand in sbt, else you are going to have a really hard time using the tool.
Settings are immutable and are initialized when sbt starts (more or less); after that, they cannot change. Tasks, on the other hand, are like functions: every time you call them, they get re-evaluated.
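A small sketch that makes the difference visible (the key names are invented for illustration):

lazy val loadedAt = settingKey[Long]("evaluated once, when the build is loaded")
lazy val calledAt = taskKey[Long]("re-evaluated on every invocation")

loadedAt := System.currentTimeMillis()   // same value for the whole sbt session
calledAt := System.currentTimeMillis()   // fresh value each time you run `calledAt`

Running show calledAt twice prints two different timestamps, while show loadedAt always prints the value captured at load time.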
You see now that it's impossible to have a task mutate a setting. It just doesn't make sense in sbt.
What you can do, though, is overshadow a setting in the context of a task. This is exactly what you did with version in publish := "SNAPSHOT-1.0". I don't think there is a better way to do this.
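That said, if you really want the original "set the version, publish, restore" flow, one rough, untested sketch is a custom command (the command name here is made up) that applies the override only for its own invocation and then hands the original state back:

lazy val publishSnapshot = Command.command("publishSnapshot") { state =>
  val extracted = Project.extract(state)
  // apply the version override to a temporary state only, not to the interactive session
  val tmpState = extracted.appendWithoutSession(Seq(version := "SNAPSHOT-1.0"), state)
  Project.extract(tmpState).runTask(publish, tmpState)
  state   // return the original state so the normal version is back in effect
}

commands += publishSnapshot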
I am trying to load a different config file per task in sbt. For example, I want to be able to run a task like:
sbt devRun
and have it load src/main/resources/application.dev.conf, while running:
sbt run
will load src/main/resources/application.conf.
As I understand it, we can load a different application.conf when running tests by putting it in src/test/resources/application.conf, but using a different config file per task (or scope) requires some code in SBT.
I've been googling around, and usually the suggestion is to pass it to the run task like:
$ sbt run -Dconfig.resources="application.dev.conf"
But as far as I understand, the above only switches the config at runtime. I have multiple projects with lots of config files, and I want to load them dynamically, based on scope / task. What's the best way to do this?
Thanks in advance.
You can (and must) use the javaOptions setting. It holds the Java options sbt uses when it runs a forked JVM, for example when tests are run in a separate JVM process.
javaOptions in Test += "-Dconfig.resource=" + System.getProperty("config.resource", "application.test.conf")
Instead of Test, plug in whatever configuration you need. If you don't explicitly pass a config.resource property, it will use application.test.conf.
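For the devRun idea from the question, one possible sketch (untested; the alias name is just the one from the question) is a command alias that sets the property for a forked run:

fork in (Compile, run) := true   // forking is needed so the -D option reaches the application JVM

addCommandAlias(
  "devRun",
  """;set javaOptions in (Compile, run) += "-Dconfig.resource=application.dev.conf" ;run"""
)

Note that the set sticks for the rest of the sbt session; run session clear (or reload) afterwards if you want to drop it.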