Change version in build.sbt custom task - scala

I have a custom task defined in build.sbt:
val doSmth = taskKey[Unit]("smth")
doSmth := {
version := "1.0-SNAPSHOT"
}
But it does not change the version. What I actually want is a custom sbt publish task that always publishes the same version to the repository I added. Besides that, the normal sbt assembly process uses an incremental version scheme. So I wanted to make a task which sets the version, executes sbt publish, and then restores the version to its previous value. I've done everything except changing the version, which is where I'm blocked. Is there any way to change the value of an sbt setting (since they are vals) from build.sbt?
Is this possible at all? I guess I could also copy the code from the sbt publish command (as someone mentioned in another topic), but in my opinion that's the worst solution.
Thanks

I found one possible solution, by changing the version for the sbt publish task only, but it is really strange and unintuitive in sbt. For example, I tried
version := sys.env.getOrElse("BUILD_NUMBER", "1.0")
version in publish := "SNAPSHOT-1.0"
I also tried defining different version in Test and Compile configuration with:
version in Compile := sys.env.getOrElse("BUILD_NUMBER", "1.0")
version in Test := "SNAPSHOT-1.0"
but I could not get it to work. sbt behaves really strangely here. Is it possible at all for a setting to have a different value in one task than it has in all other tasks?

version is a setting in sbt. The distinction between settings and tasks is probably the number one thing you need to understand in sbt; otherwise you are going to have a really hard time using the tool.
Settings are immutable and initialized when sbt starts (more or less). After that, they cannot change. Tasks, on the other hand, are like functions: every time you call them, they get re-evaluated.
You can see now why it's impossible to have a task mutate a setting. It just doesn't make sense in sbt.
What you can do, though, is overshadow a setting in the context of a task. This is exactly what you did with version in publish := "SNAPSHOT-1.0". I don't think there is any better way to do this.
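To make the scoping concrete, here is a minimal sketch of how the two values coexist in build.sbt (the values are the ones from the question; whether a given task actually reads the task-scoped key depends on how that task is implemented):

// the build-wide version, e.g. driven by CI
version := sys.env.getOrElse("BUILD_NUMBER", "1.0")

// overshadow the setting in the scope of the publish task only;
// every other task still sees the build-wide value above
version in publish := "SNAPSHOT-1.0"

You can inspect the task-scoped value from the sbt shell with show publish::version.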

Related

Is there "any" possible way to Sync a local Scala library with our current project in order to have some sort of "hot reloading" of it?

I've been creating some projects in Scala, and there are already several components that I constantly use, reuse, or implement in different ways. I want to start putting all that stuff into some sort of library, but I really want to be able to check its implementation while working, like the nice "hot reloading" that the Revolver plugin brings whenever we want to see the changes of our code in the console.
For now it's clear that whenever I want to publish something, I set up my local build.sbt file and publish it with
sbt publishLocal
And then bring it in as
"eu.myproject" %% "my-lib" % "1.0.0"
But I would really appreciate a way to work with these libraries with some real-time sync, in order to see the changes without having to publish them for each change.
UPDATE
So thanks to Matthias Berndt I managed to update the project, with the nice hot reload from Revolver still working, by configuring the sbt file as
lazy val root = Project ....
  .dependsOn(
    ProjectRef(file("/HOME/my-lib"), "my-lib")
  )
I will still look for a nice pattern to mix local and published libraries, in order to have them in both dev and prod.
You can use a ProjectRef to add the library as a subproject to the build system of the program that uses the library.
Check out this question: How do you use `ProjectRef` to reference a local project in sbt 1.x?
This blog post should also be helpful:
https://eed3si9n.com/hot-source-dependencies-using-sbt-sriracha
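The blog post shows how to switch between a source dependency (local checkout, hot reloading) for development and a binary dependency (published artifact) for production, using the sbt-sriracha plugin. As a rough plain-sbt sketch of the same idea, where the path, the system property used as a toggle, and the coordinates are all placeholders for your setup:

// build.sbt — hedged sketch; adjust path, property and coordinates
val useLocalMyLib = sys.props.get("my-lib.local").contains("true")

// source dependency: points at the local checkout
lazy val myLib = ProjectRef(file("/HOME/my-lib"), "my-lib")

lazy val root = {
  val base = Project("root", file("."))
  if (useLocalMyLib)
    base.dependsOn(myLib) // dev: compile against the local sources
  else
    base.settings(
      // prod: resolve the published binary instead
      libraryDependencies += "eu.myproject" %% "my-lib" % "1.0.0"
    )
}

Starting sbt with -Dmy-lib.local=true then picks up the local sources; without the flag, the published artifact is used.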

How to re-use compiled sources on different machines

To speed up our development workflow we split the tests and run each part on multiple agents in parallel. However, compiling the test sources seems to take most of the time in the testing steps.
To avoid this, we pre-compile the tests using sbt test:compile and build a docker image with the compiled targets.
Later, this image is used on each agent to run the tests. However, sbt seems to recompile the tests and application sources even though the compiled classes exist.
Is there a way to make sbt use the existing compiled targets?
Update: To give more context
The question strictly relates to Scala and sbt (hence the sbt tag).
Our CI process is broken down into multiple phases. It's roughly something like this:
Stage 1: Use sbt to compile the Scala project into Java bytecode using sbt compile. We compile the test sources in the same stage using sbt test:compile. The targets are bundled into a docker image and pushed to the remote repository.
Stage 2: We use multiple agents to split and run the tests in parallel. The tests run from the built docker image, so the environment is the same. However, running sbt test causes the project to recompile even though the compiled bytecode exists.
To make this clear: I basically want to compile on one machine and run the compiled test sources on another without re-compiling.
Update
I don't think https://stackoverflow.com/a/37440714/8261 is the same problem, because unlike there, I don't mount volumes or build on the host machine. Everything is compiled and run within docker, just in two build stages. Because of this, the file modification times and paths stay the same.
The debug output has something like this
Initial source changes:
removed:Set()
added: Set()
modified: Set()
Invalidated products: Set(/app/target/scala-2.12/classes/Class1.class, /app/target/scala-2.12/classes/graph/Class2.class, ...)
External API changes: API Changes: Set()
Modified binary dependencies: Set()
Initial directly invalidated classes: Set()
Sources indirectly invalidated by:
product: Set(/app/Class4.scala, /app/Class5.scala, ...)
binary dep: Set()
external source: Set()
All initially invalidated classes: Set()
All initially invalidated sources:Set(/app/Class4.scala, /app/Class5.scala, ...)
Recompiling all 304 sources: invalidated sources (266) exceeded 50.0% of all sources
Compiling 302 Scala sources and 2 Java sources to /app/target/scala-2.12/classes ...
It has no Initial source changes, but products are invalidated.
Update: Minimal project to reproduce
I created a minimal sbt project to reproduce the issue.
https://github.com/pulasthibandara/sbt-docker-recomplile
As you can see, nothing changes between the build stages, other than the second stage running in a new step (a new container).
While https://stackoverflow.com/a/37440714/8261 pointed in the right direction, the underlying issue and its solution were different.
Issue
sbt seems to recompile everything when it is run in different stages of a docker build. This is because docker compresses the images created in each stage, which strips the millisecond portion of the lastModifiedDate from the sources.
sbt relies on lastModifiedDate to determine whether sources have changed, and since it differs (in the milliseconds part), a full recompilation is triggered.
Solution
Java 8:
Set -Dsbt.io.jdktimestamps=true when running sbt, as recommended in https://github.com/sbt/sbt/issues/4168#issuecomment-417655678, to work around this issue.
Newer:
Follow the recommendation in https://github.com/sbt/sbt/issues/4168#issuecomment-417658294
I solved the issue by setting the SBT_OPTS env variable in the Dockerfile like
ENV SBT_OPTS="${SBT_OPTS} -Dsbt.io.jdktimestamps=true"
The test project has been updated with this workaround.
Using SBT:
I think there is already an answer to this here: https://stackoverflow.com/a/37440714/8261
It looks tricky to get exactly right. Good luck!
Avoiding SBT:
If the above approach is too difficult (i.e. getting sbt test to accept that your test classes do not need re-compiling), you could avoid sbt altogether and run your test suite using java directly.
If you can get sbt to log the java command that it uses to run your test suite (e.g. using debug logging), then you could run that command on your test runner agents directly, which would completely preclude sbt from re-compiling things.
(You might need to write the java command into a script file, if the classpath is too long to pass as a command-line argument in your shell. I have previously had to do that for a large project.)
This would be a much hackier approach than the one above, but might be quicker to get working.
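As a starting point for that, a small helper task can print the fully resolved test classpath, so the java command can be reconstructed on the agents. A hedged sketch in sbt 0.13 syntax (the task name is made up); it deliberately avoids fullClasspath, which would trigger the compile you are trying to skip:

// prints the test classpath: external deps come from update, and the
// already-built class directories are added as plain paths
lazy val printTestClasspath = taskKey[Unit]("Prints the test classpath")

printTestClasspath := {
  val cp = ((externalDependencyClasspath in Test).value.files ++
    Seq((classDirectory in Compile).value, (classDirectory in Test).value))
    .mkString(java.io.File.pathSeparator)
  println(cp)
}

Printing this once at image-build time lets a small script on the agents run java -cp <that classpath> <your-test-runner> directly.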
A possible solution might be defining your own sbt task without dependencies, or trying to change the test task. For example, you could create a task that runs a JUnit runner, if that is your testing framework. To define a task, see the documentation on Implementing Tasks.
You could even go as far as compiling, sending the code, and running it on the remote agents from the same task, since a task body is just arbitrary Scala code. From the sbt reference manual:
You could be defining your own task, or you could be planning to redefine an existing task. Either way looks the same; use := to associate some code with the task key
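Following that, here is a hedged sketch of such a task (sbt 0.13 syntax, JUnit 4; com.example.AllTests is a hypothetical suite class standing in for your tests). It runs the already-compiled classes in a fresh java process, bypassing sbt's recompilation checks:

import scala.sys.process._

lazy val runCompiledTests = taskKey[Unit]("Runs pre-compiled tests with JUnitCore")

runCompiledTests := {
  // external deps from update plus the existing class directories;
  // nothing here depends on the compile task
  val cp = ((externalDependencyClasspath in Test).value.files ++
    Seq((classDirectory in Compile).value, (classDirectory in Test).value))
    .mkString(java.io.File.pathSeparator)
  // hypothetical suite class; swap in your own runner and suite
  val exit = Seq("java", "-cp", cp, "org.junit.runner.JUnitCore", "com.example.AllTests").!
  if (exit != 0) sys.error("Tests failed with exit code " + exit)
}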

Typesafe Config loads the wrong configuration

So the problem is really simple and I hope the solution will be as well.
Basically I have two configuration files, application.conf and dev.conf. I'm passing my config file on the command line like this: sbt -Dconfig.file=dev.conf.
The problem is that when I use ConfigFactory.load from the main object (the one which extends App), it loads the config I passed via the command line (in this case dev.conf), but when I load the config from a different object it loads the default application.conf.
Can I somehow load the config passed via arguments from any object?
When you run your application with the runMain sbt task, by default sbt won't create a separate JVM for your code. This has several consequences for the application lifecycle, and of course for system properties as well.
In general, your approach should work as long as your build configuration does not enable forking. However, I think the better approach is to rely on forking and specify the system property explicitly; this is guaranteed to work. To do this, set the fork setting for the run task to true, and then add a JVM command-line option:
Compile / run / fork := true,
Compile / run / javaOptions += "-Dconfig.file=dev.conf",
Don't forget to restart sbt after that. With this approach you won't need to pass the config.file property to sbt; it is controlled by the javaOptions setting instead, as in the example above.
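For context, a minimal sketch of a complete build.sbt with these two lines in place (the project name is illustrative):

lazy val root = (project in file("."))
  .settings(
    name := "my-app",
    // fork a separate JVM for `run` so that javaOptions actually apply
    Compile / run / fork := true,
    // the forked JVM receives the property, so ConfigFactory.load
    // resolves dev.conf from any object, not just the main one
    Compile / run / javaOptions += "-Dconfig.file=dev.conf"
  )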

Set the stack size for SBT

I'm running sbt using my specially built Scala. My Scala compiler does a lot of work at runtime, with many function calls, which can be recursive.
So when I run sbt with my built Scala compiler, the stack overflows after a long time. I tried setting -J-Xss when starting sbt, but that doesn't work.
I ran into a problem with the sbt heap size before, and many posts say that setting -J-Xmx when starting sbt won't change the JVM heap size, because it is overridden by the default sbt memory options.
How to set heap size for sbt?
Now I wonder whether -J-Xss can be overridden by the default sbt options, just like -J-Xmx is. Or should I simply try setting a larger -J-Xss?
There are a number of ways to do this, but it depends on what you are trying to achieve. If you want a larger heap for running tests, for instance, look at the "For testing" approach below.
SBT_OPTS
First, you can simply set the environment variable SBT_OPTS, which sbt natively looks for when it launches; the options you put there override the defaults you want to replace.
export SBT_OPTS="-Xmx1G -Xms256m ..."
Custom launcher
The other way to achieve the same thing is to create a custom sbt launch script. Have a look at the example here.
For testing
If you want to modify the testing options, use javaOptions in ThisBuild ++= Seq("-Xmx1g", ...). For these options to take effect at all, you need fork in Test := true, which creates a forked JVM for running the tests. Without forking, the specified options will not be honoured.
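Applied to the stack-size question, a minimal build.sbt sketch (the -Xss value is illustrative; tune it to your compiler's needs):

// run tests in a forked JVM so the options below are honoured
fork in Test := true

// raise the stack size (and, if needed, the heap) of that forked JVM
javaOptions in Test ++= Seq("-Xss64m", "-Xmx1g")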

Tell SBT not to check a SNAPSHOT version for changes

This is similar to Re-download a SNAPSHOT version of a dependency using SBT, but I would like to achieve the opposite: I would like to tell sbt that it does not have to check a SNAPSHOT version for changes. How can I do that?
The motivation: when using jME3 via sbt, jME3 does not follow the usual conventions, and each SNAPSHOT version already carries a version stamp in its name. As there are many components of jME3, checking each of them for changes seems to slow down the build.
sbt internally marks all -SNAPSHOT dependencies as changing. You can check the changing() documentation.
I don't think you can change this easily, as it seems to be hard-coded in the sources. I think you could set offline to true, which should hold off the update process.
You can do that in build.sbt, or just from the console via set offline := true.
I think the closest is the offline setting:
Adding offline := true to your build.sbt should disable dependency resolution for snapshots.
To set this globally on your machine, put it in ~/.sbt/0.13/global.sbt
From the documentation:
When offline := true, remote SNAPSHOTs will not be updated by a resolution, even an explicitly requested update. This should effectively support working without a connection to remote repositories. Reproducible examples demonstrating otherwise are appreciated. Obviously, update must have successfully run before going offline.
Just found out about the skip setting, which says:
For tasks that support it (currently only 'compile' and 'update'), setting skip to true will force the task to not do its work. The exact semantics may vary by task.
So use skip in update := true or skip in compile := true to skip the work.
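Side by side, the two suggestions from the answers above look like this in build.sbt (pick whichever fits; both can also be toggled per-session from the sbt shell with set ...):

// option 1: don't update remote SNAPSHOTs during resolution
offline := true

// option 2: skip the update task entirely, so no change-checking happens
skip in update := true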