Tell SBT not to check a SNAPSHOT version for changes

This is similar to Re-download a SNAPSHOT version of a dependency using SBT, but I would like to achieve the opposite - I would like to tell SBT that it does not have to check SNAPSHOT versions for changes. How can I do that?
The motivation: when using jME3 via SBT, jME3 does not follow the usual conventions, and each SNAPSHOT version already gets the version stamp in its name. As there are many components of jME3, checking each of them for changes slows down the build.

sbt internally marks all -SNAPSHOT dependencies as changing. You can check the changing() documentation.
I don't think you can change this easily, as it seems to be hard-coded in the sources. I think you could set offline to true, which should halt the update process.
You can do it in build.sbt, or just from the console via set offline := true.

I think the closest is the offline setting:
Adding the setting offline := true to your build.sbt should disable dependency resolution for snapshots.
To set this globally on your machine, put it in ~/.sbt/0.13/global.sbt
From the documentation:
When offline := true, remote SNAPSHOTs will not be updated by a resolution, even an explicitly requested update. This should effectively support working without a connection to remote repositories. Reproducible examples demonstrating otherwise are appreciated. Obviously, update must have successfully run before going offline.

Just found out about the skip setting:
For tasks that support it (currently only compile and update), setting skip to true will force the task to not do its work. The exact semantics may vary by task.
So use skip in update := true or skip in compile := true to skip the work.
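Putting the answers above together, a minimal build.sbt sketch (sbt 0.13-era syntax, matching the settings named above; either line alone may be enough for your case):

```scala
// build.sbt -- sbt 0.13-era syntax

// Do not contact remote repositories; SNAPSHOTs are served from the local cache.
offline := true

// Stronger: skip dependency resolution entirely (no update task work at all).
skip in update := true
```

Note that skip in update := true disables all resolution, so new dependencies will not be fetched until you remove it.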

Related

Is there "any" possible way to Sync a local Scala library with our current project in order to have some sort of "hot reloading" of it?

I've been creating some projects in Scala, and there are already several components that I constantly use, reuse, or implement in different ways. I want to start putting all that stuff into some sort of library, but I really want to be able to check its implementation while working - like the nice "hot reloading" that the Revolver plugin brings whenever we want to see the changes to our code in the console.
For now, it's clear that whenever I want to publish something, I set up my local build.sbt file and publish it with
sbt publishLocal
And then bring it in as
"eu.myproject" %% "my-lib" % "1.0.0"
But I would really appreciate a way to work with these libraries with some real-time sync, in order to see the changes without having to publish them after each change.
UPDATE
So, thanks to Matthias Berndt, I managed to update a project with some nice hot reload (still with Revolver) by configuring the sbt file as
lazy val root = Project ....
  .dependsOn(
    ProjectRef(file("/HOME/my-lib"), "my-lib"))
I will still research a nice pattern for bringing in more local and published libraries, in order to have them in dev and prod.
You can use a ProjectRef to add the library as a subproject to the build system of the program that uses the library.
Check out this question: How do you use `ProjectRef` to reference a local project in sbt 1.x?
This blog post should also be helpful:
https://eed3si9n.com/hot-source-dependencies-using-sbt-sriracha

Finding the latest build version of a library

I think I am missing something, but I want to add a library from GitHub to Android, and I don't see anywhere on the GitHub page the latest built version of the library so that I can include it in my Gradle file. I have to go to Maven or JitPack manually and search for it. Is there a shortcut? Am I missing something?
Thanks
There is a Lint check which allows Android Studio to query the latest versions available.
First you will have to activate this Lint check:
Go to Settings, then Editor > Inspections, search for Newer Library Version Available, and check it.
Then run a code analysis with Analyze > Run Inspection by Name..., type "newer", and select Newer Library Version Available.
Run the inspection on the desired scope (module only, whole project, etc.).
You will then see which libraries have a newer version available.
PS
As stated in the Lint description of this feature, you should not leave this check activated, because it may slow down your code analysis (querying the repositories can take time).
You can use the + notation to get a dynamic version. It can be used for the major, minor, or patch part of the version. For example:
// Major
compile group: 'org.mockito', name: 'mockito-core', version: '+'
// Minor
compile group: 'org.mockito', name: 'mockito-core', version: '2.+'
// Patch
compile group: 'org.mockito', name: 'mockito-core', version: '2.18.+'
But it's not good practice to use such dependency resolution:
Dependencies can unexpectedly introduce behavior changes to your app. Read your changelogs carefully!
The same source built on two different machines can differ. How many times have you said "but it works on my machine"?
Similarly, builds on the same machine but at different times can differ. I've wasted so much time on builds that worked one minute and then broke the next.
Past builds cannot be reproduced perfectly, which makes it difficult to revert safely.
There are security implications if a bad actor introduces a malicious version of a dependency.

Change version in build.sbt custom task

I have a custom task defined in build.sbt:
val doSmth = taskKey[Unit]("smth")

doSmth := {
  version := "1.0-SNAPSHOT"
}
But it does not change the version. What I actually want is a custom sbt publish task that always publishes the same version to the repo I added. Besides that, the normal sbt assembly process uses an incremental version scheme. So I wanted to make a task that sets the version, executes sbt publish, and then restores the version to its previous value. I've done everything but am blocked on changing the version. Is there any way to change the value of sbt settings (since they are vals) from build.sbt?
Is this possible at all? I guess I could also copy code from the sbt publish command (as someone mentioned in one topic), but that's the worst solution in my opinion.
Thanks
I found one possible solution, by changing the version for the sbt publish task only, but it is really strange and unintuitive in SBT. For example, I tried
version := sys.env.getOrElse("BUILD_NUMBER", "1.0")
version in publish := "SNAPSHOT-1.0"
I also tried defining a different version in the Test and Compile configurations with:
version in Compile := sys.env.getOrElse("BUILD_NUMBER", "1.0")
version in Test := "SNAPSHOT-1.0"
but I could not get it to work. SBT behaves really strangely. Is it possible at all to use a different value for a setting in one task than its value in all other tasks?
version is a setting in sbt. The distinction between settings and tasks is probably the number one thing you need to understand in sbt, or you are going to have a really hard time using the tool.
Settings are immutable and initialized when sbt starts (more or less). After that, they cannot change. Tasks, on the other hand, are like functions: every time you call them, they get re-evaluated.
You can see now that it's impossible to have a task mutate a setting. It just doesn't make sense in sbt.
What you can do, though, is shadow a setting in the context of a task. This is exactly what you did with version in publish := "SNAPSHOT-1.0". I don't think there is a better way to do this.
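The setting/task distinction can be illustrated with a plain-Scala analogy (a sketch only, not sbt API): a setting behaves like a val, evaluated once when the build loads, while a task behaves like a def, re-evaluated on every invocation.

```scala
object SettingVsTask {
  // Like a setting: computed once, when the object initializes, then frozen.
  val setting: String = "1.0"

  private var invocations = 0

  // Like a task: runs its body again on every call.
  def task(): Int = {
    invocations += 1
    invocations
  }
}
```

Calling SettingVsTask.task() twice yields 1 and then 2, while setting stays "1.0" forever - which is why a task body cannot reassign a setting, only shadow it in its own scope.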

How to exclude development.conf from Docker image creation of a Play Framework application artifact

Using Scala Play Framework 2.5,
I build the app into a jar using the sbt plugin PlayScala,
and then build and push a Docker image out of it using the sbt plugin DockerPlugin.
conf/development.conf resides in the source code repository (in the same place where application.conf is).
The last line in application.conf says include development, which means that if development.conf exists, its entries will override some of the entries in application.conf, in such a way that it provides all the default values necessary to make the application runnable locally, right out of the box, after the source is cloned from source control with zero extra configuration. This technique lets every new developer slip right into a working application without wasting time on configuration.
The only missing piece to complete that architectural design is finding a way to exclude development.conf from the final runtime of the app - otherwise these overrides leak into the production runtime and, obviously, the application fails to run.
That can be achieved in various different ways.
One way could be to somehow inject logic into the build task (provided as part of the sbt plugin PlayScala, I assume) to exclude the file from the jar artifact.
Another way could be to inject logic into the Docker image creation process. This logic could manually delete development.conf from the existing jar prior to executing it (assuming that's possible).
If you have ever implemented one of the ideas offered, or maybe some different architectural approach that gives the same "works out of the box" feature, please be kind enough to share :)
I usually have the inverse logic:
I use the application.conf file (which Play uses by default) with everything needed to run locally. I then have a production.conf file that starts by including application.conf and then overrides the necessary settings.
When deploying to production (or staging), I specify the production.conf or staging.conf file to be used.
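A sketch of that inverse layout in HOCON (the file name production.conf comes from the answer; the override key shown is a hypothetical example):

```hocon
# conf/production.conf -- pull in all the local defaults, then override
include "application.conf"

# hypothetical production-only override, read from the environment
db.default.url = ${?DATABASE_URL}
```

At deploy time the file is selected with a JVM option, e.g. -Dconfig.resource=production.conf (or -Dconfig.file=/path/to/production.conf for a file outside the jar).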
This is how I solved it eventually.
conf/application.conf is the production-ready configuration; it contains placeholders for environment variables whose values will be injected at runtime by k8s, given the service's deployment.yaml file.
Right next to it sits conf/development.conf - its first line is include application.conf, and the rest of it is overrides which make the application run out of the box right after git clone, with a simple sbt run.
What makes the above work is the addition of the following to build.sbt:
PlayKeys.devSettings := Seq(
  "config.resource" -> "development.conf"
)
Works like a charm :)
This can be done via the mappings config key of sbt-native-packager:
mappings in Universal ~= (_.filterNot(_._1.name == "development.conf"))
See here.
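For sbt 1.x builds, the same filter can be written with slash syntax - a sketch, assuming sbt-native-packager's Universal scope as in the line above:

```scala
// build.sbt -- sbt 1.x slash syntax; drops development.conf from the package
Universal / mappings ~= (_.filterNot(_._1.name == "development.conf"))
```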

How to force sbt to fetch everything it needs once?

I'm working on a project in a very security-conscious place with no access, even via proxy, to the online repositories SBT usually requires. We'd like to fetch the dependencies and transitive dependencies we need once.
How can sbt be forced to fetch all the dependencies a project needs once and, from then on, work only offline? I have tried doing exactly that from home. I then copied over everything under:
~/.ivy2/cache
~/.ivy2/local
$ACTIVATOR_HOME/repository
but still SBT, even when executed with sbt "set offline := true" run, goes and tries to fetch everything online ... it's a pain. Then it finally breaks and complains that it cannot find some dependency.
UPDATE: I noticed another source of trouble, but I can't yet conclude that it is the culprit of the broken build in the original question. I build and fetch the dependencies for the project on a Linux (Ubuntu) box and then copy all the files to the corporate Windows 7 Pro environment. I found that many property files under ~/.ivy2/cache refer to the absolute path of the activator repository directory on Ubuntu, which is of course incorrect in the Windows environment, e.g.
#ivy cached data file for ch.qos.logback#logback-classic;1.1.3
#Fri Mar 10 08:39:37 CET 2017
artifact\:ivy\#ivy.original\#xml\#-1844423371.location=/opt/dev/activator/1.3.12/repository/ch.qos.logback/logback-classic/1.1.3/ivys/ivy.xml
artifact\:ivy\#ivy\#xml\#1016118566.is-local=true
artifact\:ivy\#ivy\#xml\#1016118566.location=/opt/dev/activator/1.3.12/repository/ch.qos.logback/logback-classic/1.1.3/ivys/ivy.xml
artifact\:ivy\#ivy.original\#xml\#-1844423371.is-local=true
artifact\:ivy\#ivy\#xml\#1016118566.exists=true
artifact\:logback-classic\#jar\#jar\#804750561.is-local=true
artifact\:logback-classic\#jar\#jar\#804750561.location=/opt/dev/activator/1.3.12/repository/ch.qos.logback/logback-classic/1.1.3/jars/logback-classic.jar
artifact\:ivy\#ivy.original\#xml\#-1844423371.exists=true
artifact\:logback-classic\#jar\#jar\#804750561.exists=true
So I went and did a find-and-replace, but the build still doesn't work. It doesn't look like a brilliant idea to have thousands of property files hard-coding an absolute path to the activator location. I would prefer that they used an environment variable for that.
Maybe you could try coursier?
Not only does it offer a better offline mode - one can safely work with snapshot dependencies if they are in the cache (SBT tends to try and then fail if it cannot check for updates) - but it is also much faster than Ivy, thanks to parallel artifact downloads. The project is young but promising.
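To try coursier from sbt, you add its plugin in project/plugins.sbt - a sketch; the coordinates are those published by the coursier project, but the version shown is an assumption, so check the project's README for a current one:

```scala
// project/plugins.sbt -- sbt-coursier plugin; version is an assumption
addSbtPlugin("io.get-coursier" % "sbt-coursier" % "1.0.3")
```

Once loaded, the plugin replaces Ivy for dependency resolution, so update runs through coursier's cache instead.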