In Build.scala, you can specify that a library dependency should be re-checked on every update by calling the changing() method on it. For example:
"com.github.seratch" %% "scalikesolr" % "4.6.0" changing()
Is there a way to do something similar for project references to an external URI? For example:
RootProject(uri("git://github.com/Somewhere/project.git"))
Updates don't seem to trigger the project reference to update.
This is a known issue with sbt; see: https://github.com/sbt/sbt/issues/1120
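Until that's fixed, one common workaround is to pin the reference to a specific branch or commit via the URI fragment (changing the fragment makes sbt fetch a fresh checkout), or to delete the cloned copy from sbt's staging directory (under ~/.sbt/<sbt-version>/staging) so it gets re-cloned. A minimal sketch; the repository URL and the branch name are placeholders:
// Pin the external build to a branch or commit; bumping the fragment
// forces sbt to check out the repository anew.
lazy val external = RootProject(uri("git://github.com/Somewhere/project.git#master"))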
I am following a tutorial to perform object detection in Scala, and I am having issues adding the TensorFlow dependency. I have followed the instructions on the official TensorFlow for Scala website, http://platanios.org/tensorflow_scala/installation.html, but that doesn't seem to work. I also made sure to use the Java 11 JDK for the project. However, whenever I try to add the sbt dependency
libraryDependencies += "org.platanios" % "tensorflow" % "0.4.0" classifier "linux-cpu-x86_64"
I get a "No dependencies found for given import" error in IntelliJ. Any idea on how to set this up properly?
Try replacing the single % in your dependency line with a double %%:
libraryDependencies += "org.platanios" %% "tensorflow" % "0.4.0" classifier "linux-cpu-x86_64"
On top of what the previous answer already suggested, it's probably worth mentioning that (up to and including 2.12) Scala 2.x libraries are not binary-compatible across minor versions. The convention for Scala libraries is to append _2.x to the published JAR's artifact identifier. Since SBT was built around Scala (and is its de facto standard build tool), it acknowledges this convention: the %% operator automatically appends that extra "qualifier" based on the Scala version you are using.
Notice on mvnrepository.com how the artifact identifier changes between the Maven and the SBT dependency declarations: in Maven, the artifact identifier is tensorflow_2.12; in SBT, %% saves you from having to spell that out.
The single % is generally used for Java dependencies (that are not affected by the aforementioned convention).
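For instance (Guava here is just an arbitrary Java library picked for illustration):
// A Java library: a single %, because the artifact has no _2.x suffix.
libraryDependencies += "com.google.guava" % "guava" % "31.1-jre"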
As an alternative (which I'd suggest only to play around with and see that there's no magic involved), you can also use % to specify a Scala dependency and explicitly mention the Scala version in the artifact identifier, as follows:
libraryDependencies += "org.platanios" % "tensorflow_2.12" % "0.4.0" classifier "linux-cpu-x86_64"
The good news is that starting with Scala 2.13 this issue was tackled at the root: an intermediate representation was introduced that also ensures interoperability between Scala 2.13 and Scala 3.x compiled code.
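As a concrete illustration of that interoperability, on sbt 1.5+ a Scala 3 project can consume a library's Scala 2.13 build. A minimal sketch; "org.example" %% "some-library" is a placeholder dependency, and whether this actually works depends on the library in question:
// Use the _2.13 artifact of this (placeholder) library from a Scala 3 build.
libraryDependencies += ("org.example" %% "some-library" % "1.0.0").cross(CrossVersion.for3Use2_13)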
EDIT
What you found was actually an already-reported issue in the documentation; I opened a PR to fix it.
I am using a library, say A, in Scala which depends on version x.11 of another library, say Z.
Now I am also using a library, say B, which depends on version x.31 of Z.
This leads to a compile error because we end up with two versions of library Z. How can I use both libraries A and B in Scala's sbt? Is there any way to specify this?
If completely replacing one version of a dependency with another happens to work, then Sparko's solution is fine. However, that isn't always the case.
If you want to include both versions of a library in the uber-jar produced by sbt-assembly, you'll need to use shading. See this post for an overview of what shading is, and some of the drawbacks associated with it.
Shading is covered in sbt-assembly's documentation here, but if you're anything like me, their way of explaining it will leave you more confused than when you started. There's a good blog post, Spark, Uber Jars and Shading with sbt-assembly (Wayback Machine link), that helps demystify it a bit. Here's the relevant section:
I can shade over my typesafe config version, giving it a different name so Spark won't get confused between the versions. I quickly went to my build.sbt file, and added the following code:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.typesafe.config.**" -> "my_conf.@1")
    .inLibrary("com.typesafe" % "config" % "1.3.0")
    .inProject
)
According to the documentation, this should place any class under com.typesafe.config under the new package my_conf.
For your case, the solution would be adding something like this to your build.sbt file:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.somecompany.**" -> "my_conf.@1")
    .inLibrary("com.somecompany" % "libraryZ" % "0.11")
    .inProject
)
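To confirm the rule actually fired, you can run sbt assembly and then list the contents of the resulting jar (for example with jar tf) to check that the shaded classes now live under the my_conf package.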
In sbt, conflicts between libraries are resolved by the conflict manager. By default, the latest revision is selected, but this can also be overridden in your .sbt file:
conflictManager := ConflictManager.strict
If you're using sbt 0.13.6 or greater, you will be warned when there is an incompatible binary version between your dependencies. In that situation, you can configure an override in your sbt file for the specific library:
dependencyOverrides += "org.raman" % "Z" % "x.11"
This will force the resolved version of Z to x.11 without pulling it in as a direct dependency.
I'm experiencing a kind of impedance mismatch between sbt and the bintray-sbt plugin. The plugin is published via bintray-sbt at https://bintray.com/artifact/download/synapse/sbt-plugins/me/synapse/my-sbt-plugin/0.0.1/my-sbt-plugin-0.0.1.pom (with publishMavenStyle set to true; if set to false, a different directory structure is created, but still not the one sbt expects). The test project has
resolvers += Resolver.bintrayRepo("synapse", "sbt-plugins")
addSbtPlugin("me.synapse" % "my-sbt-plugin" % "0.0.1")
in project/plugins.sbt, but sbt tries to download https://dl.bintray.com/synapse/sbt-plugins/me/synapse/my-sbt-plugin_2.10_0.13/0.0.1/my-sbt-plugin-0.0.1.pom
What settings should be used in plugin build definition to a) be able to test it from current repository and b) to be able to link it to sbt-plugin-releases repo when the time comes?
UPD: It looks like after the package was linked to sbt-plugin-releases it ended up in proper directory structure.
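For anyone hitting the same mismatch: sbt resolves plugins using Ivy-style patterns that include the Scala and sbt cross-versions (hence the my-sbt-plugin_2.10_0.13 path it requested), so a plain Maven layout won't match. A hedged sketch of the consuming side, assuming the plugin is published with publishMavenStyle := false:
// project/plugins.sbt — resolve the plugin via Ivy-style patterns,
// which include the _2.10_0.13 cross-version in the path.
resolvers += Resolver.url("synapse-sbt-plugins", url("https://dl.bintray.com/synapse/sbt-plugins/"))(Resolver.ivyStylePatterns)
addSbtPlugin("me.synapse" % "my-sbt-plugin" % "0.0.1")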
Is there a command in the SBT console that forces it to re-resolve artifacts (especially SNAPSHOT dependencies)? The only way I know of right now is to run clean and then compile (or start), but this takes much longer and isn't always necessary.
You can mark the needed dependencies as changing, so they are re-checked on every update:
libraryDependencies ++= Seq(
  "org.specs2" %% "specs2" % "1.10-SNAPSHOT" % "test" changing()
)
See also: Re-download a SNAPSHOT version of a dependency using SBT
The update command should help.
From the task's documentation:
Resolves and optionally retrieves dependencies, producing a report.
See Dependency Management Flow.
More importantly, SNAPSHOT dependencies are by their very nature changing(), so there's no need to add anything after the ModuleID to mark them as such. Every update is supposed to resolve them against the repositories.
Perhaps update-classifiers is what you are looking for? Otherwise, try the tasks command to see what's available.
I want to use org.apache.commons.lang.NotImplementedException, as it seems to be the only NotImplementedException implementation in the Java/Scala domain. I remember I used to use it with Scala 2.8.1 with no hacks, but now I get "object lang is not a member of package org.apache.commons". Where has org.apache.commons.lang gone?
I've just found the answer myself. The problem is that Apache Commons 3 no longer includes lang (it includes lang3 instead, which is different and doesn't contain NotImplementedException), so we need Apache Commons 2.6. And what's unobvious here is that its Maven group id is not org.apache.commons but commons-lang, the same as its artifact id.
So I had to add the "commons-lang" % "commons-lang" % "2.6" dependency and run sbt update to make it work.
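For reference, a minimal usage sketch once the dependency resolves; the object and method names below are made up for illustration:
// build.sbt
libraryDependencies += "commons-lang" % "commons-lang" % "2.6"

// Somewhere in your Scala sources:
import org.apache.commons.lang.NotImplementedException

object Stubs {
  // Placeholder that signals unfinished work at runtime.
  def notYetWritten(): Nothing =
    throw new NotImplementedException("this feature is not implemented yet")
}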