scala binary vs scala full version convention - scala

I am reading the following sbt page:
https://www.scala-sbt.org/1.x/docs/Cross-Build.html#Cross-building
But I find the documentation not that great, and would like to clarify something which I think I understand but which is not made explicit in the doc.
What is the convention for:
CrossVersion.binary (_<scala-binary-version>)
CrossVersion.full (_<scala-version>)
In other words, I want a simple example of
_<scala-binary-version>
and of
_<scala-version>
Is it:
Binary version example: 2.12
Scala version example: 2.12.12
Is that the difference between a binary version and the Scala version, where the latter would include compiler access and whatnot?
EDIT1
The following example is given in the page:
These are equivalent:
"a" %% "b" % "1.0"
("a" % "b" % "1.0").cross(CrossVersion.binary)
This overrides the defaults to always use the full Scala version instead of the binary Scala version:
("a" % "b" % "1.0").cross(CrossVersion.full)
I can only guess at what is meant by the second statement, because I have been using the first and know what it does. But it is a guess, and I am just looking for explicit confirmation with an example, which I believe could benefit any new Scala dev in their journey.

So 2.12.12 is a full version (as well as a patch version) and 2.12 is a binary version.
Binary versions are useful because libraries compiled using a different but binary compatible version can be used in your project without any problems. For example, if you are using Scala 2.13.3 you can use a library that was compiled using 2.13.0 or 2.13.4 but not one compiled using 2.12.12.
The full version is useful for things that access the underlying compiler API, which doesn't retain binary compatibility: for example, compiler plugins like kind-projector.
BTW, just for fun, 2.13.0-RC1 is another full version but not a patch version.
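To make the convention concrete, here is a small sbt sketch (the library and plugin versions are just illustrative). On Scala 2.12.12, %% (i.e. CrossVersion.binary) appends the binary version to the artifact name, while cross(CrossVersion.full) appends the full version, which is what compiler plugins need:
// Resolves the artifact "cats-core_2.12" (binary cross-versioning, the %% default)
libraryDependencies += "org.typelevel" %% "cats-core" % "2.1.1"
// Resolves the artifact "kind-projector_2.12.12" (full cross-versioning, needed
// because the plugin links against compiler internals)
addCompilerPlugin("org.typelevel" % "kind-projector" % "0.11.3" cross CrossVersion.full)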

Related

Cross-building Scala libraries

I would like to cross-build some of my Bazel targets to Scala 2.12 and 2.13. As a further point of complexity, I need to be able to express cross-target dependencies (e.g. some 2.13 target may have a Bazel dependency on a 2.12 target).
Note: this isn't a regular library dependency (e.g. with the dependency's 2.12-built JAR showing up on the classpath when compiling the 2.13 JAR), as that would almost surely result in issues due to having two incompatible versions of the Scala standard library on the classpath. Rather, this is just a case where I need the dependency JAR built so I can use it in some integration tests in the 2.13 target.
What I've found online so far...
This issue from rules_scala suggests that it doesn't support baking the Scala version into the target; instead you have to pick the Scala version globally.
This Databricks post has a cross-building section that describes exactly what I think I would like (e.g. one target generated per library per supported Scala version), but the snippets in that post don't seem to be backed by any runnable Bazel code.
This later post by Databricks also hints at a cross_scala_lib rule, but doesn't have any accompanying code either.

Trouble adding tensorflow dependency for scala 2.12.11

I am following a tutorial to perform object detection in Scala. I am having issues adding the TensorFlow dependency. I have followed the instructions on the official TensorFlow for Scala website http://platanios.org/tensorflow_scala/installation.html, but that doesn't seem to work. I also made sure to use the Java 11 JDK for the project. However, whenever I try to add the sbt dependency
libraryDependencies += "org.platanios" % "tensorflow" % "0.4.0" classifier "linux-cpu-x86_64"
I get a "No dependencies found for given import" error in IntelliJ. Any idea how to set this up properly?
Try replacing the single % in your dependency line with %%:
libraryDependencies += "org.platanios" %% "tensorflow" % "0.4.0" classifier "linux-cpu-x86_64"
On top of what the previous answer already suggested, it's probably worth mentioning that (up to and including 2.12) libraries in the 2.x series are not binary-compatible across Scala versions. The convention for Scala libraries is therefore to append _2.x to the published library JAR's artifact identifier. Since sbt was built around Scala (and is its de facto standard build tool), it acknowledges this convention, and the %% operator automatically appends that extra "qualifier" based on the Scala version you are using.
Notice here on mvnrepository.com how the artifact identifier changes between the Maven and the sbt dependency declaration (in Maven, the artifact identifier is tensorflow_2.12; in sbt, the %% saves you from having to specify that).
The single % is generally used for Java dependencies (that are not affected by the aforementioned convention).
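For contrast, a plain Java library carries no Scala-version suffix, so a single % is what you want (the artifact and version below are only illustrative):
// A Java dependency: the artifact identifier is just "guava", with no _2.x suffix
libraryDependencies += "com.google.guava" % "guava" % "28.2-jre"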
As an alternative (that I would suggest just to play around and see that there's no magic involved), you can also use % to specify a Scala dependency and explicitly mention the Scala version in the artifact identifier, as follows:
libraryDependencies += "org.platanios" % "tensorflow_2.12" % "0.4.0" classifier "linux-cpu-x86_64"
The good news is that starting from Scala 2.13 this issue was tackled at the root with an intermediate representation, which was also introduced to ensure interoperability between code compiled with Scala 2.13 and Scala 3.x.
EDIT
What you found was actually an issue in the documentation that had already been reported; I opened a PR to fix it.

Is it possible to specify the scala compiler used by sbt?

I modified the source code of the Scala compiler and built it. Now I want to test this compiler. However, many existing Scala projects use sbt as their build tool. So I wonder if it is possible to replace the official Scala compiler used by sbt with the one I built myself.
See http://www.scala-sbt.org/1.0/docs/Configuring-Scala.html#Using+Scala+from+a+local+directory:
The result of building Scala from source is a Scala home directory <base>/build/pack/ that contains a subdirectory lib/ containing the Scala library, compiler, and other jars. The same directory layout is obtained by downloading and extracting a Scala distribution. Such a Scala home directory may be used as the source for jars by setting scalaHome. For example,
scalaHome := Some(file("/home/user/scala-2.10/"))
If you want to publish the compiler, use #ipoteka's answer.
According to the docs it's straightforward:
managedScalaInstance := false
libraryDependencies += "yourPackage" % "yourScalaCompiler" % version
Don't forget to publishLocal your compiler first.
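Putting those pieces together, a minimal build.sbt sketch could look like the following; the organization, artifact names, and version are placeholders for whatever coordinates you used when running publishLocal on your fork:
// Stop sbt from managing the Scala instance itself
managedScalaInstance := false
// Make the "scala-tool" configuration available so sbt can assemble a Scala
// instance from the dependencies below
ivyConfigurations += Configurations.ScalaTool
// Placeholder coordinates for the locally published compiler and library
libraryDependencies ++= Seq(
  "org.example" % "scala-library"  % "2.11.7-custom",
  "org.example" % "scala-compiler" % "2.11.7-custom" % "scala-tool"
)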

UnsupportedClassVersionError on play with JDK 1.7

I am getting the same error as this post. I'm trying to resolve the problem as mentioned in the proposed solution, but I didn't understand how.
If you are using version 2.4.x (or newer), you must use Java 8. From the Highlights of version 2.4:
Play 2.4 now requires JDK 8. Due to this, Play can, out of the box, provide support for Java 8 data types. For example, Play’s JSON APIs now support Java 8 temporal types including Instant, LocalDateTime and LocalDate.
To confirm that you are using Play 2.4, check the file project/plugins.sbt.
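For instance, a Play 2.4 project will have a line along these lines in project/plugins.sbt (the exact 2.4.x version may differ):
// Notice the 2.4.x plugin version, which implies JDK 8
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0")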
Edit:
If you can't (or don't want to) use Java 8, you have to use Play 2.3 instead. To do so, you must edit project/plugins.sbt to change the Play version used:
// Notice we are now using version 2.3.10
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.10")
If this is a brand new project, you can recreate it using the 2.3 template instead:
activator new play-scala-2.3 name-of-your-project
Or, for Java:
activator new play-java-2.3 name-of-your-project

Is Scala's inability to mix versions in libraries fixed?

I have learned that Scala suffers from a limitation: all Scala bytecode needs to be generated by the same compiler version. E.g. I cannot have a library built for 2.9 work with my application, which is built with 2.9.1.
http://lift.la/blog/scalas-version-fragility-make-the-enterprise
I tried searching the web for more discussion on this issue but cannot find many updates. Is this issue, as of Scala 2.11.6, fixed to any extent?
In Scala, the 'middle' number in the version string is the major version, so in 2.10.x and 2.11.x, the major version is 10 and 11 respectively.
Releases within the same major version are binary compatible. Therefore, if you have a library compiled against Scala 2.11.0, you can safely use it in a project that uses 2.11.6 without recompilation, and vice versa. If your library was compiled for Scala 2.10.5, you would have to recompile it to use it in a Scala 2.11.x project.
If your code doesn't call into deprecated API, it should be source compatible with the subsequent major version.
Most libraries are published for at least two major versions at the same time, so there is quite a bit of elasticity. Take Scalaz as an example: its latest artifacts are cross-built for Scala 2.9.3, Scala 2.10.x, and Scala 2.11.x.
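For library authors, sbt makes publishing for several major versions straightforward via crossScalaVersions. A minimal sketch, with illustrative version numbers and assuming a library artifact called mylib:
// build.sbt
scalaVersion := "2.11.6"
crossScalaVersions := Seq("2.10.5", "2.11.6")
// Running `sbt +publishLocal` then builds and publishes one artifact per listed
// Scala version, e.g. mylib_2.10 and mylib_2.11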