I have a previously working Scala script that fails to compile when I try to run it on a new PC.
So I made a simple script to test:
#!/bin/sh
exec scala -J-Xmx2g "$0" "$@"
!#
println("test")
And trying to run it I get:
test.scala
error: Compile server encountered fatal condition: java.nio.ByteBuffer.clear()Ljava/nio/ByteBuffer;
java.lang.NoSuchMethodError: java.nio.ByteBuffer.clear()Ljava/nio/ByteBuffer;
at scala.tools.nsc.io.SourceReader.read(SourceReader.scala:61)
at scala.tools.nsc.io.SourceReader.read(SourceReader.scala:40)
at scala.tools.nsc.io.SourceReader.read(SourceReader.scala:49)
at scala.tools.nsc.Global.getSourceFile(Global.scala:395)
at scala.tools.nsc.Global.getSourceFile(Global.scala:401)
at scala.tools.nsc.Global$Run$$anonfun$30.apply(Global.scala:1607)
at scala.tools.nsc.Global$Run$$anonfun$30.apply(Global.scala:1607)
at scala.collection.immutable.List.map(List.scala:284)
at scala.tools.nsc.Global$Run.compile(Global.scala:1607)
at scala.tools.nsc.StandardCompileServer.session(CompileServer.scala:151)
at scala.tools.util.SocketServer$$anonfun$doSession$1$$anonfun$apply$1.apply(SocketServer.scala:74)
at scala.tools.util.SocketServer$$anonfun$doSession$1$$anonfun$apply$1.apply(SocketServer.scala:74)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at scala.Console$.withOut(Console.scala:65)
at scala.tools.util.SocketServer$$anonfun$doSession$1.apply(SocketServer.scala:74)
at scala.tools.util.SocketServer$$anonfun$doSession$1.apply(SocketServer.scala:69)
at scala.tools.nsc.io.Socket.applyReaderAndWriter(Socket.scala:49)
at scala.tools.util.SocketServer.doSession(SocketServer.scala:69)
at scala.tools.util.SocketServer.loop$1(SocketServer.scala:85)
at scala.tools.util.SocketServer.run(SocketServer.scala:97)
at scala.tools.nsc.CompileServer$$anonfun$execute$2$$anonfun$apply$mcZ$sp$1.apply$mcZ$sp(CompileServer.scala:218)
at scala.tools.nsc.CompileServer$$anonfun$execute$2$$anonfun$apply$mcZ$sp$1.apply(CompileServer.scala:213)
at scala.tools.nsc.CompileServer$$anonfun$execute$2$$anonfun$apply$mcZ$sp$1.apply(CompileServer.scala:213)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at scala.Console$.withOut(Console.scala:53)
at scala.tools.nsc.CompileServer$$anonfun$execute$2.apply$mcZ$sp(CompileServer.scala:213)
at scala.tools.nsc.CompileServer$$anonfun$execute$2.apply(CompileServer.scala:213)
at scala.tools.nsc.CompileServer$$anonfun$execute$2.apply(CompileServer.scala:213)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at scala.Console$.withErr(Console.scala:80)
at scala.tools.nsc.CompileServer$.execute(CompileServer.scala:212)
at scala.tools.nsc.CompileServer$.main(CompileServer.scala:180)
at scala.tools.nsc.CompileServer.main(CompileServer.scala)
It seems like Scala crashes while trying to compile my script, but I don't quite know how to debug or fix this.
TL;DR
Ubuntu's Scala package used to be incompatible with Java 8 (this was fixed in 2.11.12-4). The solution was to uninstall Ubuntu's Scala package and install one of the official Scala packages. You might still want to do that, this time not because of an incompatibility with Java, but because Ubuntu's latest packaged Scala version is still 2.11, while Scala's latest version is currently 2.13:
sudo apt remove scala-library scala
wget https://downloads.lightbend.com/scala/2.13.4/scala-2.13.4.deb
sudo dpkg -i scala-2.13.4.deb
Since many people were asking for the reason behind this issue and I was also curious about what caused it, I did some digging...
The root of the problem
In Java 9, the Buffer subclasses (including ByteBuffer) were changed to override the methods that return Buffer in the superclass so that they return the respective subtype instead.
Bug: https://bugs.java.com/bugdatabase/view_bug.do?bug_id=JDK-4774077
Commit: https://github.com/AdoptOpenJDK/openjdk-jdk9/commit/d9d7e875470bf478110b849315b4fff55b4c35cf
This change is not binary backward compatible. If some Java code which calls one of these methods directly on one of Buffer's subclasses is compiled with JDK9+, the generated bytecode will not run on JRE8 (even if the returned value is not used at all). This happens because the call site will be compiled with the signature java.nio.ByteBuffer.clear()Ljava/nio/ByteBuffer; which doesn't exist in JRE8. If compiled with JDK8, however, the signature compiled into the bytecode would be java/nio/ByteBuffer.clear:()Ljava/nio/Buffer; which resolves to the method inherited from the Buffer class in both JRE8 and JRE9+.
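To make the failure mode concrete, here is a minimal sketch (not taken from the Scala sources) of a call site that triggers it:

import java.nio.ByteBuffer

object BufferDemo {
  def main(args: Array[String]): Unit = {
    val bytes = ByteBuffer.allocate(16)
    // Compiled with JDK9+, this call is emitted as
    //   invokevirtual java/nio/ByteBuffer.clear:()Ljava/nio/ByteBuffer;
    // which is missing on JRE8, so running it there throws NoSuchMethodError.
    // Compiled with JDK8, it is emitted as
    //   invokevirtual java/nio/ByteBuffer.clear:()Ljava/nio/Buffer;
    // which resolves on both JRE8 and JRE9+.
    bytes.clear()
  }
}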
Where did Scala go wrong? (Or did it?)
The Scala compiler does use some of the methods affected by the change above, particularly in the SourceReader class, where the error in the OP's question happened.
Looking at Scala's compatibility matrix, it says that we need at least Scala 2.11.12 to use JDK11, but it doesn't say much explicitly about the opposite direction of compatibility. It does say though that "Scala 2.12+ definitely doesn't work at all on JDK 6 or 7", so we could expect that 2.12+ is still compatible with JDK8, and even more so Scala 2.11.
Why did they break compatibility, then? Couldn't they just compile Scala's source code with an older JDK version? They could, and they did; so much so that they still do.
If we download one of the official Scala packages and check the manifest file for scala-compiler.jar, this is what we find:
Scala 2.11.12:
Bundle-Name: Scala Compiler
Bundle-RequiredExecutionEnvironment: JavaSE-1.6, JavaSE-1.7
Bundle-SymbolicName: org.scala-lang.scala-compiler
Bundle-Version: 2.11.12.v20171031-225310-b8155a5502
Created-By: 1.6.0_45 (Sun Microsystems Inc.)
Scala 2.13.4:
Bundle-Name: Scala Compiler
Bundle-RequiredExecutionEnvironment: JavaSE-1.8
Bundle-SymbolicName: org.scala-lang.scala-compiler
Bundle-Version: 2.13.4.v20201117-181115-VFINAL-39148e4
Created-By: 1.8.0_275 (AdoptOpenJDK)
So it seems Scala 2.11 is still compiled with JDK6 and Scala 2.13 is still compiled with JDK8. Shouldn't that mean they are both compatible with JRE8? Yes, and indeed they are. Where is the error coming from, then?
Where did Ubuntu go wrong?
Ubuntu, as most other Linux distributions do, likes to build its own packages that are made available through its package manager. This is done to ensure that everything works properly within the OS ecosystem, and that often means patching the source code of upstream projects.
Regarding the Scala package in particular, Ubuntu decided to ditch the upstream choices of JDK versions used to compile the Scala source code and has been using newer JDK versions to compile Ubuntu's Scala package for a while.
If we check the manifest file for scala-compiler.jar in Ubuntu's Scala 2.11.12-4, we can see that it was compiled with JDK11:
Created-By: 11.0.2+9-Ubuntu-3ubuntu1 (Oracle Corporation)
Bundle-Name: Scala Distribution
Bundle-SymbolicName: org.scala-ide.scala.compiler;singleton:=true
Bundle-Version: 2.11.12
Didn't you say the issue was resolved in 2.11.12-4? Yes, I did.
Ubuntu's solution for this problem was not to compile Scala with JDK8, but rather to patch Scala's source code to avoid calling the problematic methods directly on the subclasses. This was achieved by casting ByteBuffer (and CharBuffer) to their superclass Buffer before calling these methods. In practice, that meant changing Scala's source code from bytes.clear() to bytes.asInstanceOf[Buffer].clear().asInstanceOf[ByteBuffer] (not sure why they cast it back to ByteBuffer when the result of clear() doesn't seem to be used at all). Here is Ubuntu's patch.
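A minimal sketch of the patched call shape (adapted for illustration, not the literal patch):

import java.nio.{Buffer, ByteBuffer}

object PatchedDemo {
  def main(args: Array[String]): Unit = {
    val bytes = ByteBuffer.allocate(16)
    // The upcast makes the compiler emit the call against java.nio.Buffer,
    //   invokevirtual java/nio/Buffer.clear:()Ljava/nio/Buffer;
    // which exists on JRE8 as well as JRE9+.
    bytes.asInstanceOf[Buffer].clear().asInstanceOf[ByteBuffer]
  }
}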
Ubuntu's approach seems a bit dangerous, because other sources of incompatibility could have gone unnoticed and still be lurking, waiting to surface in some very specific situation. Also, having their own setup, different from the official Scala releases, means not having the whole Scala community testing these changes in real-world scenarios.
It works for me with version 2.11.12 by disabling fsc (the compile server):
#!/usr/bin/env -S scala -nc
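For completeness, a full test script using that shebang would look like this (-nc skips the fsc compile daemon; env -S requires a reasonably recent GNU coreutils, 8.30 or later):

#!/usr/bin/env -S scala -nc
println("test")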
Related
I would like to cross-build some of my Bazel targets to Scala 2.12 and 2.13. As a further point of complexity, I need to be able to express cross-target dependencies (e.g. some 2.13 target may have a Bazel dependency on a 2.12 target).
Note: this isn't a regular library dependency (e.g. with the 2.12-built dependency JAR showing up on the class path when compiling the 2.13 JAR), as that would almost surely result in issues due to having two incompatible versions of the Scala standard library on the class path. Rather, this is just a case where I need the dependency JAR built so I can use it in some integration tests in the 2.13 target.
What I've found online so far...
This issue from rules_scala suggests that it doesn't support baking the Scala version into the target; instead, you have to pick the Scala version globally.
This Databricks post has a cross-building section that describes exactly what I think I would like (e.g. one target generated per library per supported Scala version), but the snippets in that post don't seem to be backed by any runnable Bazel code.
This later post by Databricks also hints at a cross_scala_lib rule, but also doesn't have any accompanying code.
There are multiple binary-incompatible Scala 2 versions; however, the documentation says the installation is either via an IDE or sbt.
DOWNLOAD SCALA 2
Then, install Scala:...either by installing an IDE such as IntelliJ, or sbt, Scala's build tool.
Spark 3 needs Scala 2.12.
Spark 3.1.2 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).
Then how can we make sure the Scala version is 2.12 if we install sbt?
Or is the documentation inaccurate, and should it instead say "to use a specific version of Scala, you need to download that specific Scala version on your own"?
Updates
As per the answer by mario-galic, ONE-CLICK INSTALL FOR SCALA says:
Installing Scala has always been a task more challenging than necessary, with the potential to drive away beginners. Should I install Scala itself? sbt? Some other build tools? What about a better REPL like Ammonite? Oh and before all that I need to install Java?
To solve this problem, the Scala Center contracted Alexandre Archambault in January 2020 to add a one-click install of Scala through coursier. For example, on Linux, all we now need is:
$ curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
The Scala version is specified in the build.sbt file, so sbt will download the appropriate version of Scala as necessary.
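For example, a minimal build.sbt pinning a project to a Spark-compatible Scala version (the exact patch version here is illustrative):

// build.sbt
scalaVersion := "2.12.15"  // sbt fetches this exact Scala version automatically
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2"  // resolves spark-core_2.12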
I personally use SDKMAN! to install Java and then SBT.
The key concept to understand is the difference between a system-wide installation and a project-specific version. A system-wide installation ends up somewhere on the PATH, like
/usr/local/bin/scala
and can be installed in various ways; personally, I recommend the coursier one-click install for Scala:
curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
The project-specific version is specified by the scalaVersion sbt setting, which downloads Scala to the coursier cache location. To see the Scala version and location used by a particular project, try show scalaInstance, a task which inspect scalaInstance describes as follows:
inspect scalaInstance
[info] Task: sbt.internal.inc.ScalaInstance
[info] Description:
[info] Defines the Scala instance to use for compilation, running, and testing.
Scala should be binary compatible within a minor version, so Spark 3, or any other software built against some Scala 2.12.x version, should work with any other Scala 2.12.x version (version numbers being major.minor.patch). Note that binary compatibility is not guaranteed for internal compiler APIs, so when publishing compiler plugins, for example, the best practice is to publish against the full, specific Scala version. Notice how the kind-projector compiler plugin is published against the full Scala version 2.13.6
https://repo1.maven.org/maven2/org/typelevel/kind-projector_2.13.6/
whilst cats, an application-level library, is published against any Scala 2.13.x version
https://repo1.maven.org/maven2/org/typelevel/cats-core_2.13/
Similarly, Spark is published against any Scala 2.12.x version
https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/
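In sbt terms, this difference corresponds to CrossVersion.binary (the default with %%) versus CrossVersion.full; a minimal sketch (versions illustrative):

// %% appends the binary suffix, resolving cats-core_2.13
libraryDependencies += "org.typelevel" %% "cats-core" % "2.6.1"
// compiler plugins are pinned to the full Scala version, resolving kind-projector_2.13.6
addCompilerPlugin("org.typelevel" % "kind-projector" % "0.13.0" cross CrossVersion.full)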
Regarding system-wide installation, one trick I use for quickly switching versions is to put scala-runners on the PATH; then different versions can be launched via the --scala-version argument:
scala --scala-version 2.12.14
Using coursier or scala-runners, you can even switch the JDK quickly via -C--jvm, for example:
scala --scala-version 2.12.14 -C--jvm=11
For a project, there should be no need to manually download a specific version of Scala. sbt, either directly or indirectly via an IDE, will download all the dependencies behind the scenes for you, so the only thing to specify is the sbt setting scalaVersion.
Using Python as an analogy to Scala, and Pipenv as an analogy to sbt, python_version in a Pipfile is similar to scalaVersion in build.sbt. After executing pipenv shell and pipenv install, you end up with a project-specific shell environment with a project-specific Python version and dependencies. sbt similarly downloads the project-specific Scala version and dependencies based on build.sbt, although it has no need for lock files or for modifying your shell environment.
https://docs.scala-lang.org/overviews/compiler-options/index.html says
Scala compiler scalac offers various compiler options, also referred to as compiler flags, to change how to compile your program.
Nowadays, most people are not running scalac from the command line. Instead, they use sbt, an IDE, and other tools as their interface to the compiler. Therefore they may not even have scalac installed, and won’t think to do man scalac.
Does "the compiler" refer to scalac?
If yes, is "they use sbt, an IDE, and other tools as their interface to the compiler" contrary to "therefore they may not even have scalac installed"?
Does sbt rely on scalac?
Thanks.
The Scala compiler can be accessed programmatically via an API packaged in the scala-compiler.jar dependency, hence tools such as IDEs and sbt can implement their own client frontends over this API to drive the compiler's functionality. scalac itself is just a shell script that executes the scala.tools.nsc.Main class from scala-compiler.jar.
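As a rough illustration of that API, here is a minimal sketch of driving the Scala 2 compiler programmatically (this is not how sbt itself wires things up):

import scala.tools.nsc.{Global, Settings}
import scala.tools.nsc.reporters.ConsoleReporter

object CompileViaApi {
  def main(args: Array[String]): Unit = {
    val settings = new Settings()
    settings.usejavacp.value = true              // compile against the JVM's classpath
    val reporter = new ConsoleReporter(settings) // report errors to the console
    val global   = new Global(settings, reporter)
    val run      = new global.Run                // the same Run.compile seen in the stack trace above
    run.compile(List("Hello.scala"))             // path to a source file, assumed to exist
  }
}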
Does sbt rely on scalac?
No, sbt uses the compiler API directly. One of the key concepts to understand regarding sbt is that the build definition is itself Scala code, either vanilla or DSL, but Scala nevertheless. The version of Scala used to compile the build definition is separate from the version of Scala used to compile the project proper. The build definition source code in build.sbt and project/*.scala will be compiled using the Scala version specified indirectly via the sbt.version=1.2.8 setting in project/build.properties, whilst the project source code proper in src/main/scala/* will be compiled using the Scala version specified directly via the scalaVersion := "2.13.1" setting in build.sbt. Note how they can indeed differ. Think of the build definition as simply another Scala app which uses the sbt API for its implementation.
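A concrete sketch of that split (file contents abbreviated, versions as quoted above):

project/build.properties:
sbt.version=1.2.8

build.sbt:
scalaVersion := "2.13.1"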
I have learned that Scala suffers from a limitation whereby all Scala bytecode needs to be generated by the same compiler version; e.g., I cannot have a library built for 2.9 work with my application built with 2.9.1.
http://lift.la/blog/scalas-version-fragility-make-the-enterprise
I tried searching the web for more discussion on this issue but cannot find much that is current. Is this issue fixed to any extent as of Scala 2.11.6?
In Scala, the 'middle' number in the version string is the major version, so in 2.10.x and 2.11.x, the major version is 10 and 11 respectively.
Releases within the same major version are binary compatible. Therefore, if you have a library compiled against Scala 2.11.0, you can safely use it in a project that uses 2.11.6 without recompilation, and vice versa. If your library was compiled for Scala 2.10.5, you would have to recompile it to use it in a Scala 2.11.x project.
If your code doesn't call into deprecated API, it should be source compatible with the subsequent major version.
Most libraries are published for at least two major versions at the same time, so there is quite a bit of elasticity. Take Scalaz as an example: its latest artifacts are cross-built for Scala 2.9.3, Scala 2.10.x, and Scala 2.11.x.
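Library authors typically achieve this with sbt's crossScalaVersions setting; a minimal sketch (versions illustrative):

// build.sbt: build and publish one artifact per listed Scala version,
// e.g. via `sbt +publishLocal`
crossScalaVersions := Seq("2.9.3", "2.10.5", "2.11.6")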
When trying to compile my game with roboVM, I keep getting the error:
java.lang.IncompatibleClassChangeError: class org.robovm.compiler.plugin.objc.ObjCProtocolProxyPlugin$1 has interface org.objectweb.asm.ClassVisitor as super class
I have investigated for many hours, coming to the conclusion that it has to do with the ASM library: in ASM, up to version 3.3.2, the class ClassVisitor was an interface. It was promoted to an abstract class in 4.0, and the roboVM backend bytecode uses a version >= 4.0 while my sbt build tries to use a version < 4.0.
The roboVM code in question can be found here: https://github.com/robovm/robovm/blob/master/compiler/src/main/java/org/robovm/compiler/plugin/objc/ObjCProtocolProxyPlugin.java#L145
Now, while I realize that this is the issue, I have no idea how to fix / work around it. I do not want to compile libGDX from source...
To set up my app I used existing templates, namely this one: https://github.com/ajhager/libgdx-sbt-project.g8. Also, I use the latest versions respectively:
sbt 0.13.5
libGDX 1.4.1
scala 2.11.3
roboVM 1.0.0-alpha-04
When I investigated further, searching for the culprit in this conglomerate, I found that two 'asm's were indeed included in the classpath, one of them being the version 3.3.1 mentioned earlier:
scalac -classpath ...:~/.ivy2/cache/asm/asm-all/jars/asm-all-3.3.1.jar:...:~/.ivy2/cache/org.ow2.asm/asm-all/jars/asm-all-4.2.jar:...
This obviously caused the crash. Now I only had to find the place where 3.3.1 was set as a dependency, and I was rather quick in finding it, at long last: pfn/android-sdk-plugin. For whatever reason, they set this as a dependency (albeit somehow not using it in their code). There were evidently no resolution conflicts since the group IDs differed: asm:asm-all:3.3.1 vs org.ow2.asm:asm-all:4.2.
This is easily the dumbest thing I have ever walked across and I'm grinding my teeth that it took so long and so much debugging to get behind it. Hmpf!
I fixed it by cloning the android-sdk-plugin repository and adjusting the ASM version / group ID to org.ow2.asm 4.2. I then ran sbt publish-local and bumped the plugin version in my project to match the cloned SNAPSHOT version.
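An alternative that might have avoided cloning (untested; the plugin version here is illustrative) would be to exclude the stale ASM artifact at the point where the plugin is added, in project/plugins.sbt:

// project/plugins.sbt: drop the old asm:asm-all pulled in transitively,
// leaving only org.ow2.asm:asm-all:4.2 on the classpath
addSbtPlugin("com.hanhuy.sbt" % "android-sdk-plugin" % "1.2.20" exclude("asm", "asm-all"))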
I hope this will help anyone that stumbles across this behaviour.
So long,
Danyel.