Quickly switching Scala versions from the CLI

nvm (Node Version Manager) enables quick selection of an alternative Node version to work with. Does Scala have a similar way of quickly switching between versions in the current shell? For example, say I want to start the REPL with 2.12.10; then executing something like
scala use 2.12.10
would greet with
Welcome to Scala 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_202).
Type in expressions for evaluation. Or try :help.
Note the question is not about sbt and its scalaVersion setting, but about using the scala command directly from the command line.

As an alternative to scala-runners, there is also SDKMAN!, which is a more general tool but also supports Scala. For example, the command
sdk use scala 2.12.10
sets Scala to 2.12.10 for the current shell session.
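A typical SDKMAN! workflow, assuming SDKMAN! itself is already installed, might look like the following (the available version list depends on what SDKMAN! currently publishes):

# list the Scala versions SDKMAN! knows about
sdk list scala
# install a specific version (a one-time download)
sdk install scala 2.12.10
# switch the current shell session to it
sdk use scala 2.12.10
# or make it the default for all new shells
sdk default scala 2.12.10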

dwijnand/scala-runners
An alternative implementation of the Scala distribution's runners:
scala, scalac, scaladoc, and scalap (no fsc). They are implemented as
thin shell scripts around Coursier's coursier launch to add some Scala
runners-specific (power) options.
They provide a quick way of starting different Scala versions:
scala --scala-version 2.12.10
and even development versions from
https://scala-ci.typesafe.com/artifactory/scala-integration
https://scala-ci.typesafe.com/artifactory/scala-pr-validation-snapshots
For example:
scala --scala-version 2.13.2-bin-81d1da3
or a particular pull request:
scala --scala-pr 8960

Related

How to install scala 2.12

There are multiple binary-incompatible Scala 2 versions; however, the documentation says installation is done either via an IDE or via sbt.
DOWNLOAD SCALA 2
Then, install Scala:...either by installing an IDE such as IntelliJ, or sbt, Scala's build tool.
Spark 3 needs Scala 2.12.
Spark 3.1.2 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).
Then how can we make sure the Scala version is 2.12 if we install sbt?
Or is the documentation inaccurate, and should it instead say "to use a specific version of Scala, you need to download that specific Scala version on your own"?
Updates
As per the answer by mario-galic, ONE-CLICK INSTALL FOR SCALA states:
Installing Scala has always been a task more challenging than necessary, with the potential to drive away beginners. Should I install Scala itself? sbt? Some other build tools? What about a better REPL like Ammonite? Oh and before all that I need to install Java?
To solve this problem, the Scala Center contracted Alexandre Archambault in January 2020 to add a one-click install of Scala through coursier. For example, on Linux, all we now need is:
$ curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
The Scala version is specified in the build.sbt file, so sbt will download the appropriate version of Scala as necessary.
I personally use SDKMAN! to install Java and then sbt.
The key concept to understand is the difference between a system-wide installation and a project-specific version. A system-wide installation ends up somewhere on the PATH, like
/usr/local/bin/scala
and can be installed in various ways; personally, I recommend Coursier's one-click install for Scala:
curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
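Once the cs binary is on the PATH, Coursier can also launch a particular Scala version directly, with no system-wide switch at all. A minimal sketch, assuming Coursier's standard application descriptors (the version numbers are only examples):

# fetch (on first use) and start the REPL for a specific Scala version
cs launch scala:2.12.10
# the compiler can be launched the same way
cs launch scalac:2.12.10 -- -version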
A project-specific version is specified by the scalaVersion sbt setting, and sbt downloads that Scala version to the Coursier cache location. To see the Scala version and location used by a particular project, try show scalaInstance, whose description can be seen with
inspect scalaInstance
[info] Task: sbt.internal.inc.ScalaInstance
[info] Description:
[info] Defines the Scala instance to use for compilation, running, and testing.
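For completeness, a minimal build.sbt pinning the project-specific version might look like this (the project name and version number are only examples):

// build.sbt: sbt fetches exactly this Scala version into the Coursier cache
name := "my-project"
scalaVersion := "2.12.10"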
Scala should be binary compatible within a minor version, so Spark 3, or any other software built against some Scala 2.12.x version, should work with any other Scala 2.12.x version (where versions have the form major.minor.patch). Note that binary compatibility is not guaranteed for internal compiler APIs, so when publishing compiler plugins the best practice is to publish against the full, specific Scala version. For example, notice how the kind-projector compiler plugin is published against the full Scala version 2.13.6
https://repo1.maven.org/maven2/org/typelevel/kind-projector_2.13.6/
whilst cats-core, an application-level library, is published against any Scala 2.13.x version:
https://repo1.maven.org/maven2/org/typelevel/cats-core_2.13/
Similarly, Spark is published against any Scala 2.12.x version:
https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/
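In sbt this distinction is visible in how dependencies are declared: the %% operator appends the binary suffix (_2.12, _2.13), while compiler plugins are typically pinned with CrossVersion.full. A sketch, with illustrative version numbers:

// ordinary library: %% resolves cats-core_2.13 for any Scala 2.13.x
libraryDependencies += "org.typelevel" %% "cats-core" % "2.6.1"
// compiler plugin: resolves kind-projector_2.13.6, matching the full Scala version
addCompilerPlugin("org.typelevel" % "kind-projector" % "0.13.0" cross CrossVersion.full)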
Regarding system-wide installation, one trick I use for quickly switching versions is to put scala-runners on the PATH; different versions can then be launched via the --scala-version argument:
scala --scala-version 2.12.14
Using Coursier or scala-runners, you can even switch the JDK quickly via -C--jvm, for example:
scala --scala-version 2.12.14 -C--jvm=11
For a project, there should be no need to manually download a specific version of Scala. sbt, either directly or indirectly via an IDE, will download all the dependencies behind the scenes for you, so the only thing to specify is the sbt setting scalaVersion.
Using Python as an analogy to Scala, and Pipenv as an analogy to sbt: python_version in a Pipfile is similar to scalaVersion in build.sbt. After executing pipenv shell and pipenv install, you end up with a project-specific shell environment with a project-specific Python version and dependencies. sbt similarly downloads the project-specific Scala version and dependencies based on build.sbt, although it has no need for lock files or for modifying your shell environment.

How to enable Partial Unification in Spark REPL with Scala 2.11.8?

I have Scala code written in Scala 2.11.12 using the partial-unification compiler option, which I would like to run in a Spark 2.2.2 REPL.
With a Spark version compiled against Scala 2.11.12 (i.e. 2.3+), this is possible in the Spark REPL via :settings -Ypartial-unification, and the code executes.
I want to run this on Spark 2.2.2, which is compiled against Scala 2.11.8.
To do this, I have downloaded the jar with the partial unification compiler plugin (source from: https://github.com/milessabin/si2712fix-plugin), which backports this setting.
I've played around with a plain Scala 2.11.8 REPL (adding the JAR to the classpath seems too rudimentary) and haven't managed to get it working there, before even trying to add it to Spark. Does anyone know how to do this, or is adding a compiler setting to a REPL via a JAR simply not possible?
Any other advice appreciated!
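For reference: compiler plugins are normally loaded via the -Xplugin compiler option rather than the classpath, and the plain Scala REPL accepts that option on the command line. A hedged sketch, with a hypothetical local file name for the plugin JAR:

# compiler plugins are enabled via -Xplugin, not by classpath entries
# (the JAR file name below is hypothetical)
scala -Xplugin:si2712fix-plugin_2.11.8-1.2.0.jar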

How to choose the scala version for my spark program?

I am building my first Spark application, developing with IDEA.
In my cluster, the version of Spark is 2.1.0, and the version of Scala is 2.11.8.
http://spark.apache.org/downloads.html tells me: "Starting version 2.0, Spark is built with Scala 2.11 by default. Scala 2.10 users should download the Spark source package and build with Scala 2.10 support".
So here is my question: what is the meaning of "Scala 2.10 users should download the Spark source package and build with Scala 2.10 support"? Why not use Scala 2.1.1?
Another question: which version of Scala can I choose?
First a word about the "why".
The reason this subject even exists is that Scala versions are not (generally speaking) binary compatible, although most of the time source code is compatible.
So you can take Scala 2.10 source and compile it into 2.11.x or 2.10.x versions, but 2.10.x compiled binaries (JARs) cannot be run in a 2.11.x environment.
You can read more on the subject.
Spark Distributions
So, the Spark package, as you mention, is built for Scala 2.11.x runtimes.
That means you cannot run a Scala 2.10.x JAR of yours on a cluster / Spark instance that runs the spark.apache.org-built distribution of Spark.
What would work is:
You compile your JAR for Scala 2.11.x and keep the same Spark, or
You recompile Spark for Scala 2.10 and keep your JAR as is.
What are your options
Compiling your own JAR for Scala 2.11 instead of 2.10 is usually far easier than compiling Spark in and of itself (lots of dependencies to get right).
Usually, your Scala code is built with sbt, and sbt can target a specific Scala version (see, for example, this thread on SO). It is a matter of specifying:
scalaVersion in ThisBuild := "2.10.0"
You can also use sbt to "cross build", that is, build different JARs for different Scala versions:
crossScalaVersions := Seq("2.11.11", "2.12.2")
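With crossScalaVersions set, prefixing an sbt command with + runs it once per listed Scala version, for example:

# build one JAR per version in crossScalaVersions
sbt +package
# or publish artifacts for all of them to the local repository
sbt +publishLocal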
How to choose a Scala version
Well, this is "sort of" opinion based. My recommendation would be: choose the Scala version that matches your production Spark cluster.
If your production Spark is 2.3 downloaded from https://spark.apache.org/downloads.html, then as they say, it uses Scala 2.11 and that is what you should use too. Using anything else, in my view, just leaves the door open for various incompatibilities down the road.
Stick with what your production needs.

com.databricks.spark.csv version requirement

Which version of com.databricks.spark.csv is compatible with Spark 1.6.1 and Scala 2.10.5?
I can see
com.databricks_spark-csv_2.10-1.5.0.jar
com.databricks_spark-csv_2.11-1.3.0.jar
already available on my machine, and as my understanding goes, if I have Scala version 2.10, then the first option is the one that I have to use. I just wanted to re-confirm.
You should use the first JAR, as you have Scala 2.10 on your machine:
com.databricks_spark-csv_2.10-1.5.0.jar
The _2.10 suffix means it is built for Scala 2.10.
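If the dependency is declared in sbt rather than picked from local JARs, the %% operator selects the matching suffix automatically based on scalaVersion. A minimal sketch, assuming Scala 2.10.5 is configured:

// with scalaVersion set to 2.10.5, %% resolves spark-csv_2.10
scalaVersion := "2.10.5"
libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"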

Is there a way to set the Scala version used in an Ammonite script?

Is there a way to set the Scala version used in an Ammonite script?
I've just started using Ammonite and at first blush it seems far superior to the scalas script runner that I've been using up until now. With scalas, however, I can easily set the Scala version used in the script. E.g.,
#!/usr/bin/env scalas
/***
scalaVersion := "2.11.8"
*/
I can't find any reference to any similar declaration in the Ammonite documentation.
When the question was originally posted, there was no way with Ammonite to specify the version of Scala that you wanted to use. The Scala version was hardwired to 2.12.1 at the time.
There is still no way to set the version of Scala that you would like inside an Ammonite script, as you can with scalas. Fortunately, Ammonite is distributed in three different versions. You can download a Scala 2.12 version of Ammonite if you would like; there are also Scala 2.11 and Scala 2.10 versions available for download.
https://ammonite.io/#OlderScalaVersions
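For example, fetching a Scala 2.11 build of Ammonite might look like the sketch below; release files follow a <scala-version>-<ammonite-version> naming pattern, but the exact release and file names should be checked against the releases page:

# download a Scala 2.11 build of a given Ammonite release
# (version numbers are illustrative; check the releases page)
curl -L -o amm https://github.com/lihaoyi/ammonite/releases/download/1.1.2/2.11-1.1.2
chmod +x amm
./amm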