I'm working on several Scala projects and libraries.
Some of them target Scala 2.10.x and some target 2.11.x.
My default $SCALA_HOME version is 2.11, so if I build something, will it be built against 2.11 by default? (Is that true?)
My main issue is with Apache Spark and Kafka, which work fine with the 2.10.x version.
How do you handle multiple Scala versions as dependencies on one machine?
thanks,
miki
I would not install a global Scala on the system, nor do I have a $SCALA_HOME. I think most people would recommend against it.
If you use a build tool like sbt, that will take care of applying the correct Scala version as required by your build definition. In sbt, you have a file build.sbt that contains an assignment scalaVersion := "2.10.4" or scalaVersion := "2.11.5" depending on which version you want to use. It keeps all the different Scala versions neatly separated in a cache directory and won't confuse them.
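For example, a minimal build.sbt might look like the sketch below (the project name and version are placeholders, not taken from the question):
// build.sbt at the project root
name := "my-project"              // placeholder name
version := "0.1.0-SNAPSHOT"       // placeholder version
scalaVersion := "2.10.4"          // or "2.11.5", whichever this project needs
sbt then downloads and uses exactly that Scala version to compile and run the project.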
Related
I have a weird problem with sbt. I have the Scala ZIO version set to 1.0.12 in build.sbt:
val zio = "1.0.12"
But when I ran the application with sbt, it downloaded ZIO version 2.x (see screenshot):
I have no idea why. I removed the .ivy2 and .sbt directories from my user directory. I restarted IntelliJ many times and invalidated caches. Even with a clean project it always downloads version 2.0.0, and the whole codebase is inspected against this version.
Other ZIO-related library versions I use:
val scalaVersion = "2.13.8"
val zio = "1.0.12"
val zioInteropCats = "3.2.9.0"
val zioInteropLog = "1.0.1"
Do you have any ideas why it works like this? I do not need to use the newest ZIO version.
I see in your comment you already solved this, but here's how you'd solve it in general:
This kind of thing happens when two of your dependencies require different versions of a library. Your explicitly-set version is being "evicted" in favor of a higher version that something else requires.
You can find this info by running sbt evicted. If your sbt version is recent (as of 2022), it can use the versionScheme of Scala libraries (for versions published after that feature was added) to alert you with an error when an eviction is likely to be incompatible.
The solution, as you've found, is to locate the dependency that is bringing in the incompatible version and then resolve the conflict by changing the version of either that dependency or the others.
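As a general illustration (a sketch only, not specific to this question), sbt also lets you pin a version explicitly with dependencyOverrides. Note that forcing a version across a binary-incompatible boundary such as ZIO 1.x vs 2.x can still break at runtime, so fixing the conflicting dependency is usually the better option:
// build.sbt: force a specific version even if another dependency pulls in a newer one
dependencyOverrides += "dev.zio" %% "zio" % "1.0.12"
Running sbt evicted again afterwards shows whether the override took effect.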
I am porting a small legacy library from Scala 2.12 to Scala 2.13. The sbt version is 1.3.3. The project is flat and relatively simple, and the scalaVersion declared in the project is 2.13.1.
I am executing the clean and compile tasks, and then publishing to both the local Ivy repository and to Artifactory.
The process seemingly goes fine and creates the artifact with the _2.13 suffix. But when this binary is executed against the Scala 2.13 runtime, it fails with a MethodNotFound exception. Further inspection shows that the artifact was compiled for 2.12, not for 2.13.
Does anybody have an idea why a different compiler version was used by sbt, and how to fix this problem?
I just had a similar issue: sbt was compiling my project with the wrong Scala version, and I found this question without answers on Google.
So, my problem was actually very simple. It turns out you need to start sbt in the project root directory (where build.sbt is located). I was running it from the directory where all my .scala files were located, so it hadn't parsed build.sbt and compiled the project using the default Scala version (2.12 in my case).
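For reference, a rough sketch of the conventional layout (file and directory names here are illustrative, not taken from the original project); sbt must be started from the directory containing build.sbt:
// build.sbt, in the project root; start sbt from this directory
// project-root/
//   build.sbt
//   src/main/scala/Example.scala
scalaVersion := "2.13.1"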
I am building my first Spark application, developing it with IntelliJ IDEA.
In my cluster, the version of Spark is 2.1.0, and the version of Scala is 2.11.8.
http://spark.apache.org/downloads.html tells me: "Starting version 2.0, Spark is built with Scala 2.11 by default. Scala 2.10 users should download the Spark source package and build with Scala 2.10 support".
So here is my question: what is the meaning of "Scala 2.10 users should download the Spark source package and build with Scala 2.10 support"? Why not just use Scala 2.11?
Another question: which version of Scala should I choose?
First a word about the "why".
The reason this subject even exists is that Scala versions are not (generally speaking) binary compatible, although most of the time the source code is compatible.
So you can take Scala 2.10 source and compile it into 2.11.x or 2.10.x versions. But 2.10.x compiled binaries (JARs) cannot be run in a 2.11.x environment.
You can read more on the subject.
Spark Distributions
So, the Spark package, as you mention, is built for Scala 2.11.x runtimes.
That means you cannot run a Scala 2.10.x JAR of yours on a cluster / Spark instance that runs the spark.apache.org-built distribution of Spark.
What would work is either of the following:
You compile your JAR for Scala 2.11.x and keep the same Spark, or
You recompile Spark for Scala 2.10 and keep your JAR as is.
What are your options
Compiling your own JAR for Scala 2.11 instead of 2.10 is usually far easier than compiling Spark in and of itself (lots of dependencies to get right).
Usually, your Scala code is built with sbt, and sbt can target a specific Scala version; see, for example, this thread on SO. It is a matter of specifying:
scalaVersion in ThisBuild := "2.10.0"
You can also use sbt to "cross build", that is, build different JARs for different scala versions.
crossScalaVersions := Seq("2.11.11", "2.12.2")
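As a rough sketch (version numbers are illustrative), a cross-built build definition combines both settings; prefixing a task with + then runs it for every listed version:
// build.sbt
scalaVersion := "2.11.11"                       // default version used by plain `sbt compile`
crossScalaVersions := Seq("2.11.11", "2.12.2")  // versions built by `sbt +package`
Running sbt +package then produces one JAR per listed Scala version, each with the matching _2.11 / _2.12 suffix.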
How to choose a Scala version
Well, this is "sort of" opinion based. My recommendation would be: choose the Scala version that matches your production Spark cluster.
If your production Spark is 2.3 downloaded from https://spark.apache.org/downloads.html, then as they say, it uses Scala 2.11 and that is what you should use too. Using anything else, in my view, just leaves the door open for various incompatibilities down the road.
Stick with what your production needs.
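A hedged sketch of what that might look like for a Spark 2.3 cluster (coordinates and versions shown only as an example): the %% operator makes sbt pick the artifact matching your scalaVersion, so your JAR and the cluster stay on the same Scala binary version.
// build.sbt for an application deployed to a Spark 2.3 / Scala 2.11 cluster
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0" % "provided"  // resolves to spark-core_2.11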
Is there a way to upgrade the installed Scala version via sbt or another command-line tool?
I'm sure there is a way, but I couldn't find one after a quick search. Am I missing something?
Each sbt project specifies the version of Scala used to compile and run its code. It defaults to the version of Scala that sbt uses internally, but it is always overridable.
E.g.
scalaVersion := "2.10.0"
As Connor Doyle mentioned, if your OS has a package system that includes Scala (some Linux distros I know of do) and you are, for some reason, obligated to use that, you are pretty much at their mercy to provide a new version on a timely basis. The Scala website (downloads here) provides a variety of installers and tarballs / ZIP archives for every release they've made.
Mac OS X users can use Homebrew to get up-to-date sbt and Scala.
To set the project version temporarily from the command line:
++ 2.10.4
To set the project version permanently from the command line:
set scalaVersion := "2.10.4"
session save
I just executed one command line, "brew remove scala; brew install scala", to update to the latest version. Isn't this enough?
I also found this link (http://wkmacura.tumblr.com/post/11577309978/installing-specific-versions-with-homebrew) and hope it works for you.
How can I change Scala version in a sbt project?
I would like sbt to check whether the system's Scala version is correct and, if it is not, download the correct one.
xsbt (0.10+, including the latest 0.13.7)
Change scalaVersion in build.sbt to whatever Scala version your project should be using - see .sbt build definition.
scalaVersion := "2.10.1"
sbt:
As mentioned in RunningSBT, you can:
You can temporarily switch to another version of Scala using ++<version>.
This version does not have to be listed in your build.scala.versions property, but it does have to be in a repository or be a local Scala version you have defined.
But the CrossBuild page is more suited for what you want, as it shows in action how to change the build.scala.versions property.
So you should be able to
set build.scala.versions 2.7.7
reload
set build.scala.versions 2.8.0.RC2
reload
and each time trigger a compilation with a different Scala version.