Missing import scala.collection.parallel in Scala 2.13

Parallel collections in Scala 2.12 were importable out of the box, like so:
import scala.collection.parallel.immutable.ParVector
val pv = new ParVector[Int]
However, in Scala 2.13 the package scala.collection.parallel seems to be missing. Why?

Parallel collections were moved in Scala 2.13 to the separate module scala/scala-parallel-collections:
This Scala standard module contains the package
scala.collection.parallel, with all of the parallel collections that
used to be part of the Scala standard library.
For Scala 2.13, this module is a separate JAR that can be omitted from
projects that do not use parallel collections.
Thus, from 2.13 onwards, we need the following dependency:
libraryDependencies += "org.scala-lang.modules" %% "scala-parallel-collections" % "1.0.0"
and, to enable the .par extension method, the import:
import scala.collection.parallel.CollectionConverters._
The corresponding Scaladoc is also no longer part of the 2.13 API docs; it is instead published at javadoc.io/doc/org.scala-lang.modules/scala-parallel-collections_2.13.
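For example, a minimal usage sketch, assuming the dependency above is on the classpath of a Scala 2.13 project:
import scala.collection.parallel.CollectionConverters._

// .par is available again as an extension method supplied by CollectionConverters
val squares = (1 to 1000).toVector.par.map(n => n * n)
println(squares.sum)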

Related

NullPointerException on XML.loadFile()

I am trying to load an XML file using scala-xml_2.12-1.0.6.jar, but it throws a NullPointerException while loading.
The following is my code to load the XML:
import scala.xml.XML
val xml = XML.loadFile("sample.xml")
I have decompiled the jar and the method is present in it, but for some reason it cannot be found from my code.
I have Scala 2.13.1 on my system, but for this project I am using Scala 2.12.1, as declared in my build.sbt:
scalaVersion := "2.12.1"
I have the following dependency in my build.sbt for the XML package:
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
If I copy and paste the same code into the Scala interactive shell (Scala 2.13.1), I get the following error:
import scala.xml.XML
             ^
error: object xml is not a member of package scala
did you mean Nil?
Can anyone please identify what I am doing wrong?
Thanks in advance.
I'm not sure how you are launching the Scala REPL, but as mentioned in "How to use third party libraries with Scala REPL?", you should be starting the REPL from sbt with sbt console so that your .sbt files are in scope.
The standalone Scala REPL does not read .sbt files; they are two different tools.
Alternatively, you could install the Ammonite REPL, which supports Magic Imports: you can pull in Maven dependencies with import $ivy.
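For instance, a sketch of an Ammonite session (coordinates copied from the build.sbt above; sample.xml is the questioner's file):
import $ivy.`org.scala-lang.modules::scala-xml:1.0.6`
import scala.xml.XML

val xml = XML.loadFile("sample.xml")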
The Scala 2.12 distribution ships with scala-xml, so you can drop that dependency as long as you run the REPL with sbt console rather than the standalone Scala REPL, which on your machine is already on Scala 2.13. Otherwise, you could also switch your sbt project to Scala 2.13.
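Putting it together, a minimal sketch of the sbt console route (versions taken from the question):
In build.sbt:
scalaVersion := "2.12.1"
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
Then, from the project root, run sbt console and:
import scala.xml.XML
val xml = XML.loadFile("sample.xml")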

Trying to use .par in Scala 2.13 gives me “Error: value par is not a member of” [duplicate]

This duplicates the first question above; see the answer there: add the scala-parallel-collections dependency and import scala.collection.parallel.CollectionConverters._ to bring back the .par extension method.

How do you properly set up Scala Spark libraryDependencies with the correct version of Scala?

I'm new to Scala and Spark, and I'm trying to create an example project using IntelliJ. During project creation I chose Scala and sbt with Scala version 2.12, but when I tried adding spark-streaming version 2.3.2 it kept erroring out, so I Googled around. On Apache's website I found the sbt config shown below, but I'm still getting the same error:
Error: Could not find or load main class SparkStreamingExample
Caused by: java.lang.ClassNotFoundException: SparkStreamingExample
How can I determine which version of Scala works with which version of the Spark dependencies?
name := "SparkStreamExample"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.11" % "2.3.2"
)
My object is very basic and doesn't have much to it...
import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext
object SparkStreamingExample extends App {
println("SPARK Streaming Example")
}
You can see the version of Scala that is supported by Spark in the Spark documentation.
As of this writing, the documentation says:
Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.3.2 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
Notice that only Scala 2.11.x is supported.
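One way to avoid mismatches is the %% operator, which appends the project's Scala binary version to the artifact name automatically. A sketch, using the versions from the answer above:
scalaVersion := "2.11.8"
// %% expands the artifact to spark-streaming_2.11 because scalaVersion is 2.11.x
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.2"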

Confused about getting Scala to run in IntelliJ? Import errors

I am trying to run a simple Scala program in IntelliJ.
My build.sbt looks like this:
name := "UseThis"
version := "0.1"
scalaVersion := "2.12.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
And my import code looks like this:
package WorkWork.FriendsByAge
import com.sun.glass.ui.Window.Level
import com.sun.istack.internal.logging.Logger
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.log4j._
I don't get why the import fails. It tells me the dependency failed to load or wasn't found, but I put the line in build.sbt as required. Is there some other step I need to take? I've installed Spark. Is it the version note at the end of the build line? I don't even know how to check which version of Spark I have.
I'm trying to teach myself Scala (not a noob though, I know Python, R, various flavors of SQL, C#) but my word even setting it up is nigh on impossible, and apparently getting it to even run is too. Any ideas?
Take a look at this page here: Maven Central (Apache Spark core)
Unless you have set up some other repositories, the dependencies that are going to be loaded by sbt usually come from there.
There is a version column with numbers like 2.2.1, and then a Scala column with numbers like 2.11, 2.10. In order for Spark and Scala to work together, you have to pick a valid combination from this table.
As of 28 February 2018, there is no version of Spark that works with Scala 2.12.4. The latest Scala version for which spark-core 1.2.0 is published is 2.11. So you will probably want to set your Scala version to 2.11.
Also note that the %% syntax in your sbt file,
"org.apache.spark" %% "spark-core" % "1.2.0"
will automatically append the suffix _2.12 to the artifact ID. Since there is no spark-core_2.12 at that version, the dependency cannot be resolved, and you can't import anything in your code.
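For illustration, with scalaVersion := "2.12.4" the %% line above is equivalent to this explicit form, which names an artifact that was never published:
"org.apache.spark" % "spark-core_2.12" % "1.2.0"  // no spark-core_2.12 exists at version 1.2.0, hence the resolution failure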
By the way: there was a big difference between Spark 1.2 and Spark 1.3, and then again between 1.x and 2.x. Does it really have to be 1.2?
It's because Spark 1.2 is not available for Scala 2.12:
https://mvnrepository.com/artifact/org.apache.spark/spark-core
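A minimal sketch of a combination that should resolve, picking one valid row from the Maven Central table (Scala 2.11 with the Spark 1.2.0 the question asks for; the exact 2.11 patch release is an assumption):
name := "UseThis"
version := "0.1"
scalaVersion := "2.11.12"
// %% now appends _2.11, for which spark-core 1.2.0 is published
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"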

Why does SBT's Scala (2.10) not include Akka?

I downloaded Scala 2.10.2, unpacked it, and ran the scala command; I can successfully import akka._.
In another experiment, I create an sbt project with the following line in build.sbt:
scalaVersion := "2.10.2"
A source file imports akka._, and sbt complains "akka not found".
What is the difference between sbt's Scala 2.10.2 and the one on the Scala website? And why does the official Scala distribution already include the Akka library while sbt's Scala does not?
Akka is part of the Scala Distribution (the zip you downloaded) but not of the Scala Standard Library, which is what you get in sbt.
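So in an sbt project you have to declare Akka explicitly. A sketch (akka-actor 2.3.x is one series published for Scala 2.10; the exact version number here is an assumption, pick whichever you need):
scalaVersion := "2.10.2"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.16"
After reloading, import akka.actor._ in your source files will resolve.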