Spark and Scala upgrade on CDH

I currently have a CDH installation with Spark 1.6.0 and Scala 2.10.5. I would like to upgrade the Spark version to 2.0.0 and the Scala version to 2.11.x, and make these the defaults.
I am currently trying this on a CDH Quickstart VM but would like to extend this to a Spark cluster with CDH distribution.
Could someone advise on how to go about these two upgrades?
Thank you.
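Once the upgrade is in place, a quick check from the new Spark shell confirms which versions are actually the defaults. A minimal sketch in Scala, assuming the upgraded shell is launched as spark2-shell on CDH (the command name is an assumption about your parcel layout):

    // Run in the upgraded shell to verify the versions actually on the classpath.
    println(s"Spark version: ${org.apache.spark.SPARK_VERSION}")            // expect 2.0.0
    println(s"Scala version: ${scala.util.Properties.versionNumberString}") // expect 2.11.x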

Related

Apache Spark 3 and backward compatibility?

We have several Spark applications running in production that were developed with Spark 2.4.1 (Scala 2.11.12).
For a couple of our new Spark jobs, we are considering using Delta Lake features. For this we need to use Spark 2.4.2 (or higher).
My questions are:
If we upgrade our Spark cluster to 3.0.0, can our 2.4.1 applications still run on the new cluster (without recompile)?
If we need to recompile our previous Spark jobs with Spark 3, are they source compatible or do they need any migration?
There are some breaking changes in Spark 3.0.0, including source-incompatible and binary-incompatible changes. See https://spark.apache.org/releases/spark-release-3-0-0.html. There are also some source- and binary-incompatible changes between Scala 2.11 and 2.12, so you may also need to update your code because of the Scala version change.
However, only Delta Lake 0.7.0 and above require Spark 3.0.0. If upgrading to Spark 3.0.0 requires a lot of work, you can use Delta Lake 0.6.x or below. You just need to upgrade Spark to 2.4.2 or above within the 2.4.x line; those releases should be source and binary compatible.
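As a rough sketch of what that combination could look like in an sbt build (the patch versions below are assumptions; pick whichever 2.4.x and 0.6.x releases you standardize on):

    // build.sbt: stay on the Spark 2.4.x / Scala 2.11 line and use pre-Spark-3 Delta Lake.
    scalaVersion := "2.11.12"
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-sql"  % "2.4.5" % "provided", // any 2.4.2+ release
      "io.delta"         %% "delta-core" % "0.6.1"               // Delta Lake before the 0.7.0 / Spark 3 requirement
    )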
You can cross-compile Spark 2.4 projects with Scala 2.11 and Scala 2.12. The Scala 2.12 JARs should generally work for Spark 3 applications, but there are edge cases where a Spark 2.4/Scala 2.12 JAR won't work properly on a Spark 3 cluster.
It's best to make a clean migration to Spark 3/Scala 2.12 and cut the cord with Spark 2/Scala 2.11.
Upgrading can be a big pain, especially if your project has a lot of dependencies. For example, suppose your project depends on spark-google-spreadsheets, a project that's not built with Scala 2.12. With this dependency, you won't be able to easily upgrade your project to Scala 2.12. You'll need to either compile spark-google-spreadsheets with Scala 2.12 yourself or drop the dependency. See here for more details on how to migrate to Spark 3.
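A minimal cross-build sketch in sbt, with illustrative version numbers assumed:

    // build.sbt: compile the same Spark 2.4 code for both Scala lines,
    // so the 2.12 artifact can later be reused against Spark 3.
    scalaVersion       := "2.11.12"
    crossScalaVersions := Seq("2.11.12", "2.12.10")
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5" % "provided"
    // `sbt +package` then produces one JAR per Scala version.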

Is Spark 1.6.3 compatible with Kafka 1.0.0?

I have Spark 1.6.3, Scala 2.10.5, and Kafka 1.0.0. Is it possible to use Kafka 1.0.0 with Scala 2.10.5?
No. Kafka 1.0.0 itself is only built for Scala 2.11 or 2.12.
https://kafka.apache.org/downloads#1.0.0
Yes, Spark Streaming 1.6.3 clients can communicate with Kafka 1.0 brokers.
I'm not sure how you ended up with that old version of Scala, though, because the Spark 1.6.3 downloads are built for Scala 2.11.
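To illustrate the client-side point: the broker's own Scala build does not dictate your application's Scala version; the Spark integration artifact only has to match the Scala you compile with. A hedged sbt sketch for the Spark 1.6.3 / Scala 2.10 case:

    scalaVersion := "2.10.5"
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming"       % "1.6.3" % "provided",
      "org.apache.spark" %% "spark-streaming-kafka" % "1.6.3"  // Kafka 0.8 client API; per the answer above, it can still talk to 1.0 brokers
    )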

Version Compatibility issues with Scala, Spark, Spark NLP

I am new to Spark NLP and I am stuck on version compatibility issues. This may seem silly, but I would still appreciate your help with the following:
Spark NLP is built on top of Apache Spark 2.4.0, and that is the only supported release (mentioned on https://nlp.johnsnowlabs.com/docs/en/quickstart).
Spark 2.4.3 uses Scala 2.12 (mentioned on https://spark.apache.org/docs/latest/).
But the Maven repository only has spark-nlp libraries for Scala 2.11 and nothing later (see https://repo1.maven.org/maven2/com/johnsnowlabs/nlp/; only spark-nlp_2.11 is present, not spark-nlp_2.12).
Please help
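For what it's worth, the usual workaround is to pin Spark to a 2.4.x build published for Scala 2.11 so that the available spark-nlp_2.11 artifact matches. A hedged sbt sketch (the versions are assumptions; check the Spark NLP docs for the exact pairing):

    scalaVersion := "2.11.12"
    libraryDependencies ++= Seq(
      "org.apache.spark"     %% "spark-sql" % "2.4.4" % "provided", // a 2.4.x build for Scala 2.11
      "com.johnsnowlabs.nlp" %% "spark-nlp" % "2.4.5"               // resolves to spark-nlp_2.11
    )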

Connector Version for Spark - Cassandra 3.x

Can someone please suggest a version of Cassandra 3.x that actually works with Spark 1.6 and Scala 2.10.5?
I am looking for the versions of the jar files below:
Cassandra Core
Cassandra Spark Connector
Thanks,
Sai
Visit the link below to check version compatibility.
The correct connector version is 1.6 for Cassandra 3.x, Spark 1.6, and Scala 2.10.5.
Check the version compatibility table at:
https://github.com/datastax/spark-cassandra-connector
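If it helps, those coordinates would look roughly like this in sbt (patch versions are assumptions; the connector pulls in a compatible Cassandra Java driver transitively):

    scalaVersion := "2.10.5"
    libraryDependencies ++= Seq(
      "org.apache.spark"   %% "spark-core"                % "1.6.3" % "provided",
      "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0" // 1.6.x line targets Spark 1.6 and Cassandra 3.x
    )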

Spark Kafka - Issue while running from Eclipse IDE

I am experimenting with Spark Kafka integration and want to test the code from my Eclipse IDE. However, I get the error below:
java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at kafka.utils.Pool.<init>(Pool.scala:28)
at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(FetchRequestAndResponseStats.scala:60)
at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(FetchRequestAndResponseStats.scala)
at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:52)
at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:345)
at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:342)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers(KafkaCluster.scala:342)
at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:125)
at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:112)
at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:403)
at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:532)
at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
at com.capiot.platform.spark.SparkTelemetryReceiverFromKafkaStream.executeStreamingCalculations(SparkTelemetryReceiverFromKafkaStream.java:248)
at com.capiot.platform.spark.SparkTelemetryReceiverFromKafkaStream.main(SparkTelemetryReceiverFromKafkaStream.java:84)
UPDATE:
The versions that I am using are:
scala - 2.11
spark-streaming-kafka - 1.4.1
spark - 1.4.1
Can anyone resolve the issue? Thanks in advance.
You have the wrong version of Scala. You need 2.10.x per
https://spark.apache.org/docs/1.4.1/
"For the Scala API, Spark 1.4.1 uses Scala 2.10."
Might be late to help the OP, but when using Kafka streaming with Spark you need to make sure you use the right JAR file.
For example, in my case I have Scala 2.11 (the minimum required for Spark 2.0, which I'm using), and given that the Spark Kafka integration must match that Spark version (2.0.0), I have to use the artifact spark-streaming-kafka-0-8-assembly_2.11-2.0.0-preview.jar.
Notice that my Scala version and the Spark version can be seen in the suffix 2.11-2.0.0.
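In dependency-manager terms, the same matching can be expressed like this, a sketch assuming sbt (the 0-8 in the name refers to the Kafka 0.8 integration API):

    scalaVersion := "2.11.8"
    libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
    // resolves to spark-streaming-kafka-0-8_2.11-2.0.0.jar: Scala 2.11, Spark 2.0.0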
Hope this helps (someone)