I have Spark 1.6.3, Scala 2.10.5, and Kafka 1.0.0. Is it possible to use Kafka 1.0.0 with Scala 2.10.5?
No. Kafka 1.0.0 requires Scala 2.11 or 2.12.
https://kafka.apache.org/downloads#1.0.0
Spark Streaming 1.6.3 clients can communicate with Kafka 1.0 brokers, yes.
I'm not sure how you got an old version of Scala, though, because the Spark 1.6.3 downloads are built for Scala 2.11.
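For illustration, a minimal sbt sketch of that setup (the artifact coordinates and the detail about the older Kafka client are my reading of the situation, not something stated in the question):

// Hypothetical build.sbt: Spark 1.6.3 with its bundled Kafka integration.
// spark-streaming-kafka for Spark 1.6 uses an older Kafka client, which a
// Kafka 1.0 broker still accepts, so no Kafka 1.0.0 jars are needed here.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"            % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-streaming"       % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.3"
)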
Related
I'm getting started with Apache Flink. I noticed that the Apache Flink download page lists the installation file as "Apache Flink 1.14.3 for Scala 2.11 (asc, sha512)".
Can you confirm there is no Apache Flink build for Scala 3.x? I want to make sure I download the right version of Scala.
Flink up to 1.14 is available in both a Scala 2.11 and a Scala 2.12 binary, which can be downloaded from https://flink.apache.org/downloads.html. This means that you can use Flink's Scala API with either Scala 2.11 or Scala 2.12.
In Flink 1.15 (the next version), support for Scala 2.11 is dropped. In addition, all Java APIs are independent of Scala. You can find more information at https://issues.apache.org/jira/browse/FLINK-23986
This means that if you would like to use Scala 3.0 in combination with Flink 1.15, that's possible. You can find some examples at https://github.com/sjwiesman/flink-scala-3
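As an illustration, a minimal sbt sketch for a Scala 3 project on the Flink 1.15 Java API (the artifact names and versions are my assumptions, so double-check them against the downloads page above):

// Hypothetical build.sbt: from Flink 1.15 the Java APIs carry no Scala suffix,
// so they can be used from a Scala 3 build directly.
scalaVersion := "3.1.0"

libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-streaming-java" % "1.15.0",
  "org.apache.flink" % "flink-clients"        % "1.15.0"
)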
I got this error. I'm not sure why this is the case, because there is a coalesce method in org.apache.spark.rdd.RDD.
Any ideas?
Am I running an incompatible version of Spark and org.apache.spark.rdd.RDD?
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.coalesce$default$3(IZ)Lscala/math/Ordering;
This is because some part of your code, or one of your project dependencies, was compiled against an old Spark API (a version before 2.0.0): the signature of RDD.coalesce changed in Spark 2.0.0 (it gained an extra default parameter), so a binary built against Spark 1.x can no longer find the coalesce$default$3 method it expects at runtime.
To fix this, either downgrade your Spark runtime to a version below 2.0.0, or upgrade your Spark SDK to 2.0.0 or above and upgrade your project dependencies to releases built against Spark 2.0.0+ (a minimal build sketch follows the links below).
For more details, see:
https://github.com/twitter/algebird/issues/549
https://github.com/EugenCepoi/algebird/commit/0dc7d314cba3be588897915c8dcfb14964933c31
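For illustration, a hypothetical build.sbt sketch of the "align everything on one Spark version" approach (library names and versions are placeholders, not taken from the question):

// Hypothetical build.sbt: the Spark you compile against must match the Spark
// on the cluster, and every dependency that links against Spark must be a
// release built for that same major version.
scalaVersion := "2.11.8"

val sparkVersion = "2.0.1" // keep equal to the cluster's Spark version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
  // ...plus Spark-dependent libraries in releases built against Spark 2.x
)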
As I suspected, this is a library compatibility issue. Everything works (no code change) after downgrading Spark alone.
Before:
scala 2.11.8
spark 2.0.1
Java 1.8.0_92
After:
scala 2.11.8
spark 1.6.2
Java 1.8.0_92
OS: OSX 10.11.6
I currently have a CDH with Spark 1.6.0 and Scala 2.10.5. I would like to upgrade the Spark version to 2.0.0 and Scala version to 2.11.x and make these as defaults.
I am currently trying this on a CDH Quickstart VM but would like to extend this to a Spark cluster with CDH distribution.
Could someone advise on how to go about these two upgrades?
Thank you.
The Apache Kafka download page says that it's available for Scala 2.10 and 2.11.
If I have installed Kafka using "brew install kafka" (on MacOS X) then which build is installed - for 2.10 or for 2.11?
I'm planning to use it together with Spark (currently 1.6.1, using Scala 2.10 if we don't want to build it ourselves) and I want a uniform version of Scala for all, within one project.
/usr/local/Cellar/kafka/0.9.0.0/libexec/core/build/libs/kafka_2.10-0.9.0.0.jar
The 2.10 in the jar name is the Scala version, so the Homebrew build is compiled for Scala 2.10.
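If you want the whole project on one Scala version as described, a hypothetical sbt fragment like the following would pull in the matching kafka_2.10 artifact (the versions come from the question, but treat the whole fragment as an example):

// %% appends the Scala binary version, so these resolve to spark-core_2.10
// and kafka_2.10, matching the Homebrew-installed broker jar above.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
  "org.apache.kafka" %% "kafka"      % "0.9.0.0"
)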
Can someone please suggest a version of Cassandra 3.x that actually works with Spark 1.6 and Scala 2.10.5?
I am looking for the versions of the following jar files:
Cassandra Core
Cassandra Spark Connector
Thanks,
Sai
Visit the link below to check version compatibility.
The correct connector version is 1.6 for Cassandra 3.x, Spark 1.6, and Scala 2.10.5.
The version compatibility matrix is in the project's README:
https://github.com/datastax/spark-cassandra-connector
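For example, a hypothetical sbt fragment for that combination (the exact patch versions are my assumptions; verify them against the compatibility matrix in the README above):

// spark-cassandra-connector 1.6.x targets Spark 1.6 / Scala 2.10 and works with
// Cassandra 3.x; cassandra-driver-core is the "Cassandra Core" Java driver,
// which the connector also pulls in transitively.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark"       %% "spark-core"                % "1.6.3" % "provided",
  "com.datastax.spark"     %% "spark-cassandra-connector" % "1.6.0",
  "com.datastax.cassandra"  % "cassandra-driver-core"     % "3.0.0"
)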