spark-cassandra-connector for Spark 1.4 and Cassandra 3.0 - scala

Spark-Cassandra experts: will Apache Spark 1.4 work with Apache Cassandra 3.0 in DataStax installations? We are considering several options for migrating from DSE 4.8 (Spark 1.4 and Cassandra 2.1) to DSE 5.0 (Spark 1.6 and Cassandra 3.0). One option is to update the Cassandra cluster to DSE 5.0 and leave the Spark cluster on DSE 4.8, which means we have to make Apache Spark 1.4 work with Apache Cassandra 3.0. We use https://github.com/datastax/spark-cassandra-connector versions 1.4 (DSE 4.8) and 1.6 (DSE 5.0). Has anyone tried using Spark 1.4 (DSE 4.8) with Cassandra 3.0 (DSE 5.0)?

As far as I can see from Maven Central, Spark Cassandra Connector 1.4.5 uses version 2.1.7 of the Java driver. According to the compatibility matrix in the official documentation, driver 2.1.x won't work with Cassandra 3.0. You can of course test it, but I doubt it will work: the driver is usually backward compatible, but not forward compatible.
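If you want to experiment anyway, a minimal sbt sketch of that combination might look like the following (the Spark and driver versions below are illustrative assumptions; forcing a 3.0-era driver underneath the 1.4.x connector is not a supported configuration and may fail at compile or run time):

// build.sbt -- experimental sketch only, not a supported combination
libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.4.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.5"
)
// Connector 1.4.5 pulls in cassandra-driver-core 2.1.7; overriding it with a
// 3.0-era driver is the experiment -- the connector was written against the
// 2.1 driver API, so this may simply not work at runtime.
dependencyOverrides += "com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0"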
I recommend performing the migration to DSE 5.0 and then moving to 5.1 fairly quickly, as 5.0 could reach EOL soon.
P.S. If you have more questions, I recommend joining the DataStax Academy Slack; it has a dedicated channel for the Spark Cassandra Connector.

Related

How to upgrade Apache Atlas from version 2.0 to version 2.2?

I need help with the upgrade instructions/process for Apache Atlas from version 2.0 to 2.2.
I can't find instructions on how to upgrade Apache Atlas from version 2.0 to 2.2.

How to repackage the Apache Beam bundle to use the latest compatible MongoDB Java driver

Apache Beam 2.38 bundles an incompatible MongoDB Java driver, so I want to add the extra packages recommended by the MongoDB documentation, listed below:
mongodb-driver-sync 4.0
mongodb-driver-legacy
https://www.mongodb.com/community/forums/t/conversion-from-mongodb-driver-legacy-to-mongodb-driver-sync/125411
So I want to build Apache Beam 2.38 with the existing, incompatible Mongo Java driver removed. I have also raised an issue for this on the Apache Beam Jira:
https://issues.apache.org/jira/browse/BEAM-14420
Upgrading MongoDB to 5.0 has compatibility issues with the mongo-java-driver included in Apache Beam.
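One build-level approach worth noting is to exclude the bundled legacy driver and supply the newer drivers yourself. This is a hedged sketch, assuming an sbt build; the coordinates are the published Beam and MongoDB artifacts, but whether Beam 2.38's MongoDbIO actually runs against the 4.x drivers is untested here:

// build.sbt -- sketch: swap out Beam's bundled mongo-java-driver
libraryDependencies ++= Seq(
  // exclude the transitive legacy driver that ships with Beam's MongoDB IO
  ("org.apache.beam" % "beam-sdks-java-io-mongodb" % "2.38.0")
    .exclude("org.mongodb", "mongo-java-driver"),
  // add the split drivers recommended by the MongoDB documentation
  "org.mongodb" % "mongodb-driver-sync"   % "4.0.6",
  "org.mongodb" % "mongodb-driver-legacy" % "4.0.6"
)
// Note: this only replaces jars on the classpath; Beam's MongoDbIO may still
// call legacy API classes that mongodb-driver-legacy has to provide.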

JHipster and Kafka 2.x: well supported?

JHipster 5.7 generates a kafka.yml file referencing Kafka 1.0.0.
Is Kafka 2.x well supported (and already tested) by JHipster? If so, is a move to Kafka 2.x planned?
Thanks

Spark job server for spark 1.6.0

Is there a specific Spark Job Server version matching Spark 1.6.0?
As per the version information in https://github.com/spark-jobserver/spark-jobserver, I see SJS is available only for Spark 1.6.1, not for 1.6.0.
Our Cloudera-hosted Spark is running 1.6.0.
I deployed SJS by configuring the Spark home to 1.6.1. When I submit jobs, I see that job IDs are generated, but I can't see the job results.
Any inputs?
No, there is no SJS version tied to Spark 1.6.0, but it should be easy for you to compile against 1.6.0. Maybe you could modify this and try: https://github.com/spark-jobserver/spark-jobserver/blob/master/project/Versions.scala#L10
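For reference, the change suggested above amounts to pinning the Spark version used by the SJS build. A rough sketch of what that line might look like after the edit (the real Versions.scala declares many more versions and may differ in detail):

// project/Versions.scala (excerpt, sketch only)
object Versions {
  // allow an environment override, but default to the cluster's Spark 1.6.0
  // instead of the published 1.6.1
  lazy val spark = sys.env.getOrElse("SPARK_VERSION", "1.6.0")
}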

Upgrading to Spark 1.5.0 on Ambari

I am using the Hortonworks release, which ships with Spark 1.3.
When I ran my app against the Spark 1.5 API it failed, but switching to 1.3 worked.
There are new features in 1.5 that I want to use on the server, but apparently this is not possible without upgrading it.
How do I upgrade Spark from 1.3 to 1.5 on Ambari?
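Until the cluster itself is upgraded (for example via an HDP stack upgrade through Ambari), one way to avoid the failure described above is to build the application against the Spark version the cluster actually runs. A minimal sketch, assuming an sbt-based build:

// build.sbt -- compile against the cluster's Spark 1.3.x, marked "provided"
// so spark-submit uses the jars already installed on the cluster; bump these
// to 1.5.x only once the cluster has actually been upgraded
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.3.1" % "provided"
)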