Embedded Kafka & Spark 2.3 version mismatch issue - scala

When I use this dependency:
<dependency>
<groupId>net.manub</groupId>
<artifactId>scalatest-embedded-kafka_2.11</artifactId>
<version>2.0.0</version>
<scope>test</scope>
</dependency>
With
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.11</artifactId>
<version>2.3.0</version>
</dependency>
I run into this error:
Cause: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.reader.SupportsScanUnsafeRow
Trying to figure out which version of 'scalatest-embedded-kafka' will work with Spark 2.3.
Any ideas?
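One avenue worth checking (an assumption based on the error, not a confirmed fix): SupportsScanUnsafeRow belongs to the DataSource V2 API of Spark 2.3 and was removed in later releases, so the ClassNotFoundException usually means the test classpath resolves a spark-sql version other than the 2.3.0 that spark-sql-kafka-0-10 was built against. Pinning spark-sql explicitly and then inspecting the dependency tree can confirm this:
<!-- Sketch: pin spark-sql to the same version line as the Kafka connector so
     the DataSource V2 reader classes it references are present at test time.
     The coordinates mirror the question; treat them as illustrative. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.3.0</version>
</dependency>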

Related

Spring Boot embedded Kafka throws BeanCreationException

I have a test case with the following Kafka configuration:
@ExtendWith(SpringExtension.class)
@SpringBootTest(classes = {Application.class})
@ActiveProfiles("test")
@EnableConfigurationProperties
@EmbeddedKafka(controlledShutdown = true, topics = {"topic1", "topic2", "topic3"})
class Name {}
Here is the build error I get when running the test case above:
Application run failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'embeddedKafka': Invocation of init method failed; nested exception is java.lang.NoSuchFieldError: DEFAULT_SASL_ENABLED_MECHANISMS
Caused by: java.lang.NoSuchFieldError: DEFAULT_SASL_ENABLED_MECHANISMS
at kafka.server.Defaults$.<clinit>(KafkaConfig.scala:242)
at kafka.server.KafkaConfig$.<clinit>(KafkaConfig.scala:961)
at kafka.server.KafkaConfig.LogDirProp(KafkaConfig.scala)
at org.springframework.kafka.test.EmbeddedKafkaBroker.afterPropertiesSet(EmbeddedKafkaBroker.java:322)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1853)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1790)
... 72 common frames omitted
Here is the POM file with the spring-kafka and kafka-clients dependencies that are likely causing the error:
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
<version>2.7.7</version>
<exclusions>
<exclusion>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka-test</artifactId>
<version>2.7.7</version>
<scope>test</scope>
</dependency>
Solution:
See the Spring for Apache Kafka reference documentation on overriding dependencies:
https://docs.spring.io/spring-kafka/docs/current/reference/html/#update-deps
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
<version>2.7.7</version>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka-test</artifactId>
<version>2.7.7</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>2.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>2.8.1</version>
<classifier>test</classifier>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.13</artifactId>
<version>2.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.13</artifactId>
<version>2.8.1</version>
<classifier>test</classifier>
<scope>test</scope>
</dependency>
That spring-kafka version is not compatible with Apache Kafka client 3.0; support for the 3.0 client only arrives with the upcoming spring-kafka 2.8 and Spring Boot 2.6.
On the other hand, you don't need a newer client even if you use a newer broker. And since this is about embedded testing, you most likely don't need to worry about any client at all, unless it is something transitive from Spring for Apache Kafka.
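If the project is built on spring-boot-starter-parent (an assumption; the question's POM is only partially shown), the reference documentation linked above also describes a shorter route: override Boot's kafka.version property once instead of redeclaring each client artifact. A minimal sketch:
<properties>
    <!-- Assumes Spring Boot dependency management is in effect; Boot then
         aligns the org.apache.kafka client artifacts to this version. -->
    <kafka.version>2.8.1</kafka.version>
</properties>
The test-scoped broker artifacts (kafka_2.13 with and without the test classifier) may still need the explicit declarations shown in the solution above.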

Spring Kafka (2.2.7.RELEASE) with kafka-clients:2.2.1 IOException during embedded broker startup

Driven by a dependency-check warning, we tried to bump the version of org.apache.kafka:kafka-clients to 2.2.1 in our setup, which uses spring-kafka:2.2.7.
As a result, the tests using EmbeddedKafkaRule fail during broker startup with an IOException claiming "Failed to load /some/path..":
java.io.IOException: Failed to load /Users/[..]/target/embedded-kafka during broker startup
at kafka.log.LogManager$$anonfun$createAndValidateLogDirs$1.apply(LogManager.scala:152)
at kafka.log.LogManager$$anonfun$createAndValidateLogDirs$1.apply(LogManager.scala:149)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at kafka.log.LogManager.createAndValidateLogDirs(LogManager.scala:149)
at kafka.log.LogManager.<init>(LogManager.scala:80)
at kafka.log.LogManager$.apply(LogManager.scala:953)
at kafka.server.KafkaServer.startup(KafkaServer.scala:237)
at kafka.utils.TestUtils$.createServer(TestUtils.scala:132)
at kafka.utils.TestUtils.createServer(TestUtils.scala)
at org.springframework.kafka.test.EmbeddedKafkaBroker.afterPropertiesSet(EmbeddedKafkaBroker.java:223)
at org.springframework.kafka.test.rule.EmbeddedKafkaRule.before(EmbeddedKafkaRule.java:109)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:190)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
We made sure to have no conflicting version of kafka-clients on the classpath and also tried to point the logs.dir of the EmbeddedKafkaRule at a folder under Maven's target directory ("target/embedded-kafka" in the example above).
Neither attempt succeeded.
Did anybody have the same issue and resolve it?
I just tested it without any problems.
Did you follow the instructions in the documentation about overriding kafka client versions?
When you use spring-kafka-test (version 2.2.x) with the 2.1.x kafka-clients jar, you need to override certain transitive dependencies, as follows:
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
<version>${spring.kafka.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka-test</artifactId>
<version>${spring.kafka.version}</version>
<exclusions>
<exclusion>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
</exclusion>
</exclusions>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>2.1.1</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>2.1.1</version>
<classifier>test</classifier>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.12</artifactId>
<version>2.1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.12</artifactId>
<version>2.1.1</version>
<classifier>test</classifier>
<scope>test</scope>
</dependency>
Note that when switching to Scala 2.12 (recommended for 2.1.x and higher), the 2.11 version must be excluded from spring-kafka-test.
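For completeness, the ${spring.kafka.version} placeholder used above would be defined once in the POM's properties; with the versions from the question, it would look like this (illustrative):
<properties>
    <spring.kafka.version>2.2.7.RELEASE</spring.kafka.version>
</properties>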

Getting exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;) while using DataFrames

I am receiving a "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)" error while using DataFrames in a Scala app run on Spark. However, if I work only with RDDs and not DataFrames, no such error comes up with the same POM and settings. While going through other posts with the same error, it is mentioned that the Scala version has to be 2.10 because Spark is not compatible with Scala 2.11, and I am using Scala 2.10 with Spark 2.0.0.
Below is the snippet from the POM:
<properties>
<spark-assembly>/usr/lib/spark/lib/spark-assembly.jar</spark-assembly>
<encoding>UTF-8</encoding>
<hadoop.version>2.7.1</hadoop.version>
<hbase.version>1.1.1</hbase.version>
<scala.version>2.10.5</scala.version>
<scala.tools.version>2.10</scala.tools.version>
<spark.version>2.0.0</spark.version>
<phoenix.version>4.7.0-HBase-1.1</phoenix.version>
</properties>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-server</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
</dependencies>
Error:
16/10/19 02:57:26 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
at com.abc.xyz.Compare$.main(Compare.scala:64)
at com.abc.xyz.Compare.main(Compare.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
16/10/19 02:57:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;)
16/10/19 02:57:26 INFO spark.SparkContext: Invoking stop() from shutdown hook
Change the Scala version:
<scala.version>2.11.8</scala.version>
<scala.tools.version>2.11</scala.tools.version>
and add:
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>${scala.version}</version>
</dependency>
I also faced this error, and it is purely a version issue.
Either your Scala version is incompatible, or you are using the correct version but the IntelliJ libraries still have the old one.
Quick fix:
I was using Spark 2.2.0 and Scala 2.10.4, which I then changed to Scala 2.11.8. After that, do the following:
1) Right-click the IntelliJ module.
2) Open the module settings.
3) Go to Libraries and clear all of them.
4) Rebuild.
Doing the above resolved the issue for me.
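Beyond the IDE cleanup, one way to catch this class of mismatch at build time (a sketch, not part of either answer above) is to have Maven fail whenever an artifact compiled for the old Scala binary version appears on the classpath, for example with the maven-enforcer-plugin's bannedDependencies rule:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>3.0.0</version>
    <executions>
        <execution>
            <id>ban-scala-2.10-artifacts</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <bannedDependencies>
                        <excludes>
                            <!-- Fail fast if any _2.10 artifact sneaks in after
                                 the move to Scala 2.11; pattern is illustrative. -->
                            <exclude>*:*_2.10</exclude>
                        </excludes>
                    </bannedDependencies>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>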

NullPointerException in Salat

While making any type of call to Mongo from my Scala application, I am getting this NullPointerException. Can somebody please help?
I am using Mongo 3.0.1 and my Scala version is 2.9.0. Other dependencies are as follows:
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>casbah_2.9.1</artifactId>
<type>pom</type>
<version>2.6.0</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>2.11.1</version>
</dependency>
<dependency>
<groupId>com.novus</groupId>
<artifactId>salat-core_2.9.1</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>com.google.code.morphia</groupId>
<artifactId>morphia</artifactId>
<version>0.99</version>
</dependency>
Error:
Caused by: java.lang.NullPointerException
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2$$anonfun$apply$1.apply(PrettyPrinters.scala:74)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2$$anonfun$apply$1.apply(PrettyPrinters.scala:74)
at scala.Option.map(Option.scala:134)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2.apply(PrettyPrinters.scala:74)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2.apply(PrettyPrinters.scala:74)
at scala.Option.flatMap(Option.scala:147)
at com.novus.salat.util.GraterPrettyPrinter$class.safeDefault(PrettyPrinters.scala:74)
at com.novus.salat.util.ConstructorInputPrettyPrinter$.safeDefault(PrettyPrinters.scala:108)
at com.novus.salat.util.ConstructorInputPrettyPrinter$$anonfun$apply$3.apply(PrettyPrinters.scala:134)
at com.novus.salat.util.ConstructorInputPrettyPrinter$$anonfun$apply$3.apply(PrettyPrinters.scala:128)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
at scala.collection.mutable.ArrayOps.foreach(ArrayOps.scala:38)
at com.novus.salat.util.ConstructorInputPrettyPrinter$.apply(PrettyPrinters.scala:128)
at com.novus.salat.util.ToObjectGlitch.<init>(ToObjectGlitch.scala:44)
at com.novus.salat.ConcreteGrater.feedArgsToConstructor(Grater.scala:294)
at com.novus.salat.ConcreteGrater.asObject(Grater.scala:263)
at com.novus.salat.ConcreteGrater.asObject(Grater.scala:105)
at com.novus.salat.dao.SalatMongoCursorBase$class.next(SalatMongoCursor.scala:47)
at com.novus.salat.dao.SalatMongoCursor.next(SalatMongoCursor.scala:149)
at scala.collection.Iterator$class.foreach(Iterator.scala:652)
at com.novus.salat.dao.SalatMongoCursor.foreach(SalatMongoCursor.scala:149)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:128)
at scala.collection.TraversableOnce$class.toList(TraversableOnce.scala:242)
at com.novus.salat.dao.SalatMongoCursor.toList(SalatMongoCursor.scala:149)
This issue was due to corrupt data in the database; after clearing it, everything worked.

NoSuchMethodError with SparkContext

I get a NoSuchMethodError when my Spark application executes val sc = new SparkContext("spark://spark01:7077", "Request Executor"). I am compiling my Spark application with version 1.3.1 and Scala version 2.10.4. The Spark cluster is compiled with 1.3.1 as well as the same Scala version.
From looking at the Spark source, getTimeAsSeconds does not exist in Utils.scala until Spark 1.4. Why is it attempting to call a method that does not exist in the version I'm using?
Here are the dependencies from my pom.xml:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.10.4</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>2.10.4</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>2.10.4</version>
</dependency>
<dependency>
<groupId>com.twitter</groupId>
<artifactId>util-eval_2.10</artifactId>
<version>6.26.0</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>19.0-rc1</version>
</dependency>
<dependency>
<groupId>org.jvnet.jaxb2_commons</groupId>
<artifactId>jaxb2-basics-runtime</artifactId>
<version>0.7.0</version>
</dependency>
<!-- Jackson JSON Library -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-jaxb-annotations</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-xml</artifactId>
</dependency>
<dependency>
<groupId>org.codehaus.woodstox</groupId>
<artifactId>woodstox-core-asl</artifactId>
<version>4.1.4</version>
</dependency>
<dependency>
<groupId>uk.org.simonsite</groupId>
<artifactId>log4j-rolling-appender</artifactId>
<version>20131024-2017</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-server</artifactId>
<version>9.3.1.v20150714</version>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-servlet</artifactId>
<version>9.3.1.v20150714</version>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-webapp</artifactId>
<version>9.3.1.v20150714</version>
</dependency>
<dependency>
<groupId>org.antlr</groupId>
<artifactId>antlr4-runtime</artifactId>
<version>4.5</version>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-solrj</artifactId>
<version>5.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-core</artifactId>
<version>5.2.1</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-jdk14</artifactId>
</exclusion>
<exclusion>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-util</artifactId>
</exclusion>
</exclusions>
</dependency>
Is something in my dependencies causing me to compile with Spark 1.4?
Stacktrace:
java.lang.NoSuchMethodError: org.apache.spark.network.util.JavaUtils.timeStringAsSec(Ljava/lang/String;)J
at org.apache.spark.util.Utils$.timeStringAsSeconds(Utils.scala:1027)
at org.apache.spark.SparkConf.getTimeAsSeconds(SparkConf.scala:194)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:68)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:155)
at com.scala.analytics.RequestExecutor$.executeRequest(RequestExecutor.scala:23)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:816)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1113)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1047)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:119)
at org.eclipse.jetty.server.Server.handle(Server.java:517)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:302)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:242)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:238)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:57)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:213)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:147)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
at java.lang.Thread.run(Thread.java:745)
It turns out it was a dumb mistake. I run this application on a different machine, so I copy over my target directory every time I compile. However, I never cleaned the target directory on the remote machine, so an old jar built against Spark 1.4.0, which I had used at one point, was still sitting there. Every time my application ran, it would look for a Spark jar and pick up the 1.4.0 jar instead of the 1.3.1 jar that was also in the directory. The solution was simply to delete the old (1.4.0) jar.
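A common guard against this class of stale-jar problem (a general practice, not something the answer above required) is to mark Spark as a provided dependency, so the application jar never bundles its own Spark classes and always runs against whatever version the cluster supplies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.1</version>
    <!-- provided: compile against Spark, but rely on the cluster's jars at
         runtime instead of shipping (and possibly shadowing) a copy -->
    <scope>provided</scope>
</dependency>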