I am new to Spark/Scala development. I am using Maven to build my project and my IDE is IntelliJ. I am trying to query a Hive table and then iterate over the resulting DataFrame (using foreach). Here's my code:
try {
  val DF_1 = hiveContext.sql("select distinct(address) from test_table where trim(address)!=''")
  println("number of rows: " + DF_1.count)
  DF_1.foreach(x => {
    val y = hiveContext.sql("select place from test_table where address='" + x(0).toString + "'")
    if (y.count > 1) {
      println("Multiple place values for address: " + x(0).toString)
      y.foreach(r => println(r))
      println("*************")
    }
  })
} catch {
  case e: Exception => e.printStackTrace()
}
On each iteration, I query the same table to fetch another column, checking whether there are multiple place values for each address in test_table. I have no compilation errors and the application builds successfully. But when I run the above code, I get the following error:
java.lang.NoClassDefFoundError: Could not initialize class xxxxxxxx
The application launches successfully, prints the count of rows in DF_1, and then fails with the above error in the foreach loop. I ran jar xvf on my jar and can see the main class, driver.class:
com/.../driver$$anonfun$1$$anonfun$apply$1.class
com/.../driver$$anonfun$1.class
com/.../driver$$anonfun$2.class
com/.../driver$$anonfun$3.class
com/.../driver$$anonfun$4.class
com/.../driver$$anonfun$5.class
com/.../driver$$anonfun$main$1$$anonfun$apply$1.class
com/.../driver$$anonfun$main$1$$anonfun$apply$2.class
com/.../driver$$anonfun$main$1$$anonfun$apply$3.class
com/.../driver$$anonfun$main$1.class
com/.../driver$$anonfun$main$10$$anonfun$apply$9.class
com/.../driver$$anonfun$main$10.class
com/.../driver$$anonfun$main$11.class
com/.../driver$$anonfun$main$12.class
com/.../driver$$anonfun$main$13.class
com/.../driver$$anonfun$main$14.class
com/.../driver$$anonfun$main$15.class
com/.../driver$$anonfun$main$16.class
com/.../driver$$anonfun$main$17.class
com/.../driver$$anonfun$main$18.class
com/.../driver$$anonfun$main$19.class
com/.../driver$$anonfun$main$2$$anonfun$apply$4.class
com/.../driver$$anonfun$main$2$$anonfun$apply$5.class
com/.../driver$$anonfun$main$2$$anonfun$apply$6.class
com/.../driver$$anonfun$main$2.class
com/.../driver$$anonfun$main$20.class
com/.../driver$$anonfun$main$21.class
com/.../driver$$anonfun$main$22.class
com/.../driver$$anonfun$main$23.class
com/.../driver$$anonfun$main$3$$anonfun$apply$7.class
com/.../driver$$anonfun$main$3$$anonfun$apply$8.class
com/.../driver$$anonfun$main$3.class
com/.../driver$$anonfun$main$4$$anonfun$apply$9.class
com/.../driver$$anonfun$main$4.class
com/.../driver$$anonfun$main$5.class
com/.../driver$$anonfun$main$6$$anonfun$apply$1.class
com/.../driver$$anonfun$main$6$$anonfun$apply$2.class
com/.../driver$$anonfun$main$6$$anonfun$apply$3.class
com/.../driver$$anonfun$main$6$$anonfun$apply$4.class
com/.../driver$$anonfun$main$6$$anonfun$apply$5.class
com/.../driver$$anonfun$main$6.class
com/.../driver$$anonfun$main$7$$anonfun$apply$1.class
com/.../driver$$anonfun$main$7$$anonfun$apply$2.class
com/.../driver$$anonfun$main$7$$anonfun$apply$3.class
com/.../driver$$anonfun$main$7$$anonfun$apply$4.class
com/.../driver$$anonfun$main$7$$anonfun$apply$5.class
com/.../driver$$anonfun$main$7$$anonfun$apply$6.class
com/.../driver$$anonfun$main$7$$anonfun$apply$7.class
com/.../driver$$anonfun$main$7$$anonfun$apply$8.class
com/.../driver$$anonfun$main$7.class
com/.../driver$$anonfun$main$8$$anonfun$apply$10.class
com/.../driver$$anonfun$main$8$$anonfun$apply$4.class
com/.../driver$$anonfun$main$8$$anonfun$apply$5.class
com/.../driver$$anonfun$main$8$$anonfun$apply$6.class
com/.../driver$$anonfun$main$8$$anonfun$apply$7.class
com/.../driver$$anonfun$main$8$$anonfun$apply$8.class
com/.../driver$$anonfun$main$8$$anonfun$apply$9.class
com/.../driver$$anonfun$main$8.class
com/.../driver$$anonfun$main$9$$anonfun$apply$11.class
com/.../driver$$anonfun$main$9$$anonfun$apply$7.class
com/.../driver$$anonfun$main$9$$anonfun$apply$8.class
com/.../driver$$anonfun$main$9$$anonfun$apply$9.class
com/.../driver$$anonfun$main$9.class
com/.../driver$.class
com/.../driver.class
I do not face the error when I launch the job in local mode instead of YARN. What is causing the issue and how can it be corrected?
Any help would be appreciated. Thank you.
Looks like your jar or some of its dependencies aren't distributed to the worker nodes. In local mode it works because the jars are already in place. In YARN mode you need to build a fat jar with all dependencies in it, including the Hive and Spark libraries.
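For example, with Maven (which the question uses), the shade plugin bundles all compile-scope dependencies into a single jar during package. This is only a minimal sketch; the plugin version is illustrative:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>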
I just upgraded our Kafka Streams application from 2.5.1 to 2.6.2. It used to work; now it doesn't.
Here is the troublesome topology (I have omitted the irrelevant Serdes):
val builder = new StreamsBuilder()

val contractEventStream: KStream[TariffId, ContractEvent] =
  builder.stream[String, ContractUpsertAvro](settings.contractsTopicName)
    .flatMap { (_, contractAvro) =>
      ContractEvent.from(contractAvro)
        .map(contractEvent => (contractEvent.tariffId, contractEvent))
    }

val tariffsTable: KTable[TariffId, Tariff] =
  builder.stream[String, TariffUpdateEventAvro](settings.tariffTopicName)
    .flatMapValues(Tariff.fromAvro(_))
    .selectKey((_, tariff) => tariff.tariffId)
    .toTable(Materialized.`with`(tariffIdSerde, tariffSerde)) // Materialized.as also throws the same IllegalStateException

contractEventStream
  .join(tariffsTable)(JourneyStep.from(_, _).asInstanceOf[ContractCreated])(Joined.`with`(tariffIdSerde, contractEventSerde, tariffSerde))
  .selectKey((_, contractUpdated) => contractUpdated.accountId)
  .foreach((_, journeyStep) => println(journeyStep))
The join gives the following exception:
java.lang.IllegalStateException: Tried to lookup lag for unknown task 3_0
at org.apache.kafka.streams.processor.internals.assignment.ClientState.lagFor(ClientState.java:306)
at java.util.Comparator.lambda$comparingLong$6043328a$1(Comparator.java:511)
at java.util.Comparator.lambda$thenComparing$36697e65$1(Comparator.java:216)
at java.util.TreeMap.compare(TreeMap.java:1295)
at java.util.TreeMap.put(TreeMap.java:538)
at java.util.TreeSet.add(TreeSet.java:255)
at java.util.AbstractCollection.addAll(AbstractCollection.java:344)
at java.util.TreeSet.addAll(TreeSet.java:312)
at org.apache.kafka.streams.processor.internals.StreamsPartitionAssignor.getPreviousTasksByLag(StreamsPartitionAssignor.java:1275)
at org.apache.kafka.streams.processor.internals.StreamsPartitionAssignor.assignTasksToThreads(StreamsPartitionAssignor.java:1189)
at org.apache.kafka.streams.processor.internals.StreamsPartitionAssignor.computeNewAssignment(StreamsPartitionAssignor.java:940)
at org.apache.kafka.streams.processor.internals.StreamsPartitionAssignor.assign(StreamsPartitionAssignor.java:399)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.performAssignment(ConsumerCoordinator.java:589)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.onJoinLeader(AbstractCoordinator.java:684)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.access$1000(AbstractCoordinator.java:111)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator$JoinGroupResponseHandler.handle(AbstractCoordinator.java:597)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator$JoinGroupResponseHandler.handle(AbstractCoordinator.java:560)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator$CoordinatorResponseHandler.onSuccess(AbstractCoordinator.java:1160)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator$CoordinatorResponseHandler.onSuccess(AbstractCoordinator.java:1135)
at org.apache.kafka.clients.consumer.internals.RequestFuture$1.onSuccess(RequestFuture.java:206)
at org.apache.kafka.clients.consumer.internals.RequestFuture.fireSuccess(RequestFuture.java:169)
at org.apache.kafka.clients.consumer.internals.RequestFuture.complete(RequestFuture.java:129)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient$RequestFutureCompletionHandler.fireCompletion(ConsumerNetworkClient.java:602)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.firePendingCompletedRequests(ConsumerNetworkClient.java:412)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:297)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:236)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1296)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1237)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1210)
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:767)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:624)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:551)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:510)
I can't see what I am doing wrong. The code above works with Kafka 2.5.1. Does anyone have an idea what is going on?
The problem is caused by the Kafka Streams state cache, which is kept on disk. This cache is specific to the Kafka version and to the Kafka Streams topology you use (i.e. a change in your topology could also lead to this error).
The cache is usually found in /tmp, or elsewhere if you passed the "state.dir" property to Kafka Streams. Clear the cache directory and you should be able to start cleanly again.
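If clearing the directory by hand is awkward, Kafka Streams can also wipe its own local state via cleanUp(). A minimal sketch, reusing the builder from the question; the application id, bootstrap servers, and state directory are placeholder values:

import java.util.Properties
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

val props = new Properties()
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-app")            // placeholder
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // placeholder
props.put(StreamsConfig.STATE_DIR_CONFIG, "/var/lib/kafka-streams") // the "state.dir" property

val streams = new KafkaStreams(builder.build(), props)
streams.cleanUp() // deletes this application's local state directory; call only before start() or after close()
streams.start()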
I was trying to use Avro support for Spark 2.4.0, as mentioned here. However, since I'm behind a proxy, I wasn't able to download it through Maven.
Instead, I downloaded the jar locally, launched spark-shell, and then inside the spark-shell typed:
:require /path/to/newly/downloaded/jar
This throws the error:
The path '/Users/hay531/Desktop/jars/spark-avro_2.11-2.4.0.jar' cannot be loaded, because existing classpath entries conflict.
So, I printed out all the jars in the classpath using:
import java.lang.ClassLoader
val cl = ClassLoader.getSystemClassLoader
cl.asInstanceOf[java.net.URLClassLoader].getURLs.foreach(println)
And below is the list of jars I got:
file:/Users/hay531/Downloads/spark-2.4.0/conf/
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/breeze-macros_2.11-0.13.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/stream-2.7.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/machinist_2.11-0.6.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jersey-media-jaxb-2.22.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-unsafe_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-mapreduce-client-common-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/minlog-1.3.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jersey-client-2.22.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/xz-1.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/pyrolite-4.13.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jsr305-1.3.9.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-yarn-common-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/curator-recipes-2.6.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-configuration-1.6.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-beanutils-1.7.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/RoaringBitmap-0.5.11.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/httpcore-4.4.10.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-streaming_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/objenesis-2.5.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-httpclient-3.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/stax-api-1.0-2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hk2-api-2.4.0-b34.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/apacheds-i18n-2.0.0-M15.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/janino-3.0.9.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jtransforms-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/activation-1.1.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/orc-mapreduce-1.5.2-nohive.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jaxb-api-2.2.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/oro-2.0.8.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-jaxrs-1.9.13.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/xercesImpl-2.9.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/scala-xml_2.11-1.0.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/scala-library-2.11.12.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jersey-guava-2.22.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/opencsv-2.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-graphx_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-compiler-3.0.9.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/json4s-ast_2.11-3.5.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/javax.servlet-api-3.1.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-hdfs-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/httpclient-4.5.6.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-core-2.6.7.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/log4j-1.2.17.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-yarn-server-common-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/arpack_combined_all-0.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/parquet-jackson-1.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-cli-1.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/scala-parser-combinators_2.11-1.1.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/validation-api-1.1.0.Final.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/json4s-core_2.11-3.5.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-digester-1.8.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-annotations-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/protobuf-java-2.5.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/javassist-3.18.1-GA.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-auth-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/javax.annotation-api-1.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/xmlenc-0.52.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-xc-1.9.13.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jul-to-slf4j-1.7.16.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jetty-util-6.1.26.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-module-scala_2.11-2.6.7.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/arrow-vector-0.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/orc-shims-1.5.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-sketch_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-yarn-client-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jersey-container-servlet-2.22.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/metrics-graphite-3.1.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/paranamer-2.8.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/orc-core-1.5.2-nohive.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/chill_2.11-0.9.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-mllib_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/parquet-format-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/htrace-core-3.0.4.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-compress-1.8.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/macro-compat_2.11-1.1.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/chill-java-0.9.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/avro-ipc-1.8.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/parquet-common-1.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/univocity-parsers-2.7.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-math3-3.4.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-io-2.4.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/parquet-hadoop-1.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/netty-all-4.1.17.Final.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-mapreduce-client-app-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-core-asl-1.9.13.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/breeze_2.11-0.13.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/osgi-resource-locator-1.0.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/core-1.1.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/xbean-asm6-shaded-4.8.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/py4j-0.10.7.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-catalyst_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/shapeless_2.11-2.3.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/curator-framework-2.6.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-codec-1.10.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/guava-14.0.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-beanutils-core-1.8.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/snappy-java-1.1.7.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-kvstore_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hppc-0.7.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/aopalliance-repackaged-2.4.0-b34.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/curator-client-2.6.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spire_2.11-0.13.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/zookeeper-3.4.6.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-client-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/arrow-memory-0.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/avro-mapred-1.8.2-hadoop2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-collections-3.2.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/api-asn1-api-1.0.0-M20.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/apacheds-kerberos-codec-2.0.0-M15.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-databind-2.6.7.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-module-paranamer-2.7.9.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hk2-locator-2.4.0-b34.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/parquet-column-1.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jcl-over-slf4j-1.7.16.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/scala-compiler-2.11.12.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-crypto-1.0.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jersey-container-servlet-core-2.22.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/avro-1.8.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/api-util-1.0.0-M20.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/compress-lzf-1.0.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/javax.ws.rs-api-2.0.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/metrics-core-3.1.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-common-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-net-3.1.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/gson-2.2.4.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-launcher_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/ivy-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/javax.inject-2.4.0-b34.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/metrics-json-3.1.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-lang-2.6.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/lz4-java-1.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/antlr4-runtime-4.7.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-repl_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/slf4j-api-1.7.16.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/parquet-encoding-1.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/zstd-jni-1.3.2-2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/leveldbjni-all-1.8.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-mapreduce-client-core-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-tags_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hk2-utils-2.4.0-b34.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/arrow-format-0.10.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/scala-reflect-2.11.12.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-sql_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spire-macros_2.11-0.13.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-mllib-local_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-mapper-asl-1.9.13.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/flatbuffers-1.2.0-3f79e055.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/kryo-shaded-4.0.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jersey-common-2.22.2.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/aircompressor-0.10.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/netty-3.9.9.Final.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/slf4j-log4j12-1.7.16.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/metrics-jvm-3.1.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-network-common_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-core_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/hadoop-yarn-api-2.6.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jackson-annotations-2.6.7.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/json4s-jackson_2.11-3.5.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/spark-network-shuffle_2.11-2.4.0.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/joda-time-2.9.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/commons-lang3-3.5.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/json4s-scalap_2.11-3.5.3.jar
file:/Users/hay531/Downloads/spark-2.4.0/assembly/target/scala-2.11/jars/jersey-server-2.22.2.jar
However, there was no conflicting jar name in the list. So:
1. Why is it that I'm getting an existing classpath entries conflict?
2. How can I resolve (1)?
3. In case there actually are existing conflicts, how do I force Scala/Spark to load the jar I specify on the CLI?
Weird problem: while my Play application tries to insert/update records in some MongoDB collections using ReactiveMongo, the operation seems to fail with a mysterious message, but the record actually does get inserted/updated.
More info:
Inserting into the problematic collections from the mongo console works well
Reading from all collections works well
Reading from and writing to other collections in the same DB works well
Writing to the problematic collections used to work
Error message is:
play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[LastError: DatabaseException['<none>']]]
at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:280)
at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:206)
at play.core.server.netty.PlayRequestHandler$$anonfun$2$$anonfun$apply$1.applyOrElse(PlayRequestHandler.scala:100)
at play.core.server.netty.PlayRequestHandler$$anonfun$2$$anonfun$apply$1.applyOrElse(PlayRequestHandler.scala:99)
at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:344)
at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:343)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:70)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
Caused by: reactivemongo.api.commands.LastError: DatabaseException['<none>']
Using ReactiveMongo 0.11.14, Play 2.5.4, Scala 2.11.7, MongoDB 3.4.0.
Thanks!
UPDATE - The mystery thickens!
Based on #Yaroslav_Derman's answer, I added a .recover clause, like so:
collectionRef.flatMap(c =>
  c.update(BSONDocument("_id" -> publicationWithId.id.get), publicationWithId.asInstanceOf[PublicationItem], upsert = true))
  .map(wr => {
    Logger.warn("Write Result: " + wr)
    Logger.warn("wr.inError: " + wr.inError)
    Logger.warn("*************")
    publicationWithId
  }).recover({
    case de: DatabaseException => {
      Logger.warn("DatabaseException: " + de.getMessage())
      Logger.warn("Cause: " + de.getCause())
      Logger.warn("Code: " + de.code)
      publicationWithId
    }
  })
The recover clause does get called. Here's the log:
[info] application - Saving pub t3
[warn] application - *************
[warn] application - Saving publication Publication(Some(BSONObjectID("5848101d7263468d01ff390d")),t3,2016-12-07,desc,auth,filename,None)
[info] application - Resolving database...
[info] application - Resolving database...
[warn] application - DatabaseException: DatabaseException['<none>']
[warn] application - Cause: null
[warn] application - Code: None
So no cause, no code, message is "'<none>'", but still an error. What gives?
I tried to move to 0.12, but that caused some compilation errors across the app, plus I'm not sure that would solve the problem. So I'd like to understand what's wrong first.
UPDATE #2:
Migrated to reactive-mongo 0.12.0. Problem persists.
Problem solved by downgrading to MongoDB 3.2.8. It turns out ReactiveMongo 0.12.0 is not compatible with MongoDB 3.4.
Thanks everyone who looked into this.
For Play ReactiveMongo 0.12.0 you can do it like this:
def appsDb = reactiveMongoApi.database.map(_.collection[JSONCollection](DesktopApp.COLLECTION_NAME))
def save(id: String, user: User, body: JsValue) = {
  val version = (body \ "version").as[String]
  val app = DesktopApp(id, version, user)
  appsDb.flatMap(
    _.insert(app)
      .map(_ => app)
      .recover(processError)
  )
}
def processError[T]: PartialFunction[Throwable, T] = {
  case ex: DatabaseException if ex.code.exists(Set(10054, 10056, 10058, 10107, 13435, 13436)) =>
    // custom exception, processed in the error handler
    throw new AppException(ResponseCode.ALREADY_EXISTS, "Entity already exists")
  case ex: DatabaseException if ex.code.exists(Set(10057, 15845, 16550)) =>
    // custom exception, processed in the error handler
    throw new AppException(ResponseCode.ENTITY_NOT_FOUND, "Entity not found")
  case ex: Exception =>
    // custom exception, processed in the error handler
    throw new InternalServerErrorException(ex.getMessage)
}
You can also add logging inside the processError method.
LastError was deprecated in 0.11, replaced by WriteResult.
LastError does not actually mean an error; it can represent a successful result. You need to check the inError property of the LastError object to detect whether it is a real error. As far as I can see, the '<none>' error message gives a good chance that this is not an error.
Here is an example of how it was in 0.10: http://reactivemongo.org/releases/0.10/documentation/tutorial/write-documents.html
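For instance, sticking to the wr.inError check that the question's own logging already uses, a sketch of telling a genuine failure apart from a successful LastError:

c.update(BSONDocument("_id" -> publicationWithId.id.get), publicationWithId.asInstanceOf[PublicationItem], upsert = true)
  .map { wr =>
    // wr is the LastError/WriteResult; a '<none>' message with inError == false is a success
    if (wr.inError) Logger.error("genuine write failure: " + wr)
    else Logger.info("write succeeded: " + wr)
  }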
For test purposes, I would like to use the BigQuery Connector to write Parquet Avro logs to BigQuery. At the time of writing there is no way to ingest Parquet directly from the UI, so I'm writing a Spark job to do so.
In Scala, for the time being, the job body is the following:
val events: RDD[RichTrackEvent] =
  readParquetRDD[RichTrackEvent, RichTrackEvent](sc, googleCloudStorageUrl)

val conf = sc.hadoopConfiguration
conf.set("mapred.bq.project.id", "myproject")

// Output parameters
val projectId = conf.get("fs.gs.project.id")
val outputDatasetId = "logs"
val outputTableId = "test"
val outputTableSchema = LogSchema.schema

// Output configuration
BigQueryConfiguration.configureBigQueryOutput(
  conf, projectId, outputDatasetId, outputTableId, outputTableSchema
)
conf.set(
  "mapreduce.job.outputformat.class",
  classOf[BigQueryOutputFormat[_, _]].getName
)

events
  .mapPartitions { items =>
    val gson = new Gson()
    items.map(e => gson.fromJson(e.toString, classOf[JsonObject]))
  }
  .map(x => (null, x))
  .saveAsNewAPIHadoopDataset(conf)
As the BigQueryOutputFormat isn't finding the Google credentials, it falls back to the metadata host to try to discover them, with the following stacktrace:
2016-06-13 11:40:53 WARN HttpTransport:993 - exception thrown while executing request
java.net.UnknownHostException: metadata
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
at sun.net.www.http.HttpClient.New(HttpClient.java:308)
at sun.net.www.http.HttpClient.New(HttpClient.java:326)
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
at com.google.api.client.http.javanet.NetHttpRequest.execute(NetHttpRequest.java:93)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:972)
at com.google.cloud.hadoop.util.CredentialFactory$ComputeCredentialWithRetry.executeRefreshToken(CredentialFactory.java:160)
at com.google.api.client.auth.oauth2.Credential.refreshToken(Credential.java:489)
at com.google.cloud.hadoop.util.CredentialFactory.getCredentialFromMetadataServiceAccount(CredentialFactory.java:207)
at com.google.cloud.hadoop.util.CredentialConfiguration.getCredential(CredentialConfiguration.java:72)
at com.google.cloud.hadoop.io.bigquery.BigQueryFactory.createBigQueryCredential(BigQueryFactory.java:81)
at com.google.cloud.hadoop.io.bigquery.BigQueryFactory.getBigQuery(BigQueryFactory.java:101)
at com.google.cloud.hadoop.io.bigquery.BigQueryFactory.getBigQueryHelper(BigQueryFactory.java:89)
at com.google.cloud.hadoop.io.bigquery.BigQueryOutputCommitter.<init>(BigQueryOutputCommitter.java:70)
at com.google.cloud.hadoop.io.bigquery.BigQueryOutputFormat.getOutputCommitter(BigQueryOutputFormat.java:102)
at com.google.cloud.hadoop.io.bigquery.BigQueryOutputFormat.getOutputCommitter(BigQueryOutputFormat.java:84)
at com.google.cloud.hadoop.io.bigquery.BigQueryOutputFormat.getOutputCommitter(BigQueryOutputFormat.java:30)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1135)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1078)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1078)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:357)
at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1078)
This is of course expected, but it should be able to use my service account and its key, since GoogleCredential.getApplicationDefault() returns appropriate credentials fetched from the GOOGLE_APPLICATION_CREDENTIALS environment variable.
As the connector seems to read credentials from the Hadoop configuration, which keys should be set so that it reads GOOGLE_APPLICATION_CREDENTIALS? Is there a way to configure the output format to use a provided GoogleCredential object?
If I understand your question correctly, you might want to set:
<name>mapred.bq.auth.service.account.enable</name>
<name>mapred.bq.auth.service.account.email</name>
<name>mapred.bq.auth.service.account.keyfile</name>
<name>mapred.bq.project.id</name>
<name>mapred.bq.gcs.bucket</name>
Here, the mapred.bq.auth.service.account.keyfile should point to the full file path to the older-style "P12" keyfile; alternatively, if you're using the newer "JSON" keyfiles, you should replace the "email" and "keyfile" entries with the single mapred.bq.auth.service.account.json.keyfile key:
<name>mapred.bq.auth.service.account.enable</name>
<name>mapred.bq.auth.service.account.json.keyfile</name>
<name>mapred.bq.project.id</name>
<name>mapred.bq.gcs.bucket</name>
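These keys can go in the Hadoop site XML, or be set programmatically on the same Hadoop configuration the job already mutates. A sketch; the keyfile path, project id, and bucket name are placeholders:

val conf = sc.hadoopConfiguration
conf.set("mapred.bq.auth.service.account.enable", "true")
conf.set("mapred.bq.auth.service.account.json.keyfile", "/path/to/service-account.json") // placeholder
conf.set("mapred.bq.project.id", "myproject") // placeholder
conf.set("mapred.bq.gcs.bucket", "mybucket")  // placeholder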
Also, you might want to take a look at https://github.com/spotify/spark-bigquery - which is a much more civilised way of working with BQ and Spark. The setGcpJsonKeyFile method used in this case takes the same JSON file you'd set for mapred.bq.auth.service.account.json.keyfile when using the BQ connector for Hadoop.
I am using Neo4j (embedded) Enterprise edition 1.9.4 along with the Scala-Neo4j wrapper in my project. I tried to back up the Neo4j data using Java as below:
def backup_data() {
  val backupPath: File = new File("D:/neo4j-enterprise-1.9.4/data/backup/")
  val backup = OnlineBackup.from("127.0.0.1")
  if (backupPath.list().length > 0) {
    backup.incremental(backupPath.getPath())
  } else {
    backup.full(backupPath.getPath())
  }
}
It is working fine for the full backup, but the incremental backup part throws a NullPointerException.
Where did I go wrong?
EDIT
Building the GraphDatabase instance through the Scala-Neo4j wrapper:
class MyNeo4jClass extends SomethingClass with Neo4jWrapper with EmbeddedGraphDatabaseServiceProvider {
def neo4jStoreDir = "/tmp/temp-neo-test"
. . .
}
Stacktrace
Exception in thread "main" java.lang.NullPointerException
at org.neo4j.consistency.checking.OwnerChain$3.checkReference(OwnerChain.java:111)
at org.neo4j.consistency.checking.OwnerChain$3.checkReference(OwnerChain.java:106)
at org.neo4j.consistency.report.ConsistencyReporter$DiffReportHandler.checkReference(ConsistencyReporter.java:330)
at org.neo4j.consistency.report.ConsistencyReporter.dispatchReference(ConsistencyReporter.java:109)
at org.neo4j.consistency.report.PendingReferenceCheck.checkReference(PendingReferenceCheck.java:50)
at org.neo4j.consistency.store.DirectRecordReference.dispatch(DirectRecordReference.java:39)
at org.neo4j.consistency.report.ConsistencyReporter$ReportInvocationHandler.forReference(ConsistencyReporter.java:236)
at org.neo4j.consistency.report.ConsistencyReporter$ReportInvocationHandler.dispatchForReference(ConsistencyReporter.java:228)
at org.neo4j.consistency.report.ConsistencyReporter$ReportInvocationHandler.invoke(ConsistencyReporter.java:192)
at $Proxy17.forReference(Unknown Source)
at org.neo4j.consistency.checking.OwnerChain.check(OwnerChain.java:143)
at org.neo4j.consistency.checking.PropertyRecordCheck.checkChange(PropertyRecordCheck.java:57)
at org.neo4j.consistency.checking.PropertyRecordCheck.checkChange(PropertyRecordCheck.java:35)
at org.neo4j.consistency.report.ConsistencyReporter.dispatchChange(ConsistencyReporter.java:101)
at org.neo4j.consistency.report.ConsistencyReporter.forPropertyChange(ConsistencyReporter.java:382)
at org.neo4j.consistency.checking.incremental.StoreProcessor.checkProperty(StoreProcessor.java:61)
at org.neo4j.consistency.checking.AbstractStoreProcessor.processProperty(AbstractStoreProcessor.java:95)
at org.neo4j.consistency.store.DiffRecordStore$DispatchProcessor.processProperty(DiffRecordStore.java:207)
at org.neo4j.kernel.impl.nioneo.store.PropertyStore.accept(PropertyStore.java:83)
at org.neo4j.kernel.impl.nioneo.store.PropertyStore.accept(PropertyStore.java:43)
at org.neo4j.consistency.store.DiffRecordStore.accept(DiffRecordStore.java:159)
at org.neo4j.kernel.impl.nioneo.store.RecordStore$Processor.applyById(RecordStore.java:180)
at org.neo4j.consistency.store.DiffStore.apply(DiffStore.java:73)
at org.neo4j.kernel.impl.nioneo.store.StoreAccess.applyToAll(StoreAccess.java:174)
at org.neo4j.consistency.checking.incremental.IncrementalDiffCheck.execute(IncrementalDiffCheck.java:43)
at org.neo4j.consistency.checking.incremental.DiffCheck.check(DiffCheck.java:39)
at org.neo4j.consistency.checking.incremental.intercept.CheckingTransactionInterceptor.complete(CheckingTransactionInterceptor.java:160)
at org.neo4j.kernel.impl.transaction.xaframework.InterceptingXaLogicalLog$1.intercept(InterceptingXaLogicalLog.java:79)
at org.neo4j.kernel.impl.transaction.xaframework.XaLogicalLog$LogDeserializer.readAndWriteAndApplyEntry(XaLogicalLog.java:1120)
at org.neo4j.kernel.impl.transaction.xaframework.XaLogicalLog.applyTransaction(XaLogicalLog.java:1292)
at org.neo4j.kernel.impl.transaction.xaframework.XaResourceManager.applyCommittedTransaction(XaResourceManager.java:766)
at org.neo4j.kernel.impl.transaction.xaframework.XaDataSource.applyCommittedTransaction(XaDataSource.java:246)
at org.neo4j.com.ServerUtil.applyReceivedTransactions(ServerUtil.java:423)
at org.neo4j.backup.BackupService.unpackResponse(BackupService.java:453)
at org.neo4j.backup.BackupService.incrementalWithContext(BackupService.java:388)
at org.neo4j.backup.BackupService.doIncrementalBackup(BackupService.java:286)
at org.neo4j.backup.BackupService.doIncrementalBackup(BackupService.java:273)
at org.neo4j.backup.OnlineBackup.incremental(OnlineBackup.java:147)
at Saddahaq.User_node$.backup_data(User_node.scala:1637)
at Saddahaq.User_node$.main(User_node.scala:2461)
at Saddahaq.User_node.main(User_node.scala)
After the backup is taken, the backup target is checked for consistency. The incremental version of the consistency checker currently suffers from a bug that leads to the observed NPE.
Workaround: either always take full backups with backup.full or prevent consistency checking on incremental backups by using
backup.incremental(backupPath.getPath(), false);
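Applied to the method from the question, the workaround looks like this (a sketch; the extra boolean argument simply disables the consistency check on the incremental path):

def backup_data() {
  val backupPath: File = new File("D:/neo4j-enterprise-1.9.4/data/backup/")
  val backup = OnlineBackup.from("127.0.0.1")
  if (backupPath.list().length > 0) {
    backup.incremental(backupPath.getPath(), false) // skip the buggy consistency check
  } else {
    backup.full(backupPath.getPath())
  }
}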