json4s throws NoClassDefFoundError: java/sql/Timestamp - scala

When following the json4s instructions on Serialization I keep getting the following error:
java.lang.NoClassDefFoundError: java/sql/Timestamp
at org.json4s.reflect.Reflector$.<init>(Reflector.scala:22)
at org.json4s.reflect.Reflector$.<clinit>(Reflector.scala)
at org.json4s.Extraction$.internalDecomposeWithBuilder(Extraction.scala:160)
at org.json4s.Extraction$.decomposeWithBuilder(Extraction.scala:65)
at org.json4s.native.Serialization$.write(Serialization.scala:43)
at org.json4s.native.Serialization$.write(Serialization.scala:37)
... 34 elided
Caused by: java.lang.ClassNotFoundException: java.sql.Timestamp
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:436)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 40 more
My case class does not have any java.sql.Timestamp values.
What is needed to set up Scala 2.12 and OpenJDK 12 with json4s properly?
I have set up my sbt build in compliance with the instructions, and I am able to import java.sql.Timestamp in my project.
Thank you in advance for your consideration and response.
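A hedged note for anyone hitting this on JDK 9+: java.sql.Timestamp lives in the java.sql module, not in java.base, so a JVM that resolves only a limited set of modules can fail to load it at runtime even though it imports fine at compile time. A minimal build.sbt sketch of a workaround, assuming the error occurs when running through sbt (whether that matches your setup is an assumption):

// Hedged build.sbt sketch: fork the run and explicitly add the
// java.sql module, which contains java.sql.Timestamp on JDK 9+.
fork := true
javaOptions += "--add-modules=java.sql"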

Related

How to solve the SparkSession class not found in an IntelliJ project

While submitting a Scala Spark project I am getting this error:
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
After many repeated attempts I am not able to solve this problem. Can anybody help me submit my project?
I have tried invalidating the cache and submitting the project again, but the problem is still not solved.
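A hedged guess at the usual cause: org.apache.spark.sql.SparkSession is defined in the spark-sql artifact, not in spark-core, so this error typically means spark-sql is missing from the runtime classpath (for example, it is scoped as "provided" while running from the IDE, or only spark-core is declared). A minimal build.sbt sketch; the version number is illustrative, not taken from the question:

// Hedged build.sbt sketch: SparkSession lives in spark-sql.
// Match the version to your cluster; 3.1.1 is only an example.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.1.1",
  "org.apache.spark" %% "spark-sql"  % "3.1.1"
)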

org/bson/conversions/Bson error in Apache Zeppelin

I have installed Zeppelin 0.9.0 on my Ubuntu 20.04 machine.
In the interpreter's spark.jars setting I have mongo-spark-connector, mongo-java-driver and bson.
I successfully imported com.mongodb.spark, org.bson.Document and other necessary packages, but when I try to execute
val rdd = MongoSpark.load(sc)
the following error appears:
java.lang.NoClassDefFoundError: org/bson/conversions/Bson
... 66 elided
Caused by: java.lang.ClassNotFoundException: org.bson.conversions.Bson
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
... 66 more
Also, I have Spark version 3.1.1, Java version 11.0.10, and Scala version 2.12.10.
I found a solution.
I put the following jars in the interpreter/spark/dep folder and it works:
bson-4.3.1.jar
mongodb-driver-core-4.3.1.jar
mongo-java-driver-3.12.10.jar
mongo-spark-connector_2.12-3.0.1.jar
zeppelin-mongodb-0.9.0.jar
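A possible alternative (an assumption about the setup, not part of the original answer): the Zeppelin Spark interpreter can also resolve dependencies from Maven coordinates in the interpreter's Dependencies setting, which pulls in transitive jars such as bson automatically. The connector used above corresponds to the coordinate:
org.mongodb.spark:mongo-spark-connector_2.12:3.0.1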

Exception java.lang.NoClassDefFoundError: scala/collection/SeqLike in a Scala project that does not refer to the SeqLike class - why?

I am using a Java 12 / Scala 2.13 project https://github.com/dominique-unruh/scala-isabelle/ for digesting Isabelle/HOL source code. This is not easy stuff, but the project compiles and the tests execute correctly.
Then I imported the Sext library https://github.com/nikita-volkov/sext/ for pretty-printing a Scala object tree (as there are no built-in pretty-printing or toJson tree methods in Scala) via this addition to build.sbt:
libraryDependencies += "com.github.nikita-volkov" % "sext" % "0.2.4"
The project compiles but
val ctxt = Context("Main")
println(ctxt.treeString)
gives this error message:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/SeqLike
at sext.package$.SextAnyTreeString(package.scala:109)
at de.unruh.isabelle.Example2theory$.main(Example2theory.scala:151)
at de.unruh.isabelle.Example2theory.main(Example2theory.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.SeqLike
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 3 more
When I check the Scala 2.13 documentation https://www.scala-lang.org/files/archive/api/2.13.5/scala/collection/ there is indeed no SeqLike, while 2.12 has it: https://www.scala-lang.org/api/2.12.5/scala/collection/SeqLike.html
But where does this SeqLike come from in my Scala code that compiles under 2.13? The Sext library does not refer to SeqLike either. How is it possible that such a class is required?
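A hedged explanation: the reference comes not from your source code but from the compiled Sext jar. A plain % dependency resolves a single fixed artifact, which here appears to have been compiled against an older Scala, so its bytecode still links against scala.collection.SeqLike; your own code compiles fine because it never mentions that class, and the broken linkage only surfaces at runtime. A build.sbt sketch of the usual fix, assuming a Scala 2.13 cross-build of the library exists (that availability is an assumption, not something I verified):

// Hedged sketch: %% appends the Scala binary-version suffix (e.g. _2.13),
// so sbt resolves an artifact actually compiled for your Scala version
// instead of a jar whose bytecode references the removed SeqLike.
libraryDependencies += "com.github.nikita-volkov" %% "sext" % "0.2.4"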

Exception while running StreamingContext.start()

Exception while running Python code on Windows 10. I am using Apache Kafka and PySpark.
Python code snippet to read data from Kafka:
import sys
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

ssc = StreamingContext(sc, 60)  # sc is an existing SparkContext; 60-second batches
zkQuorum, topic = sys.argv[1:]
kvs = KafkaUtils.createStream(ssc, zkQuorum, "spark-streaming-consumer", {topic: 1})
lines = kvs.map(lambda x: [x[0], x[1]])
lines.pprint()
lines.foreachRDD(SaveRecord)  # SaveRecord is defined elsewhere in the script
ssc.start()
ssc.awaitTermination()
Exception while running the code
Exception in thread "streaming-start" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
at org.apache.spark.streaming.kafka.KafkaReceiver.<init>(KafkaInputDStream.scala:69)
at org.apache.spark.streaming.kafka.KafkaInputDStream.getReceiver(KafkaInputDStream.scala:60)
at org.apache.spark.streaming.scheduler.ReceiverTracker.$anonfun$launchReceivers$1(ReceiverTracker.scala:441)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:237)
at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.streaming.scheduler.ReceiverTracker.launchReceivers(ReceiverTracker.scala:440)
at org.apache.spark.streaming.scheduler.ReceiverTracker.start(ReceiverTracker.scala:160)
at org.apache.spark.streaming.scheduler.JobScheduler.start(JobScheduler.scala:102)
at org.apache.spark.streaming.StreamingContext.$anonfun$start$1(StreamingContext.scala:583)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.ThreadUtils$$anon$1.run(ThreadUtils.scala:145)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more
This may be due to an incompatible Scala version between your project and Spark. Make sure the Scala version in your project configuration matches the version that your Spark version supports.
Spark requires Scala 2.12; support for Scala 2.11 was removed in Spark 3.0.0.
It is also possible that a third-party jar (like dstream-twitter for a Twitter streaming application, or your Kafka streaming jar) is built for a Scala version that is unsupported in your application.
For instance, dstream-twitter_2.11-2.3.0-SNAPSHOT didn't work for me with Spark 3.0; it gave Exception in thread "streaming-start" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class. But when I updated to the dstream-twitter jar built for Scala 2.12, it solved the issue.
Make sure you get all the Scala versions right.
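A detail supporting this diagnosis (my addition, not part of the original answer): the $class suffix in Logging$class is the trait-implementation class encoding emitted by the Scala 2.11 compiler and earlier; Scala 2.12+ compiles traits to default methods and never generates it, so this error reliably points to a 2.11-built jar running on 2.12+ Spark. A hedged build.sbt sketch of a consistent dependency set on the Scala side; the versions are illustrative:

// Hedged sketch: keep the Scala suffix consistent across Spark and the
// Kafka connector. The old 0-8 connector was only published for Scala
// 2.11; the 0-10 connector has 2.12 builds.
scalaVersion := "2.12.10"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "3.0.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "3.0.0"
)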

FSDataInputStream ClassNotFoundException in Spark

I am new to Spark application programming, and therefore struggling here with this basic one.
I have the Scala IDE and have attached the relevant jar files from the latest Hadoop and Spark distributions. There is just one basic Scala object that I am working with:
hadoop - 2.7
spark - 2.0.0
I have attempted this in both scenarios, when the Hadoop processes are running on my laptop and also when they are not; the behaviour is the same. By the way, the Spark shell is not complaining about anything.
import org.apache.spark.SparkConf

object SparkAppTest {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Spark test")
    conf.setMaster("spark://master:7077")
    conf.setSparkHome("/hadoop/spark")
    conf.set("spark.driver.host", "localhost")
  }
}
When I try to "run" this using Eclipse -> Run As -> Scala Application, it fails with the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:65)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:60)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
at SparkAppTest$.main(SparkAppTest.scala:6)
at SparkAppTest.main(SparkAppTest.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
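A hedged note on the usual cause: org.apache.hadoop.fs.FSDataInputStream comes from the Hadoop client jars, which SparkConf already touches while loading its defaults (as the stack trace shows), so the error usually means the Hadoop jars never made it onto the run classpath when jars are attached by hand. A minimal sketch, assuming the project is switched to an sbt build instead of manually attached jars:

// Hedged build.sbt sketch: spark-core transitively pulls in the Hadoop
// client classes (including org.apache.hadoop.fs.FSDataInputStream),
// so a build-tool dependency avoids hand-picking Hadoop jars.
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"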