NoSuchMethodError when using Spark and IntelliJ - scala

I'm new to Scala and Spark. I've been frustrated by how hard it has been to get things working with IntelliJ. Currently, I can't get the code below to run. I'm sure it's something simple, but I can't figure it out.
I'm trying to run:
import org.apache.spark.{SparkConf, SparkContext}

object TestScala {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setAppName("Datasets Test")
    conf.setMaster("local[2]")
    val sc = new SparkContext(conf)
    println(sc)
  }
}
The error I get is:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1413)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
at TestScala$.main(TestScala.scala:13)
at TestScala.main(TestScala.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
My build.sbt file:
name := "sparkBook"
version := "1.0"
scalaVersion := "2.12.1"

Change your scalaVersion to 2.11.8 and add the Spark dependency to your build.sbt:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"

One more scenario: IntelliJ points to Scala 2.12.4 while all of the Maven/sbt dependencies are 2.11.8 (i.e. the Scala dependency version is 2.11.x).
I stepped back from 2.12.4 to 2.11.8 under Global Libraries in the IntelliJ UI, and it started working.
Details:
My Maven pom.xml pointed to 2.11.8, but in IntelliJ the SDK under Global Libraries was 2.12.4, which was causing
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
I stepped back to 2.11.8 in Global Libraries.
That's it. Problem solved. No more error when executing that program.
Conclusion: fixing the Maven dependencies alone does not solve the problem; along with that, the Scala SDK has to be configured in Global Libraries, since the error occurs while running a Spark program locally and is related to the IntelliJ runtime.

If you use Spark 2.4.3, you need to use Scala 2.11, even though the Spark website (https://spark.apache.org/docs/latest/) says to use Scala 2.12. This avoids
scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
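A minimal build.sbt sketch consistent with that advice (the 2.11.12 patch release is just an example):

scalaVersion := "2.11.12"

// %% picks the _2.11 artifact based on the scalaVersion above
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"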

Related

How do you properly set up Scala Spark libraryDependencies with the correct version of Scala?

I'm new to Scala and Spark and I'm trying to create an example project using IntelliJ. During project creation I chose Scala and sbt with Scala version 2.12, but when I tried adding spark-streaming version 2.3.2 it kept erroring out, so I Googled around and on Apache's website I found the sbt config shown below, and I'm still getting the same error.
Error: Could not find or load main class SparkStreamingExample
Caused by: java.lang.ClassNotFoundException: SparkStreamingExample
How can I determine which version of Scala works with which version of the Spark dependencies?
name := "SparkStreamExample"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.11" % "2.3.2"
)
My object is very basic and doesn't have much to it...
import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext

object SparkStreamingExample extends App {
  println("SPARK Streaming Example")
}
You can see the version of Scala that is supported by Spark in the Spark documentation.
As of this writing, the documentation says:
Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.3.2 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
Notice that only Scala 2.11.x is supported.
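For reference, a sketch of the same build.sbt using %% so the Scala binary suffix always tracks scalaVersion (versions taken from the question):

name := "SparkStreamExample"

version := "0.1"

// Spark 2.3.2 is published for Scala 2.11.x only
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // %% appends the _2.11 suffix derived from scalaVersion
  "org.apache.spark" %% "spark-streaming" % "2.3.2"
)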

SBT dependency error for Scala Spark on Intellij

I am a noob to Spark and IntelliJ. I want to run Spark using Scala.
I initially installed Scala 2.12 and created the build.sbt accordingly. Then I got a NoSuchMethod runtime error on
sc = new SparkContext(conf)
Following the solution posted in NoSuchMethodError when using Spark and IntelliJ, I used Scala version 2.11.3 while creating the project and used this build.sbt:
version := "0.1"
scalaVersion := "2.11.3"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
I am now getting the error
Error:scalac: No 'scala-library*.jar' in Scala compiler classpath in Scala SDK SBT: org.scala-lang:scala-library:2.11.3:jar
This is the library shown in the External Libraries section.
I tried creating the project from scratch and the Invalidate Caches/Restart option. Same result.
I also tried downloading via Maven through File -> Project Structure. It only found spark-core 2.10 and showed the same NoSuchMethod error.
Can anyone identify the problem?
The error clearly says that it could not find the scala-library*.jar file.
So go to
Project Structure -> Modules
and check for these jar files:
scala-compiler.jar, scala-library.jar, scala-reflect.jar
If they are absent from the module, add them manually, or reinstall Scala and the Scala plugin.
Hope this works!

Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)

Any reason why I get this error? Initially the IDE plugin for Scala was 2.12.3, but since I'm working with Spark 2.2.0, I manually changed it to Scala 2.11.11.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/19 12:08:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at scala.xml.Null$.<init>(Null.scala:23)
at scala.xml.Null$.<clinit>(Null.scala)
at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:84)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:221)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:163)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
at sparkEnvironment$.<init>(Ticket.scala:33)
at sparkEnvironment$.<clinit>(Ticket.scala)
at Ticket$.main(Ticket.scala:39)
at Ticket.main(Ticket.scala)
Make sure your Spark version is compatible with the corresponding Scala version.
The error is common when using the Scala 2.12 series with a version of Spark built for Scala 2.11.
You can try using the 2.11 series of Scala with Spark, i.e.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
As you can see in this dependency, spark-core_2.11 is associated with Scala version 2.11.
That's why it's safer (more compatible) to use %% and avoid hardcoding the version of Scala in Spark dependencies. Let the tool resolve the required Scala version for you automatically as follows:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
The above declaration will automatically infer the Scala version for you.
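For example, with the scalaVersion below the two declarations resolve to the same spark-core_2.11 artifact; only the first one has to be edited by hand if the Scala version ever changes (a sketch, not tied to any particular project):

scalaVersion := "2.11.12"

// hard-coded Scala binary suffix: must be kept in sync with scalaVersion manually
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

// %% derives the _2.11 suffix from scalaVersion automatically
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"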
Scala version 2.12 gave me a similar error; however, after removing extends App and adding a main method, everything worked fine.
object SparkDemo extends App {}
Just remove extends App and add a main method:
object SparkDemo {
  def main(args: Array[String]): Unit = {}
}
The Spark documentation itself suggests using a main function rather than extends App:
https://spark.apache.org/docs/2.4.0/quick-start.html#:~:text=Note%20that%20applications%20should%20define%20a%20main()%20method%20instead%20of%20extending%20scala.App.%20Subclasses%20of%20scala.App%20may%20not%20work%20correctly.
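For illustration, a minimal sketch of a Spark program structured that way (the app name and local[2] master are placeholders, assuming the Spark 2.2.0 / Scala 2.11 setup above):

import org.apache.spark.{SparkConf, SparkContext}

object SparkDemo {
  def main(args: Array[String]): Unit = {
    // all Spark setup happens inside main, not in a delayed-init App body
    val conf = new SparkConf().setAppName("SparkDemo").setMaster("local[2]")
    val sc = new SparkContext(conf)
    println(sc.version)
    sc.stop()
  }
}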
This problem could be caused by any dependency as well; if a dependency was compiled against Scala 2.11, you will need to downgrade your version of Scala.
It's a Spark dependency issue: Scala 2.12 is compatible with Spark 3, and Scala 2.11 is compatible with Spark 2.

Activator UI, java.lang.NoClassDefFoundError:

I'm trying to create a basic Scala project in IntelliJ using the Activator UI.
I'm importing the project into the IDE and it compiles well.
But when I try to run simple code I get:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at reactivemongo.api.MongoDriver$.reactivemongo$api$MongoDriver$$defaultSystem(api.scala:378)
at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
at scala.Option.getOrElse(Option.scala:120)
at reactivemongo.api.MongoDriver.<init>(api.scala:305)
at example.App$.main(App.scala:10)
at example.App.main(App.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
When the project is loaded there is an error in the project structure
sbt:scala 2.11.2 not in use
What went wrong with the Activator UI IntelliJ project generation?
thanks
miki
I came across this when trying to run Spark. This is an incompatibility error between the Scala version that was used to compile the dependency and the Scala version used to run your project.
Removing my Scala version specification was a hacky way to solve the problem:
// build.sbt
name := "SparkTest"
version := "1.0"
scalaVersion := "2.11.4" <-- remove this
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"

How to make sure only one Akka JAR makes it on the classpath?

I'm running into an akka.ConfigurationException: Akka JAR version [2.1.0] does not match the provided config version [2.2.3]. My speculation is that, somehow, $SCALA_HOME/lib/akka-actors.jar is making it onto the classpath along with the Akka JAR managed by SBT.
I created a simple standalone SBT project to demonstrate the issue (see below). My $SCALA_HOME points to Scala 2.10.3. In Build.scala I'm explicitly setting scalaHome to $SCALA_HOME.
project/Build.scala
import sbt._
import sbt.Keys._

object ApplicationBuild extends Build {

  val appName = "akka-version-problem"
  val appVersion = "0.1-SNAPSHOT"

  val getJars = TaskKey[Unit]("get-jars")

  val getJarsTask = getJars <<= (target, fullClasspath in Compile) map { (target, cp) =>
    println(cp map { _.data } filter { _.getAbsolutePath.contains("lib") } mkString "\n")
    println(cp map { _.data } filter { _.getAbsolutePath.contains("akka") } mkString "\n")
  }

  lazy val root = Project("root", file(".")).settings(
    scalaVersion := "2.10.3",
    scalaHome := Some(file(System.getenv("SCALA_HOME"))),
    autoScalaLibrary := false,
    libraryDependencies ++= Seq(
      "com.typesafe.akka" %% "akka-actor" % "2.2.3"
    ),
    getJarsTask
  )
}
src/main/scala/com/example/Main.scala
package com.example

import akka.actor.ActorSystem

object Main extends App {
  val system = ActorSystem("AkkaDemoSystem")
  system.shutdown()
}
When I run sbt get-jars I don't see $SCALA_HOME/lib/akka-actors.jar in the output
When I run sbt run I get:
[error] (run-main) 44c2d48a-8899-43f9-804b-55cbf739b08bakka.ConfigurationException: Akka JAR version [2.1.0] does not match the provided config version [2.2.3]
44c2d48a-8899-43f9-804b-55cbf739b08bakka.ConfigurationException: Akka JAR version [2.1.0] does not match the provided config version [2.2.3]
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:172)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:465)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:93)
at com.example.Main$delayedInit$body.apply(Main.scala:6)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.example.Main$.main(Main.scala:5)
at com.example.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Am I missing something obvious? Has anyone else run into this?
The issue here is that akka-actors.jar is part of the Scala distribution. When scalaHome points at that distribution, sbt by default includes all of its jars on your classpath. So you wind up with both the Akka version from the Scala distribution and the one you depend on directly.
Not only that, you're including scala-actors, scala-reflect, etc. If this is what you want, great.
If you'd like to prevent yourself from using more jars than you want, you should create the scalaInstance you use directly. See http://www.scala-sbt.org/release/api/index.html#sbt.ScalaInstance$ for the construction methods (You can see one of these uses the scalaHome).
I'd recommend doing something like (note: sbt 0.13 code):
scalaInstance := {
  val homeDir = file("/path/to/scala-home")
  // every jar under the Scala home, minus the bundled Akka jar(s)
  val jars = (homeDir ** "*.jar").get
  val notAkka = jars filterNot (_.getName contains "akka")
  val scalaLib = ScalaInstance.scalaJar(homeDir, "scala-library.jar")
  val compilerLib = ScalaInstance.scalaJar(homeDir, "scala-compiler.jar")
  ScalaInstance(scalaLib, compilerLib, notAkka: _*)(state.classLoaderCache.apply _)
}
This will give you a deprecation warning. That's because we assume that if you're using scalaHome you want all the default modules from the Scala distribution. Since that isn't the case here, you can ignore it.
As Viktor says, I'd recommend just using sbt's resolution mechanism for Scala. The caching of classloaders is done by default if you do this, and it will only download the artifact once. The reality is that sbt has to download a version of Scala anyway to compile your build before it knows you've configured a scalaHome for the project (since you specify this using Scala itself).
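A sketch of that simpler setup for the build above: drop scalaHome and autoScalaLibrary and let sbt resolve Scala 2.10.3 itself, so only the managed Akka 2.2.3 jar ends up on the classpath:

lazy val root = Project("root", file(".")).settings(
  // sbt downloads and manages this Scala version; nothing from $SCALA_HOME/lib is added
  scalaVersion := "2.10.3",
  libraryDependencies ++= Seq(
    "com.typesafe.akka" %% "akka-actor" % "2.2.3"
  ),
  getJarsTask
)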
Hope that helps!