NoSuchMethodError while running Scalatest - scala

I have created a small program and, in order to test it, I wrote a small ScalaTest class. But when I try to execute the test I get the error below. Please advise:
java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
at org.scalatest.tools.Runner$.argTooShort$1(Runner.scala:1490)
at org.scalatest.tools.Runner$.parseReporterArgsIntoConfigurations(Runner.scala:1507)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:898)
at org.scalatest.tools.Runner$.run(Runner.scala:858)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:137)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

I found the resolution. Thanks to everyone who replied.
There was a problem with my ScalaTest version: I am using Scala 2.11, and my ScalaTest version was not compatible with that Scala version.
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.1" % "test"
Adding the line above to my .sbt file and refreshing the project fixed it. Now it works as expected.
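A minimal sketch (assuming sbt) of the same dependency written with %%, which appends the Scala binary suffix automatically so the ScalaTest artifact always matches the project's scalaVersion; the scalaVersion value shown is only illustrative:
// build.sbt -- sketch; %% resolves "scalatest" to scalatest_2.11 when scalaVersion is a 2.11.x release
scalaVersion := "2.11.8"   // illustrative 2.11 patch release
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.1" % "test"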

Related

Error while running Spark application in Intellij - NoClassDefFoundError com/google/common/util/concurrent/internal/InternalFutureFailureAccess

I have an Apache Spark 2.0 application written in Scala (2.11.12), built using sbt 1.2.8. When I try to run the app in IntelliJ (2020.3.2 Ultimate), I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/util/concurrent/internal/InternalFutureFailureAccess
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
.....
Caused by: java.lang.ClassNotFoundException: com.google.common.util.concurrent.internal.InternalFutureFailureAccess
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
From googling and searching Stack Overflow, it seems this is caused by some weird Guava dependency issue. I have added these to my Dependencies.scala:
dependencies += "com.google.guava" % "guava" % "30.1-jre"
dependencies += "com.google.guava" % "failureaccess" % "1.0"
That didn't solve the issue. I also tried adding "com.google.guava" % "listenablefuture" % "1.0" to the dependencies, but that didn't help either. I tried File -> Invalidate Caches/Restart in IntelliJ, but I still get the error.
Could someone please help?
In my case, adding com.google.guava:failureaccess as an external library to my project (File -> Project Structure -> Libraries) helped.
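If the project is built with sbt rather than fixed through the IntelliJ UI, a rough equivalent is to declare failureaccess directly in build.sbt. This is only a sketch; the failureaccess version shown is an assumption and should match what your Guava release expects:
// build.sbt -- sketch; Guava 30.x normally pulls failureaccess in transitively,
// so declaring it explicitly is a workaround for an evicted or missing transitive dependency.
libraryDependencies ++= Seq(
  "com.google.guava" % "guava"         % "30.1-jre",
  "com.google.guava" % "failureaccess" % "1.0.1"   // assumed version; verify against your Guava release
)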

NoClassDefFoundError: org/apache/hadoop/fs/StreamCapabilities while reading s3 Data with spark

I would like to run a simple Spark job on my local dev machine (through IntelliJ) that reads data from Amazon S3.
my build.sbt file:
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "com.amazonaws" % "aws-java-sdk" % "1.11.407",
  "org.apache.hadoop" % "hadoop-aws" % "3.1.1"
)
my code snippet:
val spark = SparkSession
  .builder
  .appName("test")
  .master("local[2]")
  .getOrCreate()

spark
  .sparkContext
  .hadoopConfiguration
  .set("fs.s3n.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")

val schema_p = ...

val df = spark
  .read
  .schema(schema_p)
  .parquet("s3a:///...")
And I get the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/StreamCapabilities
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2093)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2058)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2152)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2580)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2593)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2632)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2614)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:45)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:354)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:622)
at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:606)
at Test$.delayedEndpoint$Test$1(Test.scala:27)
at Test$delayedInit$body.apply(Test.scala:4)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at Test$.main(Test.scala:4)
at Test.main(Test.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.StreamCapabilities
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 41 more
When replacing s3a:/// with s3:/// I get another error: No FileSystem for scheme: s3.
As I am new to AWS, I do not know whether I should use s3:///, s3a:/// or s3n:///. I have already set up my AWS credentials with the aws CLI.
I do not have any Spark installation on my machine.
Thanks in advance for your help
I would start by looking at the S3A troubleshooting docs.
Do not attempt to "drop in" a newer version of the AWS SDK than the one the Hadoop version was built with. Whatever problem you have, changing the AWS SDK version will not fix things; it will only change the stack traces you see.
Whatever version of the hadoop-* JARs you have on your local Spark installation, you need exactly the same version of hadoop-aws, and exactly the same version of the AWS SDK that hadoop-aws was built with. Try mvnrepository for the details; a sketch of that pairing is below.
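For example, with sbt that pairing might look like the following sketch; the aws-java-sdk-bundle version is an assumption and must be replaced with whatever the chosen hadoop-aws release actually declares in its POM:
// build.sbt -- sketch; keep every hadoop-* artifact on one version and let the
// AWS SDK version be dictated by that hadoop-aws release (check mvnrepository).
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-aws"          % "3.1.1",
  "org.apache.hadoop" % "hadoop-common"       % "3.1.1",
  "com.amazonaws"     % "aws-java-sdk-bundle" % "1.11.271"  // assumed match for hadoop-aws 3.1.1; verify
)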
For me, it was solved by adding the following dependency in pom.xml, in addition to the above:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.1.1</version>
</dependency>
In my case I fixed it by choosing matching dependency versions:
"org.apache.spark" % "spark-core_2.11" % "2.4.0",
"org.apache.spark" % "spark-sql_2.11" % "2.4.0",
"org.apache.hadoop" % "hadoop-common" % "3.2.1",
"org.apache.hadoop" % "hadoop-aws" % "3.2.1"

com/mongodb/casbah/Imports$ ClassNotFound running spark-submit and Mongo

I'm having an issue when trying to run a jar using spark-submit. This is my sbt file:
name := "Reading From Mongo Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.mongodb" %% "casbah" % "2.5.0"
I'm using
sbt package
to create the jar file, and all looks good. Then I executed it this way:
spark-submit --class "ReadingFromMongo" --master local /home/bigdata1/ReadingFromMongoScala/target/scala-2.10/reading-from-mongo-project_2.10-1.0.jar
And got this error:
Error: application failed with exception
java.lang.NoClassDefFoundError: com/mongodb/casbah/Imports$
at ReadingFromMongo$.main(ReadingFromMongo.scala:6)
at ReadingFromMongo.main(ReadingFromMongo.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:577)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:174)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.mongodb.casbah.Imports$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 11 more
My ReadingFromMongo class is this one:
import com.mongodb.casbah.Imports._

object ReadingFromMongo {
  def main(args: Array[String]) {
    val mongoClient = MongoClient("mongocluster", 27017)
    val db = mongoClient("Grupo12")
    val coll = db("test")
    println("\n\nTotal: " + coll.count() + "\n")
  }
}
I don't know why this is happening. This is the first time I'm facing this kind of problem.
Hope someone can help me.
Thanks a lot.
sbt package creates a jar with only your code, excluding dependencies, so Spark does not know where to find the Mongo classes.
You need to either include Mongo and the other required dependencies on the classpath, or build a "fat jar" that includes the dependency classes.
The sbt-assembly plugin will help you if you choose the second approach; a sketch is below.
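A minimal sketch of the fat-jar route with sbt-assembly; the plugin version and merge strategy are illustrative, not taken from the question:
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt -- discard duplicate META-INF entries that commonly break the merge
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
Then run sbt assembly and pass the resulting *-assembly-*.jar to spark-submit instead of the jar produced by sbt package.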

AKKA 2.4.8 / Scala 2.12.0-M5 can't start system

I am trying to use Scala 2.12.0-M5 and AKKA 2.4.7 in a project, but I get this error when I try to start AKKA. I also tried using M4.
I am sure I must be missing something in my setup, as this clearly must work; it is pretty much exactly what I had working with 2.11.8 / 2.4.6.
Any help would be appreciated, thanks!
build.sbt:
name := "AKKA-2.4.8"
version := "1.0"
scalaVersion := "2.12.0-M5"
// https://mvnrepository.com/artifact/com.typesafe.akka/akka-actor_2.11
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.11" % "2.4.8"
code:
package testing
import akka.actor.ActorSystem
/**
* Created by on 7/8/16.<br>
* <br>
* AkkaActor demonstrates my problem when starting AKKA
*/
object AkkaActorStarter extends App {
val actorSystem = ActorSystem("testAkka")
}
Error:
[error] (run-main-0) java.lang.NoClassDefFoundError: scala/Product$class
...
Caused by: java.lang.ClassNotFoundException: scala.Product$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at akka.util.Timeout.<init>(Timeout.scala:13)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:171)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:522)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
at testing.AkkaActorStarter$.delayedEndpoint$testing$AkkaActorStarter$1(AkkaActorStarter.scala:11)
at testing.AkkaActorStarter$delayedInit$body.apply(AkkaActorStarter.scala:10)
at scala.Function0.apply$mcV$sp$(Function0.scala:34)
at scala.Function0.apply$mcV$sp(Function0.scala:34)
at scala.App.$anonfun$main$1$adapted(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:376)
at scala.App.main$(App.scala:76)
at scala.App.main(App.scala:74)
at testing.AkkaActorStarter.main(AkkaActorStarter.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
The problem is that you are using Akka built for Scala 2.11 (akka-actor_2.11) with Scala 2.12. Scala minor versions are not binary compatible, so you have to use the Akka library compiled for your exact Scala version, 2.12.0-M5: "com.typesafe.akka" % "akka-actor_2.12.0-M5" % "2.4.8". Or use %%, which will pick the proper artifact according to your scalaVersion: "com.typesafe.akka" %% "akka-actor" % "2.4.8".
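A corrected build.sbt based on that advice might look like the following sketch (the %% form is generally preferable because it tracks scalaVersion automatically):
name := "AKKA-2.4.8"
version := "1.0"
scalaVersion := "2.12.0-M5"
// %% appends the Scala binary suffix, resolving to akka-actor_2.12.0-M5 here
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.4.8"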

Activator UI, java.lang.NoClassDefFoundError

I'm trying to create a basic Scala project in IntelliJ by using the Activator UI.
I'm importing the project into the IDE and it compiles well.
But when I try to run simple code I get:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at reactivemongo.api.MongoDriver$.reactivemongo$api$MongoDriver$$defaultSystem(api.scala:378)
at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
at scala.Option.getOrElse(Option.scala:120)
at reactivemongo.api.MongoDriver.<init>(api.scala:305)
at example.App$.main(App.scala:10)
at example.App.main(App.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
When the project is loaded there is an error in the project structure:
sbt:scala 2.11.2 not in use
What went wrong with the Activator UI IntelliJ project generation?
thanks
miki
I came across this when trying to run Spark. It is an incompatibility between the Scala version that was used to compile the dependency and the Scala version used to run your project.
Removing my Scala version specification was a hacky way to solve the problem:
// build.sbt
name := "SparkTest"
version := "1.0"
scalaVersion := "2.11.4" <-- remove this
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
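A less hacky alternative, sketched here rather than taken from the original answer, is to keep an explicit scalaVersion that matches the dependency's Scala binary version and let %% select the artifact:
// build.sbt -- the question's spark-core 1.3.0 dependency targets Scala 2.10, so align scalaVersion with it
name := "SparkTest"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"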