NoSuchMethodError in Flink when using a dataset with a custom object array - Scala

I have a problem with Flink:
java.lang.NoSuchMethodError: org.apache.flink.api.java.typeutils.ObjectArrayTypeInfo.getInfoFor(Lorg/apache/flink/api/common/typeinfo/TypeInformation;)Lorg/apache/flink/api/java/typeutils/ObjectArrayTypeInfo;
at LowLevel.FlinkImplementation.FlinkImplementation$$anon$6.<init>(FlinkImplementation.scala:28)
at LowLevel.FlinkImplementation.FlinkImplementation.<init>(FlinkImplementation.scala:28)
at IRLogic.GmqlServer.<init>(GmqlServer.scala:15)
at it.polimi.App$.main(App.scala:20)
at it.polimi.App.main(App.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
The line causing the problem is this one:
implicit val regionTypeInformation =
api.scala.createTypeInformation[FlinkDataTypes.FlinkRegionType]
In the FlinkRegionType I have an Array of a custom object.
I developed the app with the Maven plugin in the IDE and everything works fine, but when I move to the Flink version I downloaded from the website I get the error above.
I am using Flink 0.9.
I thought that some library might be missing, but I am using Maven to handle everything. Moreover, reading through the code of ObjectArrayTypeInfo.java, it doesn't seem to be the problem.

A NoSuchMethodError commonly indicates a version mismatch between the libraries a Flink program was compiled against and the system the program is executed on, especially if the same code works in an IDE setup where the compile and execution libraries are the same.
In that case, you should check the version of your Flink dependencies, for example in the Maven POM file.
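If the POM versions look consistent, it can also help to check which JAR the failing class is actually loaded from at runtime. A minimal sketch (the VersionCheck object name is just for illustration):
import org.apache.flink.api.java.typeutils.ObjectArrayTypeInfo

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Print the location of the JAR that provides ObjectArrayTypeInfo at runtime,
    // so it can be compared with the Flink version declared in the POM.
    val source = classOf[ObjectArrayTypeInfo[_, _]].getProtectionDomain.getCodeSource
    println(s"ObjectArrayTypeInfo loaded from: ${Option(source).map(_.getLocation)}")
  }
}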

Related

ClassNotFoundException while creating Spark Session

I am trying to create a Spark Session in a unit test case using the below code
val spark = SparkSession.builder.appName("local").master("local").getOrCreate()
but while running the tests, I am getting the below error:
java.lang.ClassNotFoundException: org.apache.hadoop.fs.GlobalStorageStatistics$StorageStatisticsProvider
I have tried to add the dependency but to no avail. Can someone point out the cause and the solution to this issue?
It can be because of two reasons.
1. You may have incompatible versions of the Spark and Hadoop stacks. For example, HBase 0.9 is incompatible with Spark 2.0; it will result in class/method not found exceptions.
2. You may have multiple versions of the same library because of dependency hell. You may need to run the dependency tree to make sure this is not the case.
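In addition to the dependency tree, a quick check is to print which Hadoop artifacts the tests actually see, to spot duplicates or unexpected versions. A minimal sketch, assuming the standard JVM classpath property (no custom classloaders):
object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    // List every Hadoop artifact on the classpath to spot duplicate or unexpected versions.
    sys.props("java.class.path")
      .split(java.io.File.pathSeparator)
      .filter(_.toLowerCase.contains("hadoop"))
      .foreach(println)
  }
}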

Error running Scala tests in IntelliJ

I'm having some problems getting my Scala tests running via the IntelliJ Run/Debug configuration. The tests work if I run them directly in the sbt console.
My configuration looks like this:
I'm getting this error in the Run console panel:
java.lang.IllegalArgumentException: ERROR: -r has been deprecated for a very long time and is no longer supported, to prepare for reusing it for a different purpose in the near future. Please change all uses of -r to -C.
at org.scalatest.tools.ArgsParser$.checkArgsForValidity(ArgsParser.scala:41)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:857)
at org.scalatest.tools.Runner$.run(Runner.scala:850)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:141)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:32)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
I've checked that my plugins are all up to date. Do I need to use a particular version of Scala, or is there some additional setup I'm missing?
For anyone who has this problem even with a newer version of IntelliJ, here is how I went about figuring out where the issue was. I pulled the scalatest dependency up to the top of the POM for the module I was trying to run tests in, and an individual test would then run without this error. I moved the dependency down the POM until the problem recurred, to figure out which dependency was causing it. It turned out that dependency had some really old transitive dependencies, including an older version of scalatest, that weren't showing up in my dependency tree. Also, that jar declared scalatest as a dependency without marking it with scope test.
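If you build with sbt rather than Maven, the same fix can be expressed by pinning the ScalaTest version you want and excluding the stale one explicitly. A minimal build.sbt sketch (the legacy-lib coordinates are placeholders for whichever dependency drags in the old scalatest):
libraryDependencies ++= Seq(
  // Pin the ScalaTest version you actually want, in test scope only.
  "org.scalatest" %% "scalatest" % "3.0.5" % Test,
  // Placeholder coordinates: exclude the old scalatest that the offending library drags in.
  ("com.example" %% "legacy-lib" % "1.0.0")
    .exclude("org.scalatest", "scalatest_2.11")
)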
It appears to have been a problem with the Scala plugin under IntelliJ 13.
I fixed this by upgrading to IntelliJ 2016.3, which I presume has been changed to pass the newer -C switch to ScalaTestRunner.
@JasonF pointed out in his comment below that a project dependency can also cause a problem with the scalatest plugin (this was the case for him). It's worth trying to run the tests of a fresh sample Scala project to check for this scenario before upgrading the IDE.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD

Please note that I am a better data miner than programmer.
I am trying to run the examples from the book "Advanced Analytics with Spark" by Sandy Ryza (the code examples can be downloaded from "https://github.com/sryza/aas"),
and I run into the following problem.
When I open this project in IntelliJ IDEA and try to run it, I get the error "Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD".
Does anyone know how to solve this issue?
Does this mean I am using the wrong version of Spark?
When I first tried to run this code, I got the error "Exception in thread "main" java.lang.NoClassDefFoundError: scala/product", but I solved it by setting scala-lib to compile in Maven.
I use Maven 3.3.9, Java 1.7.0_79, Scala 2.11.7, and Spark 1.6.1. I tried IntelliJ IDEA 14 and 15, and different versions of Java (1.7), Scala (2.10), and Spark, but with no success.
I am also using Windows 7.
My SPARK_HOME and Path variables are set, and I can execute spark-shell from the command line.
The examples in this book show a --master argument to spark-shell, but you will need to specify arguments appropriate to your environment. If you don't have Hadoop installed, you need to start spark-shell locally. To execute the samples, you can simply pass paths with a local file reference (file:///) rather than an HDFS reference (hdfs://).
The author suggests a hybrid development approach:
Keep the frontier of development in the REPL, and, as pieces of code
harden, move them over into a compiled library.
Hence the sample code is treated as a compiled library rather than a standalone application. You can make the compiled JAR available to spark-shell by passing it to the --jars option, while Maven is used for compiling and managing dependencies.
In the book the author describes how the simplesparkproject can be executed:
use maven to compile and package the project
cd simplesparkproject/
mvn package
start the spark-shell with the jar dependencies
spark-shell --master local[2] --driver-memory 2g --jars ../simplesparkproject-0.0.1.jar ../README.md
Then you can access your object within the spark-shell as follows:
val myApp = com.cloudera.datascience.MyApp
However, if you want to execute the sample code as a standalone application and run it within IDEA, you need to modify the pom.xml.
Some of the dependencies are required for compilation but are already available in a Spark runtime environment. Therefore these dependencies are marked with scope provided in the pom.xml:
<!--<scope>provided</scope>-->
You can remove (comment out) the provided scope; then you will be able to run the samples within IDEA. But you can no longer provide this JAR as a dependency for the spark-shell.
Note: use Maven 3.0.5 and Java 7+. I had problems with Maven 3.3.x and the plugin versions.
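For reference, once the provided scope is removed, a sample can be launched from IDEA through an ordinary main method. A minimal sketch using the Spark 1.x API the book targets (the object name and file path are just examples):
import org.apache.spark.{SparkConf, SparkContext}

object RunLocally {
  def main(args: Array[String]): Unit = {
    // Local master, so no Hadoop installation is needed.
    val conf = new SparkConf().setAppName("aas-sample").setMaster("local[2]")
    val sc = new SparkContext(conf)
    // Use a local file reference (file:///) rather than hdfs://.
    println(sc.textFile("file:///path/to/README.md").count())
    sc.stop()
  }
}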

How to fix the Sigar library when I run a spray application?

I have an sbt project written in Scala. The project uses Akka and spray. There is a class with a main function. When I run the Scala console application, I sometimes get:
[on-spray-can-akka.actor.default-dispatcher-4] [DEBUG] [2014-11-07 16:48:30,336] Sigar: no sigar-amd64-winnt.dll in java.library.path
org.hyperic.sigar.SigarException: no sigar-amd64-winnt.dll in java.library.path
I do not change anything and run it again, and it runs well. So it can run successfully or fail several times in a row. How can I fix this?
UPDATED
Also, when it starts normally, there is this message:
[INFO] [11/07/2014 17:02:36.772] [on-spray-can-akka.actor.default-dispatcher-2]
[Cluster(akka://myApp)] Cluster Node [akka.tcp://myApp#127.0.0.1:2551] - Metrics will be
retreived from MBeans, and may be incorrect on some platforms. To increase metric accuracy
add the 'sigar.jar' to the classpath and the appropriate platform-specific native libary to
'java.library.path'. Reason: java.lang.IllegalArgumentException: java.lang.UnsatisfiedLinkError:
org.hyperic.sigar.Sigar.getPid()J
Sigar is a native library for gathering performance stats, used by the Typesafe Console atmos Scala library. If you're not interested in hooking Typesafe Console up to your application, you can simply remove all references to the atmos library from your sbt build script and app config files without affecting your app's functionality.
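As a sketch of what that clean-up can look like in an sbt build (the organization and name filters below are illustrative; match whatever atmos/sigar artifacts your build actually declares):
// Drop any atmos / sigar modules from the dependency list.
libraryDependencies := libraryDependencies.value.filterNot { m =>
  m.organization.contains("atmos") || m.name.toLowerCase.contains("sigar")
}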

IntelliJ will not run ScalaTest tests - "IncompatibleClassChangeError"

I'm trying to run tests for an sbt-based Scala application in IntelliJ, but I get the following error, which I'm not sure how to fix:
Testing started at 21:07 ...
java.lang.IncompatibleClassChangeError: Found class scala.collection.mutable.ArrayOps, but interface was expected
at org.scalatest.tools.Runner$.checkArgsForValidity(Runner.scala:895)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:729)
at org.scalatest.tools.Runner$.run(Runner.scala:711)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:144)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
When I annotate my classes with the JUnit runner annotation, they work fine (and it's actually a much better run in terms of the visuals):
@RunWith(classOf[JUnitRunner])
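For completeness, a full version of that workaround looks like this, assuming scalatest and junit are both on the test classpath (the ExampleSpec suite is just an illustration):
import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner

// Running the suite through JUnit sidesteps the ScalaTest runner that IntelliJ invokes.
@RunWith(classOf[JUnitRunner])
class ExampleSpec extends FunSuite {
  test("addition works") {
    assert(1 + 1 == 2)
  }
}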
I got the same error when trying to run tests for my Play Framework app. Drilling down into the stack trace, I found that the problem class was FakeRequest, which is in the play-test library. I had two different versions of the library, one for Play 2.4 and one for 2.3. I was able to resolve this issue by removing the play-test version for Play 2.3 (open Module Settings -> Libraries -> find and delete the bad dependency).
Your issue is probably with some other problematic dependency, but following the same steps as above may help fix it.
This seems to be a problem with the ScalaTest runner framework. I came across the same problem and, as you suggested, eventually ended up using the JUnit test runner to make it work. But the problem in my case was that it was pulling in a transitive dependency, which caused a no-such-class error.
Make sure the libraries you are using for the JUnitRunner are the same. Most of the time, an "IncompatibleClassChangeError" occurs because of a binary/backward-compatibility break between library versions. Also have a look at the Scala library JAR used at compile time and at run time.
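As a quick check, you can print the Scala library version the running JVM actually sees and compare it with the version your build compiles against; a minimal sketch:
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Reports the scala-library version on the runtime classpath, e.g. "version 2.11.7".
    println(scala.util.Properties.versionString)
  }
}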