Scala package throws java.lang.UnsupportedClassVersionError - scala

Our Java application has dependencies on Spark, which is written in Scala. The build tool is Maven, and I am running from within Eclipse. The JDK_HOME used to compile the application on the command line using Maven, and the JRE used to run within Eclipse, are both 1.7.0_15.
The Maven POM contains the following:
<plugin>
  <groupId>org.scala-tools</groupId>
  <artifactId>maven-scala-plugin</artifactId>
  ...
  <configuration>
    <scalaVersion>1.10.5</scalaVersion>
    <args>
      <arg>-target:jvm-1.7</arg>
    </args>
  </configuration>
</plugin>
I understand that Spark is built using Scala 2.10.
The Maven dependencies include the following:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.3.1</version>
</dependency>
<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch-hadoop</artifactId>
  <version>2.1.0.Beta4</version>
</dependency>
<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch-spark_2.10</artifactId>
  <version>2.1.0.Beta4</version>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-xml</artifactId>
  <version>2.11.0-M4</version>
</dependency>
<dependency>
  <groupId>org.scala-lang.modules</groupId>
  <artifactId>scala-parser-combinators_2.12.0-M2</artifactId>
  <version>1.0.4</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.3.0</version>
</dependency>
At runtime, the following exception is thrown:
Exception in thread "main" java.lang.UnsupportedClassVersionError: scala/util/parsing/combinator/PackratParsers : Unsupported major.minor version 52.0
I cannot find a 2.10.* version of the scala-parser-combinators jar.
Can anyone assist with the solution?
Thanks!

The scala-parser-combinators_2.12.0-M2 module is built for Scala 2.12.0-M2, a milestone of the Scala 2.12 distribution.
2.12 targets Java 8 (bytecode major version 52), hence the error.
Your best bet is to either use an older Spark distribution or switch to Java 8 (Java 7 has been at End-of-Life since April 2015).
EDIT (addressing the question edit): you cannot find a 2.10.* version of the scala-parser-combinators library because the parser combinators were only split out of the standard library into a stand-alone module in Scala 2.11; in 2.10 they still shipped inside scala-library itself. You can attempt to simply exclude this dependency in your POM, but there's no guarantee your chosen Spark version will be compatible with this older library version.
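If you must stay on Java 7, here is a sketch of one direction to try, under the assumption that you align every Scala artifact on the 2.11 line (you would also need _2.11 builds of the Elasticsearch and Spark SQL artifacts, which may or may not exist for your chosen versions):

<dependency>
  <groupId>org.scala-lang.modules</groupId>
  <!-- the _2.11 build of the same library targets pre-Java-8 bytecode,
       unlike the _2.12.0-M2 build that caused the error -->
  <artifactId>scala-parser-combinators_2.11</artifactId>
  <version>1.0.4</version>
</dependency>

The general rule: every artifact whose name carries a _2.x suffix must carry the same suffix, and that suffix must match the Scala version your build compiles with.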

Related

MyBatis-Guice fails to initialize with Java 17

I am getting the following exception when starting MyBatis with Java 17:
java.lang.NoSuchMethodError: 'void org.mybatis.guice.AbstractMyBatisModule.bindInterceptor(com.google.inject.matcher.Matcher, com.google.inject.matcher.Matcher, org.aopalliance.intercept.MethodInterceptor[])'
Maven dependencies:
<dependency>
  <groupId>org.mybatis</groupId>
  <artifactId>mybatis</artifactId>
  <version>3.5.11</version>
</dependency>
<dependency>
  <groupId>org.mybatis</groupId>
  <artifactId>mybatis-guice</artifactId>
  <version>3.18</version>
</dependency>
<dependency>
  <groupId>com.google.inject</groupId>
  <artifactId>guice</artifactId>
  <version>5.1.0</version>
</dependency>
I tried downgrading to mybatis-guice version 3.12, but it did not help.
This is similar to a case where code works in IntelliJ but does not work on a standalone server. The issue there was a third-party dependency that was pulling in the no_aop build of Guice 4.0.1; once it was excluded, everything worked fine.
Verify that there are no conflicting Guice dependencies using
mvn dependency:tree
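For illustration, a sketch of such an exclusion; the third-party coordinates here are hypothetical placeholders, so substitute the real ones reported by dependency:tree:

<dependency>
  <!-- hypothetical third-party artifact that drags in the old no_aop Guice -->
  <groupId>com.example</groupId>
  <artifactId>third-party-lib</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.inject</groupId>
      <artifactId>guice</artifactId>
    </exclusion>
  </exclusions>
</dependency>

With the stray 4.0.1 no_aop artifact excluded, the explicitly declared guice 5.1.0 is the only Guice left on the classpath.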

I can't create a Spark Maven-Scala project with IntelliJ 2022.2.2 on Ubuntu

I'm following a course on Udemy about Scala and Maven with Spark.
Our instructor writes the code in IntelliJ. The problem is that my instructor uses IntelliJ 2018 and Spark 2.2, and these versions are quite old. First, I tried the old IntelliJ version and it worked, but then I decided to use the current Spark and IntelliJ, because we are in 2022.
So I downloaded the IntelliJ 2022 version, but I couldn't open the Scala-Maven project in IntelliJ. (I also downloaded the current Spark version.)
I watched some YouTube videos about creating a Scala-Maven project in IntelliJ, and the problem is that when they create a new project, they do not choose the archetypes shown in this picture, https://i.stack.imgur.com/smRhs.png, but instead create their own archetype.
IntelliJ gives me an error when I try this method and tells me the desired archetype does not exist.
So why am I facing this error? And how can I create a Spark-Scala project with Maven on IntelliJ 2022?
I selected the archetypes in IntelliJ, but it's not working, because I need to add some dependencies to my pom.xml, and when I add my dependencies to pom.xml, I get a lot of errors.
I have been dealing with this problem for 5 days and I couldn't open a Scala-Maven project with Spark.
I also read a lot of articles about this, and those methods didn't work either.
You can check the dependencies I want to add to pom.xml below:
<!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.13.8</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.13</artifactId>
  <version>3.3.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.13</artifactId>
  <version>3.3.0</version>
  <scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-mllib_2.13</artifactId>
  <version>3.3.0</version>
  <scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.scalatest/scalatest -->
<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_2.13</artifactId>
  <version>3.3.0-SNAP3</version>
  <scope>test</scope>
</dependency>
`src/main/main.scala`
`src/main/test.scala`
<plugins>
  <plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>4.7.1</version>
  </plugin>
</plugins>
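For reference, a minimal sketch of how this plugin is usually bound into the build so that Maven actually compiles the Scala sources; the executions block is an assumption based on the plugin's typical usage, not something from the course:

<build>
  <plugins>
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>4.7.1</version>
      <executions>
        <execution>
          <goals>
            <!-- compile Scala sources under src/main/scala and src/test/scala -->
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

Declaring the plugin without binding its goals means the Scala sources are never compiled, which is a common source of build errors in Scala-Maven projects.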

Maven package error using geospark library

Currently, I am working on a geospatial analytics use case, using Spark 2.4.0 along with the GeoSpark library. When I try to create the application JAR file using Eclipse, it gives me the error below. Could you please help me resolve this Maven dependency error?
Maven File:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>2.4.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.3.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.12</artifactId>
  <version>2.4.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.datasyslab/geospark -->
<dependency>
  <groupId>org.datasyslab</groupId>
  <artifactId>geospark</artifactId>
  <version>1.2.0</version>
  <scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.datasyslab/geospark-sql -->
<dependency>
  <groupId>org.datasyslab</groupId>
  <artifactId>geospark-sql_2.3</artifactId>
  <version>1.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.datasyslab/geospark-viz -->
<dependency>
  <groupId>org.datasyslab</groupId>
  <artifactId>geospark-viz_2.3</artifactId>
  <version>1.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kudu/kudu-client -->
<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-client</artifactId>
  <version>1.6.0</version>
  <scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kudu/kudu-spark2 -->
<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-spark2_2.11</artifactId>
  <version>1.6.0</version>
  <scope>test</scope>
</dependency>
Error:
geospark-sql_2.3-1.2.0.jar of GFence build path is cross-compiled with an incompatible version of Scala (2.3.0). In case this report is mistaken, this check can be disabled in the compiler preference page.
Description Resource Path Location Type
geospark-viz_2.3-1.2.0.jar of GFence build path is cross-compiled with an incompatible version of Scala (2.3.0). In case this report is mistaken, this check can be disabled in the compiler preference page.
kudu-spark2_2.11-1.6.0.jar of GFence build path is cross-compiled with an incompatible version of Scala (2.11.0). In case this report is mistaken, this check can be disabled in the compiler preference page.
GeoSpark version = 1.2.0
Apache Spark version = 2.4.0
JRE version = 1.8.0
API type = Scala
Thanks,
Sumit
Go to Window -> Preferences.
Expand the Scala preferences and select Compiler.
In the Compiler preferences tab, go to Build manager and uncheck the withVersionClasspathValidator check box.
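Disabling the validator only silences the warning; the underlying mix is worth fixing, since the POM combines _2.12 Spark artifacts with a scala-library that is not from the 2.12 line, plus a _2.11-suffixed kudu-spark2. A sketch of one alignment, assuming Spark 2.4.0 on Scala 2.12 (the exact 2.12.x patch version here is illustrative):

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <!-- must come from the same 2.12 line as spark-core_2.12 / spark-sql_2.12 -->
  <version>2.12.8</version>
</dependency>

Also note that in GeoSpark artifact names the _2.3 suffix tracks the Spark line rather than the Scala version, which is why Eclipse's validator misreads geospark-sql_2.3 as being cross-compiled with Scala 2.3.0.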

JDK 11 with JAXB and JAXWS works with Eclipse but not IntelliJ

I am converting an application that uses JAXB and JAX-WS from JDK 8 to JDK 11. The code runs when I use the Eclipse IDE, but exactly the same code fails with IntelliJ IDEA.
I have created a Maven project using both Eclipse and IntelliJ IDEA. The problem of finding a working combination of Maven resources has been described in another question: JDK 11 with JAXB and JAXWS problems.
The code builds without error in both environments. I have tried creating the IntelliJ IDEA project as a Maven project as well as a standard IDEA project.
Part of pom.xml:
<dependency>
  <groupId>org.openjfx</groupId>
  <artifactId>javafx-controls</artifactId>
  <version>11.0.2</version>
</dependency>
<dependency>
  <groupId>org.openjfx</groupId>
  <artifactId>javafx-fxml</artifactId>
  <version>11.0.2</version>
</dependency>
<dependency>
  <groupId>org.glassfish.jaxb</groupId>
  <artifactId>jaxb-runtime</artifactId>
  <version>2.3.0</version>
</dependency>
<!-- JAXWS for Java 11 -->
<dependency>
  <groupId>com.sun.xml.ws</groupId>
  <artifactId>rt</artifactId>
  <version>2.3.1</version>
</dependency>
module-info.java:

module org.openfx.gustfx {
    requires javafx.controls;
    requires javafx.fxml;
    requires transitive javafx.graphics;
    requires java.xml.bind;
    requires java.xml.ws;
    requires javax.jws;
    opens com.agile.ws.schema.common.v1.jaxws to javafx.fxml;
    opens org.openfx.gustfx to javafx.fxml;
    exports org.openfx.gustfx;
}
When the code is run from Eclipse, there are no errors.
Running the same code from the IntelliJ IDE results in this error:
java.lang.ClassNotFoundException: com.sun.xml.internal.ws.spi.ProviderImpl
Searching through the jar files confirms that ProviderImpl.class is now located in com.sun.ws.spi, not in com.sun.xml.internal.ws.spi. This does not cause a problem with Eclipse, but IDEA reports the ClassNotFoundException.
Therefore, my question: how does Eclipse resolve this problem while IntelliJ does not?
With help from Roman Shevchenko at IntelliJ, I have solved this problem using the following pom.xml:
<dependency>
  <groupId>com.sun.xml.ws</groupId>
  <artifactId>jaxws-rt</artifactId>
  <version>2.3.2</version>
</dependency>
<dependency>
  <groupId>javax.jws</groupId>
  <artifactId>javax.jws-api</artifactId>
  <version>1.1</version>
</dependency>
and module-info.java
requires java.xml.ws;
requires java.xml.bind;
requires javax.jws;

IntelliJ IDEA java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object

I have a following function:
def removeLast(list: List[Int]): List[Int] = list match {
case List() => List()
case List(x) => List()
case x :: xs => x :: removeLast(xs)
}
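For clarity, the expected behavior of this function (a quick sketch of calls and results):

removeLast(List(1, 2, 3))  // List(1, 2) -- drops the last element
removeLast(List(42))       // List()
removeLast(List())         // List()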
When I define it and use it from the sbt console, everything works just fine.
But when I create a worksheet in IntelliJ IDEA and try to run it, the following exception appears:
java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
at week5.A$A26$A$A26.removeLast(lists.sc8362409100671270508.tmp:30)
at #worksheet#.#worksheet#(lists.sc8362409100671270508.tmp:33)
In addition, when I change the last line to
case x :: xs => 1 :: removeLast(xs)
then it works.
What might the problem be?
I had this issue. I agree with Andrzej: IDEA uses its own compiler, so you have to disable it somehow.
Go to Settings -> Scala -> Worksheet and uncheck "Run worksheet in the compiler process".
No answer was useful in my case. Still, I found a solution that worked for me.
It was a problem with the scalatest version. In pom.xml, upgrading to

<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_2.11</artifactId>
  <version>2.2.4</version>
  <scope>test</scope>
</dependency>

helped.
So, although the above didn't solve my problem, it is related to IntelliJ.
Basically, it was preferring the Scala SDK to resolve the Class::method instead of loading it from the dependencies.
I used
-verbose:class
as a JVM switch to have it show me where it was looking; that immediately clued me in to it trying to load the class from the Scala SDK (I would expect it to pull in libs from Maven).
I literally just deleted the Scala SDK from my project settings and the problem went away. So far, my experience with Scala (and definitely in a mixed Java environment) leads me to believe it has a way to go to mature. This is such a fundamental class/method that I can't believe it vanished between versions. The Scala version I had installed was 2.11; apparently what gets pulled in from Maven is 2.10.4.
Anytime you see "NoSuchMethodError", it always means there is a version conflict; it's a question of why.
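To track down which dependency drags in the conflicting scala-library, the Maven dependency tree can be filtered to the Scala artifacts (the includes pattern below is just one way to slice it):

mvn dependency:tree -Dincludes=org.scala-lang

Every scala-library occurrence in the output should come from the same 2.x line as the Scala version your project compiles against.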
Like others said here, I was having the same problem because I had some libraries using 2.10 despite having scalatest on 2.11.
<!-- http://www.scalactic.org/ -->
<dependency>
  <groupId>org.scalactic</groupId>
  <artifactId>scalactic_2.11</artifactId>
  <version>${scalactic.version}</version>
  <scope>test</scope>
</dependency>
<!-- http://www.scalatest.org/ -->
<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_2.11</artifactId>
  <version>${scalactic.version}</version>
  <scope>test</scope>
</dependency>
Check that all the libraries you are using are on the same Scala version. For example, change
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>

to

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>
having as properties:

<properties>
  <scala.tools.version>2.11.8</scala.tools.version>
  <scala.version>2.11.8</scala.version>
  <scalactic.version>3.0.0</scalactic.version>
  <!-- Library Versions -->
  <spark.version>2.0.0</spark.version>
  ....
</properties>
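A common way to keep these suffixes consistent is to centralize the Scala binary version in one property. A sketch, where the scala.binary.version property name is my own illustrative choice rather than part of the original answer:

<properties>
  <!-- hypothetical property: the Scala binary version used in artifact suffixes -->
  <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>

Switching the Scala line then means editing one property instead of hunting through every artifactId.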
Error:
java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object
Reason:
This error is specifically due to a version mismatch between Spark and Scala. I faced the error while I was using Spark 2.2.0 and Scala 2.10.6. Then I changed to different Scala versions, but with no success.
Resolution:
The error was resolved only when I changed the Scala version to 2.11.6, which is a perfect match for Spark 2.2.0. Maybe you can try higher versions of Scala for the same issue, but I tried 2.12.x and it didn't work.
Suggestion:
Set the versions below before doing any coding:
spark - 2.2.0
scala - 2.11.6
I also used the below POM:
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
</dependencies>
I just encountered the same problem. It turned out that I had downloaded the wrong version of Akka, which included scala-library 2.10.x, while my project uses 2.11.6. Grabbing the latest version of Akka, which includes 2.11.5, solved the problem.
So it seems this is a compatibility issue; I would check the dependencies in the future.
I solved this by setting the Scala SDK version in my project from 2.12 to 2.11.
It is a version problem; just set the Scala SDK version to 2.11.
I have the same problem. When you change it to use the map function, it works! I don't know why, but that's how to fix it.
I have found that this can be caused by having differing versions of scalatest and scalamock. The following Maven dependencies fixed it:
<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_2.11</artifactId> <!-- this was previously 2.10 -->
  <version>2.2.4</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.scalamock</groupId>
  <artifactId>scalamock-scalatest-support_2.11</artifactId>
  <version>3.2</version>
  <scope>test</scope>
</dependency>
I had the same thing when adding json4s. I solved it by changing the artifactId from json4s-native_2.12 to json4s-native_2.11.
I guess this is related to the Scala version you are using; mine was 2.11 and not 2.12. (You can see yours in the properties XML node in the pom.xml file; mine is <scala.version>2.11</scala.version>.)
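As a small illustration of tying the artifact suffix to that property (the json4s version number here is illustrative, not from the answer):

<dependency>
  <groupId>org.json4s</groupId>
  <!-- the suffix resolves to 2.11 via the scala.version property mentioned above -->
  <artifactId>json4s-native_${scala.version}</artifactId>
  <version>3.5.3</version>
</dependency>

This only works if the property holds the binary version (2.11) rather than a full patch version such as 2.11.8.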