java.lang.NoSuchMethodError in Scala SBT Shell after executing JAR [duplicate] - scala

This question already has answers here:
How to run jar generated by package (possibly with other jars under lib)?
(3 answers)
Closed 2 years ago.
After packaging and executing a Scala application with SBT (version 1.3.13), I get the following error. The build.sbt declares Scala version 2.12.0, but the Scala installation on my machine is actually 2.13.3:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.wrapRefArray([Ljava/lang/Object;)Lscala/collection/mutable/WrappedArray;
at org.example.GreetWorld$.printMessage(GreetWorld.scala:5)
The source file GreetWorld.scala that caused the error looks like this:
package org.example

object GreetWorld {
  def printMessage(theMessage: String): Unit = {
    println(s"${theMessage} from me")
  }
}
The main file that invokes the object above looks like this:
package org.example

object HelloWorld {
  def main(args: Array[String]): Unit = {
    GreetWorld.printMessage("Hello")
  }
}
Does anybody know the root cause? At first I thought it had to do with the SBT shell picking Java 11, but even after changing my Windows JAVA_HOME to Java 8, I still get the same error. Compiling and running in the SBT shell works fine; only the JAR execution fails.

The error is pretty simple. When you run package, you create a JAR that contains only the classes compiled from your own source code, nothing more. Your code depends on the Scala standard library, so if you try to run it with java -jar it will fail with a classpath error.
You have 4 solutions:
Run the JAR using the scala command directly. However, you need to use the same major Scala version that was used to compile it (see the example commands after this list).
Put the Scala library JAR on the classpath when running with java. This is basically the same as above, so again you have to use the same major version.
Create an uber jar that already has the Scala stdlib (as well as any other dependency) in it, using sbt-assembly.
Create a native distributable that will set up everything for you, using sbt-native-packager.
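For illustration, options 1 and 2 look roughly like this (the JAR name, Scala version and scala-library path are only examples and depend on your project; on Windows use ; instead of : as the classpath separator):
scala -classpath target/scala-2.12/my-project_2.12-0.1.jar org.example.HelloWorld
java -classpath target/scala-2.12/my-project_2.12-0.1.jar:/path/to/scala-library-2.12.x.jar org.example.HelloWorld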
For local development and testing option 1 is usually the best one.
For simple projects option 3 is, IMHO, the simplest alternative.
And for very complex projects option 4 is very popular.
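If you go with option 3, a minimal sbt-assembly setup is roughly the following (the plugin version is just an example; check the sbt-assembly documentation for a current release). Add to project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")
Then run sbt assembly and start the resulting fat JAR under target/scala-2.12/ with a plain java -jar.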

Related

Scala sbt tests: "No configuration setting found for key 'akka'" after switching to Java 11

After switching to Java 11, sbt tests started to fail with the exception "No configuration setting found for key 'akka'". We are using the sbt-assembly plugin on the project, but since the tests are run not from the assembled JAR but via sbt <module_name>/test, it looks like there is some issue with how the test resources/paths are built.
List of things I did:
Added a concat MergeStrategy for the 'reference.conf' file (see the snippet after this list). The assembled JAR has all the files inside and there are no issues with it, so this is only a test-step issue.
Checked the classpath for tests using 'export <module_name>/test:fullClasspath' in sbt before and after migrating to Java 11. They are identical.
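For reference, the concat strategy mentioned above follows the usual sbt-assembly pattern, roughly like this (a sketch, not the project's exact build file):
assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat // concatenate all reference.conf files into one
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x) // fall back to the default strategy for everything else
}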
Everything works fine except for a test where I create an actor system.
Code that creates actor system:
import akka.actor.ActorSystem
import org.scalatest.funsuite.AsyncFunSuite // import location depends on the ScalaTest version (assumed 3.1+)
import scala.concurrent.ExecutionContext

object MyObjectTest extends AsyncFunSuite {
  private implicit val system: ActorSystem = ActorSystem("MyActorSystem")
  private implicit val ex: ExecutionContext = system.dispatcher
}
At the moment I do not have any idea what I should check next, so any suggestions would be appreciated.
P.S. Running the tests inside IDEA works fine, but we have a CI/CD job that runs the tests using the sbt test command.
I guess there are some issues when you are using sbt + Java 11. I tried to reproduce this error in a small project with the same configuration but was not able to do it. However, the issue was fixed by using the following option in sbt:
Test / fork := true
According to the sbt documentation, this setting runs all tests in a separate, forked JVM. Apparently the configuration is picked up correctly when the tests run in that separate JVM.
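In build.sbt this looks roughly as follows (the javaOptions line is optional and the value shown is only an illustration):
Test / fork := true // run all tests in a separate, forked JVM
Test / javaOptions += "-Xmx2g" // optional: JVM options for the forked test JVM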
Hope this answer will help someone in the future.

Packaged jar from scala code not executing properly [duplicate]

This question already has answers here:
running scala apps with java -jar
(5 answers)
Closed 8 months ago.
I'm trying to figure out how I can execute Scala code as an executable JAR file.
So I've loaded a Hello World script which just has this code:
object Main extends App {
  println("Hello, World!")
}
This is located within src\main\scala\Main.scala.
Now I'm running sbt package to create a JAR file.
However, I'm not able to run this file; it just won't do anything.
Going into the directory and running it via java -jar HelloWorld.jar does not work either, and I just get the error:
Error: Unable to initialize main class Main
Caused by: java.lang.NoClassDefFoundError: scala/Function0
However, I have no clue why this happens or how to fix it.
Based on the error:
Caused by: java.lang.NoClassDefFoundError: scala/Function0
I think you do not have Scala runtime libraries on the classpath.
You either need to provide a complete classpath with all required libraries, or use the sbt-assembly plugin to create a fat JAR.
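A sketch of the first approach (the JAR name, Scala version and scala-library path are illustrative; on Windows use ; instead of : as the classpath separator):
java -cp target/scala-2.13/helloworld_2.13-0.1.jar:/path/to/scala-library-2.13.x.jar Main
Alternatively, adding the sbt-assembly plugin and running sbt assembly produces a single fat JAR that can be started directly with java -jar.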

Cannot sync scala project in IntelliJ IDEA Community 2019.2.4

Scala newbie here: I am attempting to get started with Scala using Windows 10 (Pro 10.0.18362 Build 18362) with a Hyper-V Quick Create of Ubuntu (18.04.3 LTS). I installed the JRE and JDK (11.0.4), installed IntelliJ IDEA 2019.2.4 and added the Scala plugin (2019.2.37). I have left the SBT Executor 1.2.1 plugin disabled for now: it was enabled earlier but it does not seem to affect the results. I tried to create the HelloWorld application (see below). I added the Scala framework to the project and, after encountering the error below, the Maven framework (adding Maven did not help). After correcting the error in build.sbt it looks like this:
import com.sun.tools.javac.resources.version
name := "HelloWorld"
version := "0.1"
scalaVersion := "2.13.1"
I create a Scala worksheet by right-clicking on the scala folder and selecting Scala Worksheet:
object Hello extends App {
  println("Hello, World!")
}
I get a pop-up saying the Maven project needs to be imported. This succeeds quickly. I get a second one saying the sbt project needs to be imported. This fails:
sbt.librarymanagement.ResolveException: Error downloading org.scala-sbt:zinc-compile-core_2.12:1.3.1
and this error accompanies it:
not found: https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.scala-sbt/util-position_2.12/1.3.2/ivys/ivy.xml
The first ResolveException appears to be about a part of Maven, but that should be present. This error message is followed by many other "not found" errors, but I assume they all stem from the above errors or are related. I cannot seem to find a solution: most of the information and examples on IntelliJ IDEA with Scala on the web are several years (editions) out of date.
It appears Maven apps are meant to be deployed to Apache Spark. That is not my intention, but in the Maven panel I can successfully clean, validate and compile.
When I Run the Hello.sc app, I get this:
/snap/intellij-idea-community/185/jbr/bin/java -javaagent:/snap/intellij-idea-community/185/lib/idea_rt.jar=37033:/snap/intellij-idea-community/185/bin -Dfile.encoding=UTF-8 -classpath /home/perfwise/ideaProjects/HelloWorld/target/classes:/home/perfwise/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.13.1.jar:/home/perfwise/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.13.1.jar:/home/perfwise/.ivy2/cache/org.scala-lang/scala-library/srcs/scala-library-2.13.1-sources.jar:/home/perfwise/.ivy2/cache/org.scala-lang/scala-reflect/srcs/scala-reflect-2.13.1-sources.jar Hello
Error: Could not find or load main class Hello
Caused by: java.lang.ClassNotFoundException: Hello
Process finished with exit code 1
I would have expected this to work. Any pointers will be much appreciated.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD

Please note that I am a better data miner than a programmer.
I am trying to run the examples from the book "Advanced Analytics with Spark" by Sandy Ryza (the code examples can be downloaded from "https://github.com/sryza/aas"),
and I run into the following problem.
When I open this project in IntelliJ IDEA and try to run it, I get the error "Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD".
Does anyone know how to solve this issue?
Does this mean I am using the wrong version of Spark?
When I first tried to run this code, I got the error "Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product", but I solved it by setting scala-lib to compile in Maven.
I use Maven 3.3.9, Java 1.7.0_79, Scala 2.11.7 and Spark 1.6.1. I tried both IntelliJ IDEA 14 and 15, and different versions of Java (1.7), Scala (2.10) and Spark, but with no success.
I am also using Windows 7.
My SPARK_HOME and Path variables are set, and I can execute spark-shell from the command line.
The examples in this book show a --master argument to spark-shell, but you will need to specify arguments as appropriate for your environment. If you don't have Hadoop installed, you need to start spark-shell locally. To execute the samples you can simply pass paths as local file references (file:///) rather than HDFS references (hdfs://).
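For example, inside a local spark-shell (the path is only an illustration):
val rawData = sc.textFile("file:///home/user/data/some-input.csv") // local file instead of hdfs://...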
The author suggests a hybrid development approach:
Keep the frontier of development in the REPL, and, as pieces of code
harden, move them over into a compiled library.
Hence the sample code is considered a compiled library rather than a standalone application. You can make the compiled JAR available to spark-shell by passing it to the --jars property, while Maven is used for compiling and managing dependencies.
In the book the author describes how the simplesparkproject can be executed:
use maven to compile and package the project
cd simplesparkproject/
mvn package
start the spark-shell with the jar dependencies
spark-shell --master local[2] --driver-memory 2g --jars ../simplesparkproject-0.0.1.jar ../README.md
Then you can access your object within the spark-shell as follows:
val myApp = com.cloudera.datascience.MyApp
However, if you want to execute the sample code as a standalone application and run it from within IDEA, you need to modify the pom.xml.
Some of the dependencies are required for compilation but are already available in a Spark runtime environment. Therefore these dependencies are marked with scope provided in the pom.xml.
<!--<scope>provided</scope>-->
If you comment out the provided scope (as shown above), you will be able to run the samples within IDEA, but then you can no longer provide this JAR as a dependency for the spark-shell.
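For illustration, a dependency with the provided scope in pom.xml looks roughly like this (the exact artifacts and versions depend on the book project's pom; removing or commenting out the scope line lets the sample run inside IDEA):
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.6.1</version>
  <scope>provided</scope>
</dependency>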
Note: I am using Maven 3.0.5 and Java 7+. I had problems with the plugin versions when using Maven 3.3.x.

opencv 3.0.0 java imread_0 undefined

I am trying to develop an application using the Java bindings of OpenCV 3.0.0-beta with Scala.
I am getting a runtime error:
java.lang.UnsatisfiedLinkError: java.lang.UnsatisfiedLinkError: org.opencv.imgcodecs.Imgcodecs.imread_1(Ljava/lang/String;)J
While researching the cause I have created the following simple application that exhibits similar behaviour:
import reflect._
import org.opencv.core.Core
import org.opencv.core.Mat
import org.opencv.core.CvType
import org.opencv.imgcodecs.Imgcodecs

object main extends Application {
  System.loadLibrary(Core.NATIVE_LIBRARY_NAME)
  val what = "something.png"
  val mat = Imgcodecs.imread(what)
  Imgcodecs.imwrite("something_else.png", mat)
}
The major difference is that, when run with "sbt run", it performs as expected; when the corresponding lines from the code above are run in the REPL, the code fails.
I suspect that this issue is related to the original issue, but have no proof.
If I look at the memory map of the JVM, in both cases the expected libs are loaded.
If the code is inspected, I find no definition of org.opencv.imgcodecs.Imgcodecs.imread_1.
I am quite lost as to where to go next in diagnosing this issue.
Is there anyone who has come across this issue?
Thanks
I haven't used OpenCV 3.0 yet, as it has major changes and breaks OpenCV 2.4.x code. Are you supplying the library path when you run
sbt run
? If not, add
javaOptions in run += "-Djava.library.path=lib/opencv/"
to your build.sbt file, or pass it on the command line:
sbt run -Djava.library.path=lib/opencv/
The opencv folder should contain the native library files that get generated along with your JAR file.
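A small build.sbt sketch of that approach (the lib/opencv/ path is just an example; note that javaOptions only take effect when the run task is forked):
fork in run := true // javaOptions are only applied to a forked JVM
javaOptions in run += "-Djava.library.path=lib/opencv/" // directory containing the native OpenCV library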
I have Java bindings for 2.4.9, 2.4.10 and 3.0.0 for Java 7 and 8 in this git repo if you need them:
git#gitlab.com:opencv/java_lib.git