I'm trying to figure out how I can execute Scala code as an executable JAR file.
So I've loaded a Hello World script which just has this code:
object Main extends App {
  println("Hello, World!")
}
This is located within src\main\scala\Main.scala.
Now I'm running sbt package to create a JAR file.
However, I'm not able to run this file; it just won't do anything.
Going into the directory and running it via java -jar HelloWorld.jar does not work either, and I'm just getting the error:
Error: Unable to initialize main class Main
Caused by: java.lang.NoClassDefFoundError: scala/Function0
However, I have no clue why this happens or how to fix it.
Based on the error:
Caused by: java.lang.NoClassDefFoundError: scala/Function0
I think you do not have the Scala runtime library on the classpath.
You either need to provide a complete classpath with all required libraries, or use the sbt-assembly plugin to create a fat JAR.
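For example, a minimal sketch of the classpath route, assuming a Windows shell (the question's paths suggest one) and a hypothetical location for scala-library.jar; the library's major version must match the one you compiled with:

java -cp "HelloWorld.jar;C:\path\to\scala-library.jar" Main

This uses the Windows separator ; (use : on Linux/macOS). Note that java ignores -cp when you pass -jar, which is why the main class (Main) is named explicitly here.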
After packaging and executing a Scala application (build.sbt with scalaVersion 2.12.0, but in fact having 2.13.3 installed) with sbt (version 1.3.13), I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.wrapRefArray([Ljava/lang/Object;)Lscala/collection/mutable/WrappedArray;
at org.example.GreetWorld$.printMessage(GreetWorld.scala:5)
The source file GreetWorld.scala that caused the error looks like this:
package org.example

object GreetWorld {
  def printMessage(theMessage: String): Unit = {
    println(s"${theMessage} from me")
  }
}
The main object that invokes it looks like this:
package org.example

object HelloWorld {
  def main(args: Array[String]) = {
    GreetWorld.printMessage("Hello")
  }
}
Does anybody know the root cause? At first I thought it had to do with the sbt shell picking Java 11, but even after changing my Windows JAVA_HOME to Java 8, I still get the same error. Compiling and running it in the sbt shell works fine. Only the JAR execution fails.
The error is pretty simple. When you run package you create a JAR which contains only the classes corresponding to your source code, nothing more. And your code depends on the Scala stdlib, so if you try to run it with java -jar it will fail with a classpath error.
You have 4 solutions:
Run the JAR using scala directly. However, you need to use the same major Scala version that was used to compile it.
Put the Scala library JAR on the classpath when running. This is basically the same as above; again, you have to use the same major version.
Create an uber jar that already has the Scala stdlib (as well as any other dependency) in it, using sbt-assembly.
Create a native distributable that will set up everything for you, using sbt-native-packager.
For local development and testing, option 1 is usually the best one.
For simple projects, option 3 is, IMHO, the simplest alternative; a minimal setup is sketched below.
And for very complex projects, option 4 is very popular.
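For option 3, a minimal sbt-assembly sketch (the plugin version below is an assumption; check the sbt-assembly releases for the current one):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

Then build and run the fat JAR:

sbt assembly
java -jar target/scala-2.12/<name>-assembly-<version>.jar

The assembly JAR bundles the Scala stdlib and any other non-provided dependencies, so plain java -jar works without further classpath setup.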
I am trying to do a basic Scala HelloWorld in Eclipse 2019 and I am getting an error.
The following is my code and the error it produces. Can someone please help me address this error in Eclipse? Thanks.
package hello

object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
  }
}
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Predef$
    at HelloWorld/hello.HelloWorld$.main(HelloWorld.scala:5)
    at HelloWorld/hello.HelloWorld.main(HelloWorld.scala)
Caused by: java.lang.ClassNotFoundException: scala.Predef$
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    ... 2 more
The Scala library is already in the build path.
There are two places where your classpath needs to be set:
Build/compile-time classpath: (i) right-click on your project, (ii) Build Path > Configure Build Path, (iii) Add Library (or JAR), (iv) select the Scala Library. This one you already have, as supported by your screenshot.
Run-time classpath: this needs to be set explicitly in the run configuration to also include the Scala library: (i) Run Configurations..., (ii) Classpath, (iii) Add JAR and select the scala-library JAR. For this option, I have not tested whether User vs Bootstrap entries matter. Furthermore, I was unable to use Add Library here; only Add JAR results in a functioning run within Eclipse.
The second one is the likely cause of the error you are getting.
You need to add the Scala library to your classpath.
From Eclipse:
Right-click on your project
Configure Buildpath
Add Library
Select the Scala Library
Please note that I am a better data miner than programmer.
I am trying to run the examples from the book "Advanced Analytics with Spark" by Sandy Ryza (the code examples can be downloaded from https://github.com/sryza/aas),
and I run into the following problem.
When I open this project in IntelliJ IDEA and try to run it, I get the error "Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD".
Does anyone know how to solve this issue?
Does this mean I am using the wrong version of Spark?
When I first tried to run this code, I got the error "Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product", but I solved it by setting the scala-library scope to compile in Maven.
I use Maven 3.3.9, Java 1.7.0_79, Scala 2.11.7 and Spark 1.6.1. I tried IntelliJ IDEA 14 and 15 and different versions of Java (1.7), Scala (2.10) and Spark, but with no success.
I am also using Windows 7.
My SPARK_HOME and Path variables are set, and I can execute spark-shell from the command line.
The examples in this book show a --master argument to spark-shell, but you will need to specify arguments as appropriate for your environment. If you don't have Hadoop installed, you need to start spark-shell locally. To execute the samples you can simply pass local file references (file:///) rather than HDFS references (hdfs://).
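As a quick sketch of that (the local path is a made-up example):

spark-shell --master local[2]

and then, inside the shell:

val readme = sc.textFile("file:///home/user/aas/README.md")
readme.count()

Here sc is the SparkContext that spark-shell creates for you.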
The author suggests a hybrid development approach:
Keep the frontier of development in the REPL, and, as pieces of code
harden, move them over into a compiled library.
Hence the sample code is considered a compiled library rather than a standalone application. You can make the compiled JAR available to spark-shell by passing it to the --jars option, while Maven is used for compiling and managing dependencies.
In the book the author describes how the simplesparkproject can be executed:
Use Maven to compile and package the project:
cd simplesparkproject/
mvn package
Start spark-shell with the JAR dependency:
spark-shell --master local[2] --driver-memory 2g --jars ../simplesparkproject-0.0.1.jar ../README.md
Then you can access your object within spark-shell as follows:
val myApp = com.cloudera.datascience.MyApp
However, if you want to execute the sample code as a standalone application and run it within IDEA, you need to modify the pom.xml.
Some of the dependencies are required for compilation but are available in a Spark runtime environment. Therefore these dependencies are marked with the provided scope in the pom.xml:
<!--<scope>provided</scope>-->
If you comment out the provided scope like this, you will be able to run the samples within IDEA. But then you can no longer provide that JAR as a dependency to spark-shell.
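For context, a sketch of what such a dependency entry looks like (the artifact and version are assumptions based on the versions mentioned in the question):

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.6.1</version>
  <!--<scope>provided</scope>-->
</dependency>

With the scope commented out, Spark ends up on the runtime classpath, so the samples run inside IDEA; with it active, Spark is expected to be supplied by spark-shell or the cluster at runtime.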
Note: I used Maven 3.0.5 and Java 7+. I had problems with the Maven 3.3.x versions and the plugin versions.
My requirement is to run an Eclipse Java project from the command prompt. I am using the following program structure:
project name: parser
package name: xml
class name: Sample (with the main method)
I tried exporting the JAR file by right-clicking the project name --> Export --> JAR file --> gave the destination path for the JAR file --> checked Generate Manifest file and selected the main program (Sample.java).
Now in the command prompt I tried to run the JAR file using:
java -jar myjar.jar
And the error message I get is:
Exception in thread "main" java.lang.UnsupportedClassVersionError: XML/Sample: Unsupported major.minor version 51.0
Could not find the main class: XML.Sample. Program will exit.
I have been searching for a solution for a long time but failed to find one. I would be very thankful if someone could help me with this problem. Kindly excuse any mistakes in my question, as I am new to Java coding. Thanks in advance.
I have a Scala .class file that I convert to a JAR and try to register in a Pig script. Pig is able to find that class now, BUT it throws a ClassNotFoundException for scala.ScalaObject.
I notice that there is a scala.ScalaObject.class entry in the scala-library jar in the littlepiggy/lib folder.
Question 1
Shouldn't this JAR be directly accessible anyway? Or do I have to add this path to an equivalent of a CLASSPATH for Pig?
Question 2
After this, I forcefully registered that jar as well.
I got this error:
java.lang.NoSuchMethodError: scala.collection.JavaConversions$.asScalaIterator(Ljava/util/Iterator;)Lscala/collection/Iterator
This doesn't look right to me. Any ideas?
PS: This source suggests that I should include the scala-library JAR, but Pig should already be able to find it, and anyway, it's not really working for me.
(http://mehack.com/levenshtein-distance-function-for-pig-and-had-0)
The answer is here; I should have searched with a more exhaustive set of keywords:
NoSuchMethodError when attempting to implicitly convert a java to scala collection
I was using different Scala versions to build the class file and to run it in Pig.
Thanks!
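For anyone hitting the same error, a minimal sketch of the Pig-side registration (the paths and file names are hypothetical; the key point is that the scala-library version must be the same major version the class file was compiled with):

REGISTER /path/to/scala-library-2.10.4.jar;
REGISTER /path/to/my-scala-udf.jar;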