How to compile Kotlin against a Java jar from the command line? - command-line

I want to include a Java jar in a Kotlin build. I tried the commands below, but I got an error:
javac -encoding utf-8 javasorce/test/JavaTestClass.java
jar cvf javasorce/test/JavaTestClass.jar javasorce/test/JavaTestClass.class
kotlinc kotlin/CallJavaTestClass.kt -cp javasorce/test/JavaTestClass.jar -include-runtime -d kotlin/CallJavaTestClass.jar
java -jar kotlin/CallJavaTestClass.jar
The error is:
Exception in thread "main" java.lang.NoClassDefFoundError: javasorce/test/JavaTestClass
at CallJavaTestClassKt.main(CallJavaTestClass.kt:5)
Caused by: java.lang.ClassNotFoundException: javasorce.test.JavaTestClass
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
My directory layout is as follows:
root
|-javasorce
| |-test
| | |-JavaTestClass.java
|-kotlin
| |-CallJavaTestClass.kt
Please tell me if there is a solution.

In addition to compiling the source code with the Java library on the classpath, you need to run the program with the same library on the classpath: if a class is there at compile time, you need it on the classpath at run time as well to be able to use it.
The correct way of running an application which has its classes scattered across several JARs is to pass those JARs as the classpath to java and to additionally specify the class that has the main function:
java -cp kotlin/CallJavaTestClass.jar:javasorce/test/JavaTestClass.jar CallJavaTestClassKt
(Note that -cp must come before the main class name; anything after the class name is passed to the program as arguments.)
The command above assumes that you placed the main function at the top level of CallJavaTestClass.kt (in this case, the class name is formed from the file name with .kt replaced by Kt), and that it has no package ... declaration. If you have a package, prepend it to the class name, as in com.example.FileNameKt. If you declare main in an object or a companion object, use the class name or the object name (without Kt) instead of CallJavaTestClassKt.
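Putting it all together, the full sequence for the layout above would look roughly like this (the compile steps are unchanged from the question; only the run command differs, and it assumes JavaTestClass.java declares package javasorce.test, so that the jar entry path matches the package):
javac -encoding utf-8 javasorce/test/JavaTestClass.java
jar cvf javasorce/test/JavaTestClass.jar javasorce/test/JavaTestClass.class
kotlinc kotlin/CallJavaTestClass.kt -cp javasorce/test/JavaTestClass.jar -include-runtime -d kotlin/CallJavaTestClass.jar
java -cp kotlin/CallJavaTestClass.jar:javasorce/test/JavaTestClass.jar CallJavaTestClassKt
On Windows, use ; instead of : as the classpath separator.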
See also: How to run Kotlin class from the command line?

Related

Running scala code using java -jar <jarfile>

I am trying to run Scala code using java -jar <jarfile> and I am getting the issue below.
ERROR:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataOutputStream
at com.cargill.finance.cdp.blackline.Ingest.main(Ingest.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataOutputStream
The same code is running fine with spark-submit.
I am trying to write data to an HDFS file.
I have imported the classes below:
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem
import org.apache.hadoop.fs.Path
import org.apache.hadoop.fs.FSDataOutputStream
You need to add all dependencies (including transitive dependencies, i.e. dependencies of dependencies) to the -cp argument. If you just look at the direct dependencies of hadoop-core, you'll see why you should never do this manually. Instead, use a build system. If you followed e.g. https://spark.apache.org/docs/latest/quick-start.html, it actually sets up SBT, so you can do sbt run to run the main class the way java -cp <lots of libraries> <main class> would. If you didn't, add a build.sbt as described there.
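For reference, a minimal build.sbt in the spirit of that quick-start guide might look like the sketch below; the version numbers are illustrative, not taken from the question, so match them to your cluster:
name := "ingest"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
  // "provided" because spark-submit supplies Spark at runtime;
  // drop the qualifier if you want plain `sbt run` to work standalone
  "org.apache.spark" %% "spark-core" % "2.4.8" % "provided",
  // pulls in org.apache.hadoop.fs.FSDataOutputStream and its transitive dependencies
  "org.apache.hadoop" % "hadoop-client" % "2.7.7"
)
With that in place, sbt assembles the full classpath for you instead of you enumerating jars by hand.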

ClassNotFoundException for Spark job on Yarn-cluster mode

So I am trying to run a Spark job on Yarn-cluster mode kicked off via Oozie workflow, but have been encountering the following error (relevant stacktrace below)
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:388)
at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:296)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:179)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1917)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1896)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1896)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
...
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:414)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:323)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:294)
... 28 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 33 more
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2317)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:688)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
... 38 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
... 42 more
Some background information:
The job runs on Spark 1.4.1 (I specified the correct spark.yarn.jar field in the spark.conf file).
oozie.libpath is set to the hdfs directory in which the jar of my program resides.
org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory, the class not found, exists in phoenix-4.5.1-HBase-1.0-client.jar. I've specified this jar in spark.driver.extraClassPath and spark.executor.extraClassPath in my spark.conf file. I've also added the phoenix-core dependency in my pom file, so that the class exists in my shaded project jar as well.
Observations so far:
Adding an extra field, spark.driver.userClassPathFirst, to my spark.conf file and setting it to true gets rid of the ClassNotFoundException. However, it also prevents me from initializing a Spark context (null pointer exception). From googling around it seems that this field messes up classpaths, so it may not be the way to go, since I cannot even initialize a Spark context this way.
I noticed that in the oozie stdout log, I do not see the classpath of the phoenix jar. So maybe for some reason spark.driver.extraClassPath and spark.executor.extraClassPath aren't actually picking up the jar as an extraClassPath? I do know that I'm specifying the correct jar file path, as other jobs have spark.conf files with the same parameters.
I found a hacky way to make the phoenix jar show up in the classpath (in the oozie stdout log) by copying the jar to the same directory as where my program jar resides. This works whether or not spark.executor.extraClassPath is changed to point to the new jar location. However, the ClassNotFoundException persists, even though I clearly see the ClientRpcControllerFactory class when I unzip the jar.
Other things I've tried:
I tried using the sparkConf.setJars() and sparkContext.addJar() methods, but still encountered the same error
I added the jar in the spark.driver.extraClassPath field in my job properties file, but it didn't seem to help (the Spark docs indicate that this field is necessary when running in client mode, so it may not be relevant for my case)
Any help/ideas/suggestions would be greatly appreciated.
I use CDH 5.5.1 + Phoenix 4.5.2 (both installed with parcels) and faced the same problem. I think the problem disappeared after I switched to client mode. I can't verify this because I am getting a different error in cluster mode now.
I tried to trace the Phoenix source code and found some interesting things. I hope a Java/Scala expert can identify the root cause.
The PhoenixDriver class was loaded. This shows the jar was found initially. After layers of class loader / context switches (?), the jar was lost from the classpath.
If I Class.forName() a non-existent class in my program, the stack does not include sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331). It looks like:
java.lang.ClassNotFoundException: NONEXISTINGCLASS
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
I copied Phoenix code into my program for testing. I still get the ClassNotFoundException if I call ConnectionQueryServicesImpl.init (ConnectionQueryServicesImpl.java:1896). However, a call to ConnectionQueryServicesImpl.openConnection (ConnectionQueryServicesImpl.java:296) returned a usable HBase connection. So it seems PhoenixContextExecutor was causing the loss of the jar, but I don't know how.
Source code of Cloudera Phoenix 4.5.2 : https://github.com/cloudera-labs/phoenix/blob/phoenix1-4.5.2_1.2.0/phoenix-core/src/main/java/org/apache/
(Not sure whether I should post a comment... but I have no reputation anyway)
So I managed to fix my issue and get my job to run. My solution is very hacky, but I will post it here in case it helps others in the future.
Basically, the problem as I understand it was that the org.apache.hadoop.hbase.util.ReflectionUtils class, which is responsible for finding the ClientRpcControllerFactory class, was being loaded from some cloudera directory in the cluster instead of from my own jar. When I set spark.driver.userClassPathFirst to true, it prioritized loading the ReflectionUtils class from my jar, and so was able to locate the ClientRpcControllerFactory class. But that messed up some other classpaths and kept giving me a NullPointerException when I tried to initialize a SparkContext, so I looked for another solution.
I tried to figure out if it was possible to exclude all default cdh jars from being included in my classpath, but found that the value in spark.yarn.jar was pulling in all these cdh jars, and I definitely needed to specify that jar.
So the solution was to include all classes under org.apache.hadoop.hbase from the Phoenix jar into the spark-assembly jar (the jar that spark.yarn.jar pointed to), which got rid of the original exception and did not give me an NPE when trying to initialize a SparkContext. I found that now the ReflectionUtils class was being loaded from the spark-assembly jar, and since the ClientRpcControllerFactory was also included in that jar, it was able to find it. After this, I encountered a few more ClassNotFoundExceptions for Phoenix classes, so I put those classes into the spark-assembly jar as well.
Finally, I had a java.lang.RuntimeException: hbase-default.xml File Seems to be for and old Version of HBase problem. I found that my application jar contained such a file, but changing hbase.defaults.for.version.skip to true didn't do anything. So I included another hbase-default.xml file in the spark-assembly jar with the skip flag set to true, and it finally worked.
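For what it's worth, that kind of jar surgery can be sketched with the standard jar tool; the paths below are illustrative, not from the original setup:
mkdir tmp && cd tmp
jar xf /path/to/phoenix-4.5.1-HBase-1.0-client.jar org/apache/hadoop/hbase
jar uf /path/to/spark-assembly.jar org/apache/hadoop/hbase
# extract hbase-default.xml the same way, set hbase.defaults.for.version.skip
# to true in it, then fold it back in:
jar uf /path/to/spark-assembly.jar hbase-default.xml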
Some observations:
I noticed that my spark-assembly jar was completely missing an org.apache.hadoop.hbase directory. A coworker told me that usually I should expect to find an hbase directory in my spark-assembly jar, so maybe I was working with a bad spark-assembly jar. Edit: I checked a spark-assembly jar that I newly downloaded (v1.5.2) and it doesn't have it, so maybe the org.apache.hadoop.hbase package is simply not included in it.
ClassNotFoundExceptions and classloader problems are hard to debug.

Running tests in IntelliJ ClassNotFoundException

I tried many different run configs, but whatever I do, I get this exception when running specs2 tests in IntelliJ for Scala.
It always fails to find a class whose name ends with a $ sign. I checked - and there really is no such class file. There's AppControllerIT.class and lots of classes like AppControllerIT$innerFunctionOrClass.class, but no AppControllerIT$.class.
Any ideas?
Thanks!
com.haha.market.api.e2e.controllers.AppControllerIT$
java.lang.ClassNotFoundException: com.haha.market.api.e2e.controllers.AppControllerIT$
STACKTRACE
java.net.URLClassLoader.findClass(URLClassLoader.java:381)
java.lang.ClassLoader.loadClass(ClassLoader.java:424)
sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
java.lang.ClassLoader.loadClass(ClassLoader.java:357)
org.specs2.reflect.Classes$$anonfun$loadClassEither$1.apply(Classes.scala:140)
org.specs2.reflect.Classes$$anonfun$loadClassEither$1.apply(Classes.scala:140)
org.specs2.control.ActionT$$anonfun$safe$1.apply(ActionT.scala:89)
org.specs2.control.ActionT$$anonfun$reader$1$$anonfun$apply$6.apply(ActionT.scala:80)
org.specs2.control.Status$.safe(Status.scala:100)
Classes with $ signs at the end are generated from compiled Scala objects. This means you may have an object defined similar to this:
package com.haha.market.api.e2e.controllers
object AppControllerIT {
}
From your error, it seems that an older compiled artifact or a library (?) is polluting your classpath. First, try cleaning the project (mvn clean or sbt clean). Next, try to clean any libraries you have in your project inside IntelliJ. IntelliJ sometimes caches multiple versions of the same libraries, which may cause confusion at runtime. To clean those up, go to "File -> Project Structure" in IntelliJ and manually delete any duplicated libraries you may have.
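As a quick sanity check, compiling even an empty object produces both class files, so a missing AppControllerIT$.class really does point at a stale or incomplete build (a throwaway example, run in any scratch directory):
$ echo 'object AppControllerIT' > AppControllerIT.scala
$ scalac AppControllerIT.scala
$ ls *.class
AppControllerIT$.class  AppControllerIT.class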

run Scala code in package (LinkedIn Norbert)

I am trying to run a Scala object called NorbertClusterClientMain which is in package com.linkedin.norbert.cluster. The source code for it is in folder examples/src/main/scala of rhavyn's open-source branch of LinkedIn Norbert, and I am working on a Linux command line.
Although I've been told that running Scala code in a package is like running Java in a package, I am in examples/src/main/scala but cannot use this command:
$ scala com.linkedin.norbert.cluster.NorbertClusterClientMain
I am getting "No such file or class on classpath", even though the file exists.
I was successfully able to compile Norbert with
$ mvn clean -DskipTests install
How can I run the NorbertClusterClientMain? Please let me know. I appreciate your help.
It is the same. So, in this case, it is looking for this file:
./com/linkedin/norbert/cluster/NorbertClusterClientMain.class
This is how Java works, and since "running" a Scala program is just running java with the Scala library on the classpath, it has to be the same.
How did you compile it, by the way? Never mind, I saw your comment. From the directory where you ran mvn, you should probably be able to run it like this:
scala -cp target com.linkedin.norbert.cluster.NorbertClusterClientMain
Failing that, find the class file, and pass the directory where com/ is to the classpath.
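If you're not sure where the class file landed, something like this will locate it; the directory containing the top-level com/ is what belongs on the classpath:
$ find . -name 'NorbertClusterClientMain*.class'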
Your mvn build produced JAR and class files within the target directory:
./target/com/.../<someClassName1>.class
./target/com/.../<someClassName2>.class
... etc
./target/<someJarName1>.jar
./target/<someJarName2>.jar
... etc
Great! Now do the same thing that you must do for java; include in your classpath:
the target base directory (this "picks up" all class files in directory hierarchy beneath target)
each jar file (this "picks up" all class files in directory hierarchy within each JAR)
scala -cp target:target/<someJarName1>.jar:target/<someJarName2>.jar:... com.linkedin.norbert.cluster.NorbertClusterClientMain
Here -cp (or equivalently, the CLASSPATH environment variable) is the Java classpath, so it has the same syntax and rules as for java.
BTW: "sbt" is a standard, powerful, usable way to build scala projects. It uses Ivy to "pull" dependencies from code repositories (i.e. mvn++). The best way to get started with it is to download sbt example projects, search for "sbt tutorial" blogs and read the sbt docs. :)
I don't know anything about the code base, but that class is in the examples subproject.
This shows that it loads normally. (I haven't configured anything, because I don't know anything about the code base.)
apm@mara:~/clones/norbert$ cd examples
/home/apm/clones/norbert/examples
apm@mara:~/clones/norbert/examples$ ls
pom.xml src target
apm@mara:~/clones/norbert/examples$ mvn exec:java -Dexec.mainClass=com.linkedin.norbert.cluster.NorbertClusterClientMain
[INFO] Scanning for projects...
<snip...>
[INFO] --- exec-maven-plugin:1.2.1:java (default-cli) @ norbert-examples ---
[WARNING]
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:297)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
at com.linkedin.norbert.cluster.NorbertClusterClientMain$.main(NorbertClusterClientMain.scala:22)
at com.linkedin.norbert.cluster.NorbertClusterClientMain.main(NorbertClusterClientMain.scala)
... 6 more
Here's how to add args:
https://stackoverflow.com/a/9846103/1296806
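In short, with the exec plugin, program arguments go in -Dexec.args (the argument values here are placeholders):
mvn exec:java -Dexec.mainClass=com.linkedin.norbert.cluster.NorbertClusterClientMain -Dexec.args="arg0 arg1"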
I don't use maven much anymore.
Edit: I don't use Scala 2.7.7 much anymore either.
Try adding -cp . :
$ scala -cp . com.linkedin.norbert.cluster.NorbertClusterClientMain
Your "current directory" might not be on the classpath.
Norbert uses Scala 2.7, so using Scala directly from the CLI may not work for you. Therefore, find all the dependency jars using Maven and use them.
This is how I did it.
First, check out the code:
$ git clone https://github.com/rhavyn/norbert
$ cd norbert/
Build and install the dependencies into the local repository first:
$ mvn clean install
Set up a classpath variable, which we will use later, from the examples/ folder:
$ cd examples/
$ export CP=$(mvn dependency:build-classpath | grep -A1 'Dependencies classpath:' | tail -1)
Run server from examples/ folder:
$ java -cp $CP:target/classes com.linkedin.norbert.network.javaapi.NorbertJavaNetworkServerMain arg0 arg1
log4j:ERROR Could not find value for key log4j.appender.R
log4j:ERROR Could not instantiate appender named "R".
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 2
at com.linkedin.norbert.network.javaapi.NorbertJavaNetworkServerMain.main(NorbertJavaNetworkServerMain.java:33)
Run client from examples/ folder:
$ java -cp $CP:target/classes com.linkedin.norbert.cluster.NorbertClusterClientMain localhost 1011
log4j:ERROR Could not find value for key log4j.appender.R
log4j:ERROR Could not instantiate appender named "R".
> h2013-12-20 13:59:44,323 - WARN [pool-1-thread-2-SendThread(0.0.3.243:2181):ClientCnxn$SendThread@1120] - Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.SocketException: Invalid argument
at sun.nio.ch.Net.connect0(Native Method)
at sun.nio.ch.Net.connect(Net.java:364)
at sun.nio.ch.Net.connect(Net.java:356)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:623)
at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1009)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1036)
2013-12-20 13:59:46,192 - WARN [pool-1-thread-2-SendThread(0.0.3.243:2181):ClientCnxn$SendThread@1120] - Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.SocketException: Invalid argument
at sun.nio.ch.Net.connect0(Native Method)
at sun.nio.ch.Net.connect(Net.java:364)
at sun.nio.ch.Net.connect(Net.java:356)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:623)
at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1009)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1036)
That's it. Now all you have to do is set up the required services, like ZooKeeper etc.

Problem creating an executable jar from scala file

I'm trying to export my project as a jar with IntelliJ 9.0. My project compiles and runs with no problem in IntelliJ, but when I build the .jar and run it, it shows an error.
My Main class is something like:
package Main
//Imports
object Main {
  def main(args: Array[String]) {
    println("Main: Hello, world!")
    //do stuff
  }
}
Now, in the artifacts window I created a .jar with the following:
Main Class: Main.Main
Class Path: lib/javacsv lib/scala-compiler.jar lib/scala-library.jar lib/scalatest-1.0-test.jar lib/scalatest-1.0.jar lib/tools.jar lib/jtds-1.2.2.jar lib/flex-messaging-common.jar lib/flex-messaging-core.jar lib/spring.jar lib/mysql-connector-java-5.1.7-bin.jar lib/ojdbc14.jar lib/commons-logging.jar lib/postgresql-8.4-701.jdbc3.jar lib/log4j-1.2.15.jar lib/poi-3.6-20091214.jar lib/poi-ooxml-3.6-20091214.jar lib/dom4j-1.6.1.jar lib/poi-ooxml-schemas-3.6-20091214.jar lib/geronimo-stax-api_1.0_spec-1.0.jar lib/xmlbeans-2.3.0.jar lib/rt.jar lib/ifxjdbc.jar lib/db2jcc4.jar
I have double-checked that all those jars are in the project and are the only ones in it. Notice that it includes lib/scala-compiler.jar and lib/scala-library.jar.
I build the project, run java -jar myScalaApp.jar, and I get:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/ScalaObject
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:675)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
at java.net.URLClassLoader.access$000(URLClassLoader.java:56)
at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:316)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:288)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:374)
at Main.Main.main(Main.scala)
Caused by: java.lang.ClassNotFoundException: scala.ScalaObject
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:316)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:288)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:374)
... 13 more
I understand that it's not finding the Scala classes, but I made sure they are there. What else could the problem be, and how can I fix it?
I'd say one of two things has gone wrong:
scala-library.jar is not in your jar; or
the class path at runtime doesn't include scala-library.jar.
I don't know how IntelliJ builds your jar: Does it unpack all your library jars and mung them together with your code into one big jar, or does it add the library jars as-is to your big jar and manipulate the classpath to get at the jars-within-jar?
The first point is easy enough to check: Either use jar -tvf yourJar.jar to list out the contents of your jar, or use an archive viewer to look into it graphically. Note that a .jar is basically a .zip, so you can rename the extension and then use a tool that can look into .zip files.
Just had an idea about the second part: can you build a simple Java main class that prints out System.getProperty("java.class.path")? A Java class should be able to run in that jar even if a Scala class isn't.
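Something like this throwaway class would do (the class name is made up); being plain Java, it runs without scala-library.jar:
// PrintCp.java - hypothetical diagnostic class, not part of the project
public class PrintCp {
    public static void main(String[] args) {
        // prints the classpath the JVM was actually started with
        System.out.println(System.getProperty("java.class.path"));
    }
}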
There are at least three ways to do this. (One applies to NetBeans but should work in IDEA too.)
Use jarjar as explained here
Use the idea on this page (not so elegant but the easiest) - essentially unpack the Scala directory from scala-library.jar and add this directory to your jar.
Use the idea on this page (for NB) - essentially add "scala-library.jar" to your project libraries.
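The second option, sketched with the jar tool (the application jar name is taken from the question above; the paths are illustrative):
mkdir tmp && cd tmp
jar xf /path/to/scala-library.jar scala
jar uf /path/to/myScalaApp.jar scala
The first command unpacks the scala/ class directory from the Scala runtime; the second folds it into your application jar so java -jar can resolve classes like scala.ScalaObject.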