How to run a Scala .jar with external jar files in the terminal - scala

I have a .jar built from Scala classes, and it has an external dependency on other.jar. Please suggest how I should run my jar files in the terminal. The command I tried is
$ scala my_scala.jar external.jar

It works the same way as running a Java program. Try this:
scala -classpath <your_scala_jar>:<external_jar> <package.MainClass>
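For example, with a hypothetical jar my_scala.jar whose main class is com.example.Main (names assumed, not given in the question):
$ scala -classpath my_scala.jar:external.jar com.example.Main
On Windows, use ; instead of : as the classpath separator.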

Related

create project jar in scala

I have a self-contained application in SBT. My data is stored on HDFS (the Hadoop file system). How can I get a jar file to run my work on another machine?
The directory of my project is the following:
/MyProject
  /target
    /scala-2.11
      /MyApp_2.11-1.0.jar
  /src
    /main
      /scala
If you don't have any dependencies, then running sbt package will create a jar with all your code.
You can then run your Spark app as:
$SPARK_HOME/bin/spark-submit --name "an-app" my-app.jar
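If the jar's manifest doesn't name a main class, spark-submit can be told explicitly with its --class flag (the class name here is hypothetical):
$SPARK_HOME/bin/spark-submit --class com.example.MyApp --name "an-app" my-app.jar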
If your project has external dependencies (other than Spark itself; if it's just Spark or any of its dependencies, the above approach still works), then you have two options:
1) Use the sbt assembly plugin to create an uber jar containing your entire classpath. Running sbt assembly will create another jar which you can use in the same way as before; a minimal plugin setup is sketched after this list.
2) If you only have a few simple dependencies (say, just joda-time), you can simply include them in your spark-submit invocation:
$SPARK_HOME/bin/spark-submit --name "an-app" --packages "joda-time:joda-time:2.9.6" my-app.jar
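For option 1, a minimal sketch of the assembly setup (the plugin version is an assumption and may need updating): add to project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
and, to keep Spark itself out of the uber jar, mark it provided in build.sbt (Spark version hypothetical):
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
Then sbt assembly writes the uber jar under target/scala-2.11/.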
Unlike Java, in Scala the file's package name doesn't have to match the directory name. In fact, for simple tests like this,
you can place the file in the root directory of your SBT project if you prefer.
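A minimal illustration (file and package names are hypothetical): this file can sit directly in the SBT project root even though its package doesn't correspond to any directory.

// Hello.scala, placed in the project root
package com.example.demo

object Hello extends App {
  println("the package name doesn't match any directory")
}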
From the root directory of the project, you can compile the project:
$ sbt compile
Run the project:
$ sbt run
Package the project:
$ sbt package
Here is a link with more detail:
http://alvinalexander.com/scala/sbt-how-to-compile-run-package-scala-project

running sbt class using only a jar

Is there any way to run sbt commands with only a jar instead of a project?
I've been having issues using scopt with java or scala commands, and it only seems to work with sbt.
Ideally, something like:
sbt --jar <jar name> "run-main <options>"
You'd probably want to package everything up into something you can execute. One possibility would be to create a fat jar using something like sbt-assembly.
Once you've built your jar, you can then:
java -jar /path/to.jar --your-options
Take note that at this point you can only do the equivalent of sbt run-main with the jar; you cannot, of course, invoke any of the other sbt commands on the created jar.
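For java -jar to find an entry point, the jar's manifest needs a Main-Class entry; with sbt-assembly this can be set in build.sbt (a sketch in sbt 0.13-style syntax; the class name is hypothetical):
mainClass in assembly := Some("com.example.Main")
After sbt assembly, the resulting jar (by default target/scala-<version>/<name>-assembly-<version>.jar) can be run with java -jar as above.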

Compile single scala file with TypeSafe Activator

I have Activator installed, which means I have a full SBT on my system. I don't want to create a brand-new Activator project. All I want to do is compile a single Scala file, as we used to do with the scalac command. How can I do this, please? Thanks.
Go into the directory containing your Scala file and type "sbt compile" on the command line.
To run the program, type "sbt run".
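As a sketch, a directory containing nothing but one Scala file is already a valid SBT project (contents hypothetical):

// Hello.scala -- the only file in the directory, no build.sbt required
object Hello extends App {
  println("compiled and run by sbt without any project setup")
}

sbt compile and sbt run then behave as described above.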
See also:
http://www.scala-sbt.org/0.13/tutorial/Hello.html

Running Spark sbt project without sbt?

I have a Spark project which I can run from sbt console. However, when I try to run it from the command line, I get Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkContext. This is expected, because the Spark libs are listed as provided in the build.sbt.
How do I configure things so that I can run the JAR from the command line, without having to use sbt console?
To run Spark stand-alone you need to build a Spark assembly.
Run sbt/sbt assembly in the Spark root dir. This will create: assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
Then build your job jar with its dependencies (either with sbt assembly or the maven-shade-plugin).
You can use the resulting binaries to run your Spark job from the command line:
ADD_JARS=job-jar-with-dependencies.jar SPARK_LOCAL_IP=<IP> java -cp spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar:job-jar-with-dependencies.jar com.example.jobs.SparkJob
Note: If you need a different HDFS version, you have to follow additional steps before building the assembly. See About Hadoop Versions.
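For context, a minimal sketch of what the com.example.jobs.SparkJob entry point above might look like (the job logic is hypothetical; Spark 1.x-era API):

package com.example.jobs

import org.apache.spark.{SparkConf, SparkContext}

object SparkJob {
  def main(args: Array[String]): Unit = {
    // master and other settings come from the launcher/environment
    val conf = new SparkConf().setAppName("SparkJob")
    val sc = new SparkContext(conf)
    // ... actual job logic goes here ...
    sc.stop()
  }
}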
Using the sbt assembly plugin, we can create a single jar. After doing that, you can simply run it with the java -jar command.
For more details, refer to the sbt-assembly documentation.

Run junit4 test from cmd

I tried to run a JUnit 4 test case from the command line using:
java -cp junit-4.8.1.jar;test\Dijkstra;test\Dijkstra\bin org.junit.runner.JUnitCore Data0PathTest00
but I got the following error:
java.lang.NoClassDefFoundError: graph/shortestgraphpath;
while the test case works without any problems in Eclipse.
Hint: in Eclipse, shortestgraphpath was added under Referenced Libraries.
You need to add the jar file containing shortestgraphpath to the Java classpath.
java -cp junit-4.8.1.jar;test\Dijkstra;test\Dijkstra\bin org.junit.runner.JUnitCore Data0PathTest00
The classpath is the value that you pass to java with -cp, so in your question you only supply junit and your compiled classes.
Try updating it to include the jar file with the missing class.
java -cp junit-4.8.1.jar;<path to jar file>;test\Dijkstra;test\Dijkstra\bin org.junit.runner.JUnitCore Data0PathTest00
You might have to add additional jar files as well. I recommend that you take a look at a build tool to help you build and run your Java applications, for example Maven, Gradle, or Buildr.
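For reference, a minimal JUnit 4 test class (written in Scala to match the rest of this page; the names and assertion are hypothetical) that JUnitCore can run the same way:

import org.junit.Test
import org.junit.Assert.assertEquals

class ExampleTest {
  // trivial assertion, just to have something runnable
  @Test def additionWorks(): Unit =
    assertEquals(4, 2 + 2)
}

Running it needs junit and scala-library on the classpath, e.g. java -cp junit-4.8.1.jar;scala-library.jar;. org.junit.runner.JUnitCore ExampleTest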