scalac - create a jar file from a Scala file and its dependency - scala

Using scalac only (not sbt), can I create a runnable jar file from a Scala file "script.scala" and a dependent library "x.jar"?
This article https://alvinalexander.com/scala/how-create-executable-jar-file-scalac-run-scala/ says to create the jar with:
scalac script.scala -d script.jar
How do I include my dependent library x.jar here?
When I do the following, I do not see x.jar (or its classes) when I extract script.jar:
scalac -classpath "x.jar" script.scala -d script.jar
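The -classpath option only puts x.jar on the compile-time classpath; scalac does not copy its classes into script.jar. A rough sketch of one workaround (the same unzip-and-repackage idea the last answer on this page uses for the Scala library; "Main" is a placeholder for whatever main object script.scala defines):
# sketch only: compile, unpack x.jar next to the compiled classes, then jar everything together
mkdir -p classes
scalac -classpath x.jar -d classes script.scala
( cd classes && unzip -o ../x.jar )
jar cvfe script.jar Main -C classes .
# note: scala-library.jar still has to be on the runtime classpath or bundled the same way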

Related

How to run jar generated by sbt package via sbt

I have a jar that was created by running sbt package
I've already set the main class for the jar in build.sbt:
mainClass in (Compile, packageBin) := Some("com.company.mysql.Main")
addCommandAlias("updatemysql", "runMain com.company.mysql.Main")
I've tried
sbt "runMain target/scala-2.12/update-mysql_2.12-0.1-SNAPSHOT.jar"
sbt target/scala-2.12/update-mysql_2.12-0.1-SNAPSHOT.jar com.company.mysql.Main
sbt target/scala-2.12/update-mysql_2.12-0.1-SNAPSHOT.jar:com.company.mysql.Main
sbt update-mysql-assembly-0.1-SNAPSHOT.jar/run
sbt run update-mysql-assembly-0.1-SNAPSHOT.jar
^ this gives "No main class detected" even though the main class is set in build.sbt, as shown a few lines above.
I need to run the jar through sbt because it's the only way I know to override a value in the config file contained in the jar using -Dpath.to.config.param=new_value.
In sbt, run and runMain use a classpath containing all the dependencies as well as the folders with the outputs of compilation tasks, which means that neither of them takes a JAR as an argument.
I think it would be possible to run this particular JAR from sbt by writing a custom task that depends on the output of the package task (that is, the JAR file path) and runs it as an external process (see the sketch below)... though from the question it seems that this is not the actual problem.
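A rough sketch of what such a custom task could look like (runPackagedJar is a made-up name, not something sbt provides):
// build.sbt (sketch only)
lazy val runPackagedJar = taskKey[Unit]("Run the packaged JAR as an external process")
runPackagedJar := {
  import scala.sys.process._
  val jar = (packageBin in Compile).value  // the JAR path produced by the package task
  Seq("java", "-jar", jar.getAbsolutePath).!  // run it outside sbt's own classpath
}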
The actual problem is running the JAR with flags passed to the JVM instead of to the program itself, which can be achieved with something like:
# clean assembly ensures that there is only 1 JAR in target
# update-mysql_2.12-*.jar picks the only JAR no matter what its version is
# -D arguments NEED to be passed before -jar so they go to the JVM and not to the JAR
sbt clean assembly && \
java -Dpath.to.config.param=new_value -jar target/scala-2.12/update-mysql_2.12-*.jar

Where to put dependencies for scala

I'm getting this compile error:
owner#PC ~/scala/fxml: scalac x.scala
x.scala:1: error: object asynchttpclient is not a member of package org
import org.asynchttpclient.*;
^
one error found
I figured I needed to download the .java files for org.asynchttpclient.* so I copied those to c:\classes and set CLASS_PATH to c:\classes but that didn't work.
Note: I know about sbt and maven but I just want to get scalac working.
The error is a missing dependency for x.scala. You need to download the asynchttpclient jar if you don't have it, then use the following command to include it in the compilation.
scalac -classpath "asynchttpclient.jar:other dependent jars" x.scala
It's CLASSPATH, not CLASS_PATH. You can also use -classpath ... as an option to scalac.
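One Windows-specific note (the question's c:\classes path suggests Windows): the classpath separator there is ; rather than :, so the command above would become, for example:
scalac -classpath "asynchttpclient.jar;other dependent jars" x.scala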

create project jar in scala

I have a self-contained application in SBT. My data is stored on HDFS (the Hadoop file system). How can I get a jar file to run my work on another machine?
The directory of my project is the following:
/MyProject
/target
/scala-2.11
/MyApp_2.11-1.0.jar
/src
/main
/scala
If you don't have any dependencies, then running sbt package will create a jar with all your code.
You can then run your Spark app as:
$SPARK_HOME/bin/spark-submit --name "an-app" my-app.jar
If your project has external dependencies (other than Spark itself; if it's just Spark or any of its dependencies, then the above approach still works), then you have two options:
1) Use the sbt-assembly plugin to create an uber jar with your entire classpath (see the plugins.sbt sketch after these options). Running sbt assembly will create another jar which you can use in the same way as before.
2) If you only have a few simple dependencies (say just joda-time), then you can simply include them in your spark-submit invocation:
$SPARK_HOME/bin/spark-submit --name "an-app" --packages "joda-time:joda-time:2.9.6" my-app.jar
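For option 1, a minimal sketch of wiring in sbt-assembly (the plugin version shown is only illustrative):
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")
Running sbt assembly then produces a fat jar (by default named something like target/scala-2.11/MyApp-assembly-1.0.jar) that you can pass to spark-submit as above.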
Unlike Java, in Scala, the file’s package name doesn’t have to match the directory name. In fact, for simple tests like this,
you can place this file in the root directory of your SBT project, if you prefer.
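For example, a throwaway file saved directly in the project's root directory compiles fine even though its (made-up) package matches no directory:
// Hello.scala, placed at the root of the sbt project
package com.example.demo

object Hello {
  def main(args: Array[String]): Unit =
    println("Hello from a root-level file")
}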
From the root directory of the project, you can compile the project:
$ sbt compile
Run the project:
$ sbt run
Package the project:
$ sbt package
Here is a link for more details:
http://alvinalexander.com/scala/sbt-how-to-compile-run-package-scala-project

Run junit4 test from cmd

I tried to run a JUnit 4 test case from the command line using:
java -cp junit-4.8.1.jar;test\Dijkstra;test\Dijkstra\bin org.junit.runner.JUnitCore Data0PathTest00
but I got the following error:
java.lang.NoClassDefFoundError: graph/shortestgraphpath;
while the test case is working without any problems in eclipse.
Hint: in eclipse, shortestgraphpath was added in Referenced Libraries.
You need to add the jar file containing shortestgraphpath to the java classpath:
java -cp junit-4.8.1.jar;<path to shortestgraphpath jar>;test\Dijkstra\bin org.junit.runner.JUnitCore Data0PathTest00
The classpath is the value that you pass to java with -cp, so in your question you only supply JUnit and your compiled classes.
Try updating it to include the jar file with the missing class:
java -cp junit-4.8.1.jar;<path to jar file>;test\Dijkstra;test\Dijkstra\bin org.junit.runner.JUnitCore Data0PathTest00
You might have to add additional jar files as well. I recommend that you take a look at a build tool to help you build and run your Java applications, for example Maven, Gradle, or Buildr.

How can I package a simple single Scala file as a stand-alone jar (no sbt)?

I have a very simple Scala file which I need to send to someone to execute on a machine without Scala (only Java). It's just one file, with one dependency on a jar (other than scala itself).
I'm struggling to package it properly. I'm not using sbt or anything. What's the simplest way to package it up?
Assuming you have a file Test.scala
object Test {
  def main(args: Array[String]): Unit = {
    println("Hello world!")
  }
}
compile it with scalac
scalac Test.scala
Create file Manifest.txt with the following content:
Manifest-Version: 1.0
Created-By: 1.6.0_31 (Sun Microsystems Inc.)
Main-Class: Test
Copy scala library from your distribution into the current folder and unzip it:
unzip scala-library.jar
Execute the command:
jar cvfm Hello.jar Manifest.txt *.class library.properties scala/
Send it to your addressee. They will have to execute:
java -jar Hello.jar