Scala -cp doesn't work

I'm trying to run a file called test.scala which refers to a library called moarcoref-assembly-1.jar. I'm very sure that the jar file contains the class edu.berkeley.nlp.coref.NumberGenderComputer, but scala keeps complaining about the import. It doesn't even seem to find the edu package.
$ cat test.scala
import java.io._
import scala.collection.mutable.ListBuffer
import scala.collection.mutable.ArrayBuffer
import scala.io.Source
import edu.berkeley.nlp.coref.NumberGenderComputer
println("Hello, world!")
This is the error:
$ scala -cp "moarcoref-assembly-1.jar:." test.scala
/data/EvEn/nn_coref/modifiedBCS/test.scala:5: error: not found: value edu
import edu.berkeley.nlp.coref.NumberGenderComputer
^
one error found
I'm using version 2.11.6 and can't install a newer one.
$ scala -version
Scala code runner version 2.11.6 -- Copyright 2002-2013, LAMP/EPFL
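One way to narrow this down is to check at runtime whether the class can be resolved from the jar at all. The sketch below is a diagnostic aid, not part of the original question; the object name CheckClass is made up, and the default class name is just copied from the import above:

```scala
// Diagnostic sketch: check whether a class resolves on the current classpath.
// Run e.g.: scala -cp "moarcoref-assembly-1.jar:." CheckClass
object CheckClass {
  def main(args: Array[String]): Unit = {
    val name = args.headOption.getOrElse("edu.berkeley.nlp.coref.NumberGenderComputer")
    try {
      Class.forName(name) // throws ClassNotFoundException if not on the classpath
      println(s"found: $name")
    } catch {
      case _: ClassNotFoundException => println(s"not found: $name")
    }
  }
}
```

If this reports "not found" even with the jar on -cp, the jar may have been built against a different Scala version or may not actually contain that class under that package name.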

Related

How to compile a scala program without any builder?

I am trying to compile this simple Scala program to run it on Spark:
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]) {
    if (args.length < 1) {
      System.err.println("Usage: HDFStEST <file>")
      System.exit(1)
    }
    val spark = SparkSession.builder.appName("TesteHelena").getOrCreate()
    println("hellooo")
    spark.stop()
  }
}
I don't know how to make scalac find the dependency org.apache.spark.sql.SparkSession.
I tried to tell it where the jar files are with the following command:
scalac main.scala -cp C:\Spark\spark-2.4.0-bin-hadoop2.7\jars -d main.jar
which returns the error:
main.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.SparkSession
and if I pass every jar file with the command:
scalac main.scala -cp C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\* org.apache.spark.sql.SparkSession -d main.jar
it returns the error:
error: IO error while decoding C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\aircompressor-0.10.jar with UTF-8
for every jar file.
The command:
scalac main.scala -cp org.apache.spark.sql.SparkSession -d main.jar
returns:
main.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.SparkSession
So, is there a way to compile a program against the Spark dependency with scalac? I cannot use a build tool such as sbt or Gradle, because my terminal has no internet access (a security restriction at my job), and those tools fetch dependencies from their repositories.
I solved my issue with the command:
scalac -cp C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\ -extdirs C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\ main.scala -d main1.jar
I added scalac's "-extdirs" option, which overrides the location of installed extensions, and it worked.
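An alternative to -extdirs is to build the -cp string yourself from every jar in the directory. The sketch below is illustrative (the object name BuildClasspath is made up, and the directory is passed in rather than hard-coded); it joins the jar paths with the platform's own separator, so it works on both Windows and Linux:

```scala
import java.io.File

// Sketch: assemble a classpath string from all jars in a directory,
// e.g. the Spark jars folder, joined with the platform separator.
object BuildClasspath {
  def classpathOf(dir: String): String = {
    val jars = Option(new File(dir).listFiles) // listFiles is null if dir doesn't exist
      .getOrElse(Array.empty[File])
      .filter(_.getName.endsWith(".jar"))
      .map(_.getPath)
    jars.mkString(File.pathSeparator) // ";" on Windows, ":" elsewhere
  }

  def main(args: Array[String]): Unit =
    println(classpathOf(args.headOption.getOrElse(".")))
}
```

The resulting string can then be handed to scalac -cp in place of a hand-written list.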

Compile scala code with "import org.apache.spark.SparkContext" without sbt

I have started writing a small Scala program.
It will be expanded to open a SparkContext and do some data processing.
For now I have only imported three Spark libraries to run a simple piece of code.
I am getting the error "error: object apache is not a member of package org".
Question: how can I compile manually using scalac, so that the compilation includes the Spark libraries, without Maven or sbt?
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
object HelloWorld {
  def main(args: Array[String]) {
    println("Hello, world!")
  }
}

Scala REPL additional Jars

I am adding jars into my scala repl like so:
scala> :cp scalaj-http_2.10-2.2.1.jar
Added '/home/XXX/scalaj-http_2.10-2.2.1.jar'. Your new classpath is:
".:/home/XXX/json4s-native_2.10-3.3.0.RC3.jar:/home/XXX/scalaj-http_2.10-2.2.1.jar"
Nothing to replay.
Now when I try and import that jar for use I get an error:
scala> import scalaj.http._
<console>:7: error: not found: value scalaj
import scalaj.http._
I've tried this on another jar:
scala> :cp json4s-native_2.10-3.3.0.RC3.jar
Added '/home/XXX/json4s-native_2.10-3.3.0.RC3.jar'. Your new classpath is:
".:/home/XXX/json4s-native_2.10-3.3.0.RC3.jar"
Nothing to replay.
scala> import org.json4s.JsonDSL._
<console>:7: error: object json4s is not a member of package org
import org.json4s.JsonDSL._
I've read multiple tutorials online that all do it this way, but my REPL does not seem to behave in the same manner.
I am using Scala 2.10.
Double-check your path. If it still isn't working, you can try adding the jars when you start the REPL (that has always worked for me, even with v2.10):
scala -cp /home/XXX/json4s-native_2.10-3.3.0.RC3.jar:/home/XXX/scalaj-http_2.10-2.2.1.jar
Note that the delimiter between jars is ; on Windows and : otherwise.
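You don't have to memorize which separator your platform uses: the JVM exposes it directly. A minimal sketch (the object name SepCheck is made up):

```scala
// Sketch: print the platform's classpath separator as reported by the JVM.
object SepCheck {
  def main(args: Array[String]): Unit =
    println(java.io.File.pathSeparator) // ";" on Windows, ":" elsewhere
}
```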

Scala class not found on classpath when specifying jar in classpath

I am attempting to use ScalaCheck. Below is my HelloWorld.scala, which imports from ScalaCheck and uses the Gen.const method.
package com
import org.scalacheck._
import org.scalacheck.Gen._
import org.scalacheck.Arbitrary.arbitrary
sealed abstract class Tree
case class Node(left: Tree, right: Tree, v: Int) extends Tree
case object Leaf extends Tree
object HelloWorld {
  val genLeaf = Gen.const(Leaf)
  def main(args: Array[String]) {
    println("Hello, world!")
  }
}
Compile by typing (this works):
scalac -cp scalacheck_2.11-1.11.5.jar com/HelloWorld.scala
Execute by typing (two alternatives):
scala -cp scalacheck_2.11-1.11.5.jar com.HelloWorld
scala -cp "scalacheck_2.11-1.11.5.jar;./com" com.HelloWorld
Output of scala
No such file or class on classpath: com.HelloWorld
When I remove all ScalaCheck code in HelloWorld.scala and compile without using the -cp flag everything works. When adding the ScalaCheck code and the jar to the -cp flag, I get the above error.
How do I correctly setup the classpath?
(Versions:
scalac -version
Scala compiler version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL
scala -version
Scala code runner version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL
)
OS: Linux
Which OS are you using? If not Windows, the path separator should be :; try this:
scala -cp "scalacheck_2.11-1.11.5.jar:." com.HelloWorld

How to fire up Scala interpreter with ScalaCheck in the classpath in Ubuntu 11.10?

Scala is installed and working fine.
scalacheck.jar is placed in /bin.
I used the following command
$ scala -cp scalacheck.jar
After that, when I tried the command below,
scala> import org.scalacheck.Prop.forAll
I got the following error.
<console>:7: error: object scalacheck is not a member of package org
import org.scalacheck.Properties
^
I might have made a mistake in using ScalaCheck; please correct me and give the proper commands so that I can work with ScalaCheck in interpreter mode on Ubuntu.
Putting an executable on the PATH isn't the same as putting a jar on the classpath, so placing your jar in /bin didn't change anything.
Just use:
scala -cp path_to_your.jar
and you should be fine.
If, for example, your scalacheck.jar is in /bin, then use:
scala -cp /bin/scalacheck.jar
edit:
Putting jars in /bin probably isn't the best idea.
You can use it like this:
kjozsa@walrus:~$ scala -version
Scala code runner version 2.9.2 -- Copyright 2002-2011, LAMP/EPFL
kjozsa@walrus:~$ locate scalacheck.jar
/usr/share/scala/lib/scalacheck.jar
kjozsa@walrus:~$ scala -cp /usr/share/scala/lib/scalacheck.jar
Welcome to Scala version 2.9.2 (OpenJDK 64-Bit Server VM, Java 1.7.0_03-icedtea).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.scalacheck.Prop.forAll
import org.scalacheck.Prop.forAll
scala>