Scala Play unmanaged jar added but import not working

I'm trying to add Jep to a Scala (2.11.8) Play Framework (2.5.8) project of mine.
As far as I can tell, Sbt can see the unmanaged jar:
[play-scala] $ show unmanagedClasspath
[info] List(Attributed(/home/stondo/dev/git/play-dashboard-mongo/lib/jep.cpython-35m-x86_64-linux-gnu.so), Attributed(/home/stondo/dev/git/play-dashboard-mongo/lib/libjep.so), Attributed(/home/stondo/dev/git/play-dashboard-mongo/lib/jep-3.6.0.jar))
but when I run a very simple test it fails:
[error] cannot create an instance for class IntegrationSpec
...
[error] CAUSED BY java.lang.UnsatisfiedLinkError: no jep in java.library.path
...
Let me mention that running scala -cp /path/to/myjar and then importing Jep works:
scala -cp ./lib/jep-3.6.0.jar
scala> import jep.Jep
import jep.Jep
Any ideas about what's going on?
Thanks in advance

It's not a problem of the import not working; it's a problem of failing to load the native library. Unlike Java libraries, native libraries (jep.cpython-35m-x86_64-linux-gnu.so) must be placed in a directory listed in either the PATH environment variable (LD_LIBRARY_PATH on Linux) or the "java.library.path" system property.
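In a Play/sbt project, one way to satisfy that is to point java.library.path at the lib directory from build.sbt. A minimal sketch, assuming the .so files stay in ./lib (untested against this exact project):

// build.sbt
fork := true  // javaOptions is only picked up by forked JVMs (tests included)
javaOptions += s"-Djava.library.path=${baseDirectory.value / "lib"}"

On Linux, exporting LD_LIBRARY_PATH=./lib before starting sbt should have the same effect.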

Related

Running scala code using java -jar <jarfile>

I am trying to run Scala code using java -jar <jarfile> and I am getting the issue below.
ERROR:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataOutputStream
    at com.cargill.finance.cdp.blackline.Ingest.main(Ingest.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataOutputStream
The same code is running fine with spark-submit.
I am trying to write data to an HDFS file.
I have imported the classes below:
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem
import org.apache.hadoop.fs.Path
import org.apache.hadoop.fs.FSDataOutputStream
You need to add all dependencies (including transitive dependencies, i.e. dependencies of dependencies) to the -cp argument. If you just look at the direct dependencies of hadoop-core, you'll see why you should never do this manually; use a build tool instead. If you followed e.g. https://spark.apache.org/docs/latest/quick-start.html, it actually sets up SBT, so you can do sbt run to run the main class, just like java -cp <lots of libraries> <main class> would. If you didn't, add a build.sbt as described there.
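A minimal sketch of such a build.sbt (the version numbers are assumptions; match them to your cluster):

// build.sbt
name := "ingest"
scalaVersion := "2.11.8"
// hadoop-client pulls in hadoop-common, hadoop-hdfs and their transitive dependencies
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.3"

With that in place, sbt run resolves the Hadoop classes (FSDataOutputStream included) without listing any jars by hand.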

Why can't scala find what sbt can?

With sbt everything is fine:
» sbt console
[info] Loading project definition from /repos/myrepo/project
[info] Set current project to bpavscan (in build file:/repos/myrepo/)
[info] Starting scala interpreter...
[info]
Welcome to Scala 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_131).
Type in expressions for evaluation. Or try :help.
scala> import play.api.libs.json._
import play.api.libs.json._
scala>
But if I do it with the scala tool:
» scala
Welcome to Scala version 2.11.6 (OpenJDK 64-Bit Server VM, Java 1.8.0_131).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import play.api.libs.json._
<console>:7: error: not found: value play
import play.api.libs.json._
^
scala>
I need to run a simple script, which I usually do with:
scala myscript.scala
But now that my script has a Play dependency, I cannot run it with scala anymore, since scala does not find Play.
I need to either:
Be able to load the play framework with the simple scala tool
Be able to run a simple script with sbt (sbt run runs my whole project, which I do not want; I just want to run a simple script to try out some things)
sbt console loads the same console/REPL as scala, but with the additional dependencies defined in build.sbt. Before the console starts, all the dependent libraries are put on the classpath; that is the reason you could import the Play libraries when using sbt console.
The scala tool, on the other hand, starts the console with only the libraries shipped with the Scala distribution, so any extra jars have to be supplied explicitly. In the case above, if the Play library jar had been included on the classpath of the scala tool, import play.api.libs.json._ would have worked in the plain scala console too.
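So the simplest route is to declare the dependency in a small throwaway sbt project and use sbt console as the playground. A minimal sketch (the play-json version is an assumption; pick the one matching your Play version):

// build.sbt
scalaVersion := "2.11.8"
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.5.8"

After that, sbt console starts a REPL where import play.api.libs.json._ resolves, and :load myscript.scala runs the script inside that REPL.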

Scala SBT: standalone jar

The answer Making a stand-alone jar with Simple Build Tool seems like what I need, but it did not have enough information for me, so this is a follow-up.
(1) How do I adapt the answer to my need? I don't understand what would need to be changed.
(2) What command do I run to create the standalone jar?
(3) Where can I find the jar after it has been created?
What I've tried:
Pasting the code from the linked answer verbatim into my project/build/dsg.scala file. The file now has a
class ForkRun(info: ProjectInfo) extends DefaultProject(info)
(from before, used for running projects in a separate VM from SBT) and the new:
trait AssemblyProject extends BasicScalaProject
from the linked answer.
I also tried pasting the body (all the defs and the lazy val) of the AssemblyProject into the body of ForkRun.
To create a jar I ran package at the SBT prompt and get:
[info] Packaging ./target/scala_2.8.1/dsg_2.8.1-1.0.jar ...
[info] Packaging complete.
So I tried running the dsg_2.8.1-1.0.jar from the shell via:
java -jar dsg_2.8.1-1.0.jar
But I get:
Failed to load Main-Class manifest attribute from
dsg_2.8.1-1.0.jar
Could this be caused by having multiple entry points into my project? I select from a list when I execute run from the SBT prompt. Perhaps I need to specify the default when creating the package?
Here's a writeup I did on one way to make an executable jar with SBT:
http://janxspirit.blogspot.com/2011/01/create-executable-scala-jar-with-sbt.html
sbt-assembly is an sbt plugin that creates a standalone jar of a Scala sbt project together with all of its dependencies.
Refer to this post for more details and an example.
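For current sbt versions the setup is small. A hedged sketch (the plugin version and main class name are assumptions):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt -- declaring the entry point also answers the multiple-entry-points
// question above: assembly writes this class into the manifest's Main-Class
mainClass in assembly := Some("com.example.Main")

Running sbt assembly then writes the standalone jar under target/scala-<version>/, and java -jar <that jar> starts the declared main class.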

Scala: trying to get log4j working

Scala newb here (it's my 2nd day of using it). I want to get log4j logging working in my Scala script. The script and the results are below; any ideas as to what's going wrong?
[sean#ibmp2 pybackup]$ cat backup.scala
import org.apache.log4j._
val log = LogFactory.getLog()
log.info("started backup")
[sean#ibmp2 pybackup]$ scala -cp log4j-1.2.16.jar:. backup.scala
/home/sean/projects/personal/pybackup/backup.scala:1: error: value apache is not a member of package org
import org.apache.log4j._
^
one error found
I reproduced it under Windows: the delimiter for -classpath must be ';' there (not ':'). Are you using Cygwin or some sort of Unix emulator?
That said, a Scala script works anywhere without the current directory on the classpath. Try:
$ scala -cp log4j-1.2.16.jar backup.scala
JFI: LogFactory is a class from Apache Commons Logging, not from log4j (log4j's own entry point is Logger.getLogger).
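For reference, here is a version of the script that sticks to log4j's own API (a sketch, assuming log4j 1.2 on the classpath):

// backup.scala
import org.apache.log4j._

BasicConfigurator.configure()         // wires up a simple console appender
val log = Logger.getLogger("backup")  // log4j's entry point, not LogFactory
log.info("started backup")

Run it with scala -cp log4j-1.2.16.jar backup.scala as before.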
UPDATE
Another possible case: a broken jar on the classpath, corrupted during download or otherwise. The Scala interpreter then reports only that a member of the package is unavailable.
$ echo "qwerty" > example.jar
$ scala -cp example.jar backup.scala
backup.scala:1: error: value apache is not a member of package org
...
You need to inspect the contents of the jar file:
$ jar -tf log4j-1.2.16.jar
...
org/apache/log4j/Appender.class
...
Did you remember to put log4j.jar in your classpath?
I had a similar issue when I started doing Scala development using Eclipse; doing a clean build solved the problem.
I guess the Scala tools are not mature yet.
Instead of using log4j directly, you might try using Configgy. It's the Scala Way™ to work with log4j, as well as configuration files. It also plays nicely with SBT and Maven.
I asked and answered this question myself; have a look:
Put the configuration under src/main/resources/logback.xml. It will be copied to the right location when SBT does the artifact assembly.

"is not a member of package" error when importing package in Scala with SBT

(Relative beginner here, please be gentle...)
I've got a Scala program that I can build with sbt. I can (from within sbt) run compile and test-compile with no errors. I've defined a package by putting package com.mycompany.mypackagename at the top of several .scala files. When I do console to get a Scala REPL, this happens:
scala> import com.mycompany.mypackagename._
<console>:5: error: value mypackagename is not a member of package com.mycompany
import com.mycompany.mypackagename._
Any variation of this also fails. When I just do import com.mycompany I get no problems.
I thought that running the Scala console from within sbt would properly set the classpath based on the current projects? What (completely obvious) thing am I missing?
I ran into this same problem, and then I realized I was running Scala 2.10.0 on the command line while IDEA was using Scala 2.9.2. The fix was to change both to use the same version, and then run:
sbt clean
What happens if you import the actual class name instead of the wildcard?
import com.mycompany.mypackagename.ActualClassName
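For reference, a minimal setup in which both import forms work from sbt's console (all names are hypothetical):

// src/main/scala/com/mycompany/mypackagename/ActualClassName.scala
package com.mycompany.mypackagename

class ActualClassName {
  def ping: String = "pong"
}

After sbt clean compile, console should accept both import com.mycompany.mypackagename._ and import com.mycompany.mypackagename.ActualClassName, followed by new ActualClassName().ping.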