Export Scala code, without a main function, to a jar - scala

I am working with Spark Streaming and have written a custom streaming adapter. I want to export this adapter as a jar and use it in my Scala streaming jobs. When I reference the jar in my streaming code, I get this error:
import org.custom.streaming
[ERROR] object custom is not a member of package org
Note that the adapter doesn't have a main method, so I can't use the generic instructions available online for exporting the project as a runnable JAR.
I also tried exporting it as a shaded JAR, but in that case I get:
error: error while loading <root>, error in opening zip file
[EDIT]
I am using Maven for packaging.

Have you considered using the package goal of your Maven build file (mvn package)? Packaging a library jar does not require a main class.

Related

IntelliJ Scala maven build jar not generating class files

I have created a Spark Scala (version 2.11) application and am trying to build it with Maven (version 3) in IntelliJ. The first time, I was able to compile and build the jar with Maven successfully, and to test the Spark application on the cluster using that jar. The next time, I modified some of the existing Scala class code and built again; the code compiled and the jar file was generated without any issues, but there are no Scala classes in the latest jar. I would like to know why the Maven build is not generating class files. Can you please let me know what the problem could be and how I can fix it?
The easiest way to build Scala applications for Spark is to use SBT and the fat-jar (assembly) plugin. The details are already described here:
How to build an Uber JAR (Fat JAR) using SBT within IntelliJ IDEA?
Just don't forget to exclude the Spark jars from the fat jar by marking them as provided.
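As a rough sketch (artifact names and versions below are illustrative, not taken from the question), the build could look like this:

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt
name := "my-spark-app"
scalaVersion := "2.11.12"
// Spark is supplied by the cluster at runtime, so keep it out of the fat jar
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.8" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.4.8" % "provided"
)

Running sbt assembly then produces the fat jar under target/scala-2.11/.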

Reference uploaded JAR library

I've built a set of support functions into a helper.jar library and imported it to a Databricks cluster. The jar is installed on the cluster, but I'm not able to reference the functions in the library.
The jar import has been tested, the cluster has been restarted, and the jar can be referenced in IntelliJ, where it was developed as an Azure Spark/HDInsight project.
// next line generates error: value helper is not a member of org.apache.spark.sql.SparkSession
import helper
// next line generates error: not found: value fn_conversion
display(df.withColumn("RevenueConstantUSD", fn_conversion($"Revenue")))
I'd expect the helper functions to be visible after the library is deployed, or possibly after adding the import statement.
Edit: added information about IntelliJ project type
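For comparison, a minimal sketch of how such a helper is typically packaged and imported (the package, object, and function bodies below are hypothetical): the functions have to live in an object inside a named package, and the notebook has to import them by their fully qualified name rather than with a bare import helper.

// helper library source (names are placeholders)
package com.example.helpers

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.lit

object Helper {
  // hypothetical conversion: multiply by a fixed rate
  def fn_conversion(amount: Column): Column = amount * lit(1.1)
}

// in the notebook, once the jar is installed on the cluster:
import com.example.helpers.Helper.fn_conversion
display(df.withColumn("RevenueConstantUSD", fn_conversion($"Revenue")))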

How to add external jar files to a spark scala project

I am trying to use a Scala LSH implementation (https://github.com/marufaytekin/lsh-spark) in my Spark project. I cloned the repository with some changes to the sbt file (added an organisation).
To use this implementation, I compiled it using sbt compile, moved the jar file to the "lib" folder of my project, and updated my project's sbt configuration file, which looks like this:
Now when I try to compile my project using sbt compile, it fails to load the external jar file, showing the error message "unresolved dependency: com.lendap.spark.lsh.LSH#lsh-scala_2.10;0.0.1-SNAPSHOT: not found".
Am I following the right steps for adding an external jar file?
How do I solve the dependency issue?
As an alternative, you can build the lsh-spark project and add the jar to your Spark application.
To add external jars, the addJar option can be used when executing the Spark application. Refer to Running Spark on YARN.
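As a minimal sketch of that approach (the path and object name are placeholders), the jar can be passed with the --jars option of spark-submit, or added programmatically from the driver:

import org.apache.spark.sql.SparkSession

object LshJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("lsh-demo").getOrCreate()
    // ship the external jar to the executors at runtime
    spark.sparkContext.addJar("/path/to/lsh-scala_2.10-0.0.1-SNAPSHOT.jar")
    // ... use the LSH classes here ...
    spark.stop()
  }
}

Note that this only affects the runtime classpath; the compile error in the question still has to be fixed in the sbt configuration, as the next answer explains.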
This issue isn't related to Spark but to the sbt configuration.
Make sure you followed the folder structure imposed by sbt and added your jar to the lib folder, as explained here: the lib folder should be at the same level as build.sbt (cf. this post).
You might also want to check out this SO post.
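A minimal sketch of that layout (project name and versions are illustrative): jars placed in lib/ are picked up automatically as unmanaged dependencies, so the lsh jar should not also appear in libraryDependencies, otherwise sbt tries to resolve it from remote repositories and fails with exactly the unresolved dependency error quoted above.

// expected project layout
// my-spark-project/
//   build.sbt
//   lib/
//     lsh-scala_2.10-0.0.1-SNAPSHOT.jar   <- unmanaged jar, picked up automatically
//   src/main/scala/...

// build.sbt -- sketch
name := "my-spark-project"
scalaVersion := "2.10.6"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"
// note: no libraryDependencies entry for lsh-scala here

Alternatively, run sbt publishLocal in the cloned lsh-spark project; it is then installed in the local Ivy repository, the libraryDependencies entry can stay, and the copy in lib/ is unnecessary.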

How to use Phantom in Scala IDE

I want to use phantom in my Scala IDE. So I cloned the GitHub repository and created a .jar file of phantom using sbt -> compile -> package. I added this .jar file to the build path in my Scala IDE, but while importing
import com.websudos.phantom.connectors._
it throws the error:
object connector is not a member of com.websudos.phantom.
When using the auto-complete function of the Scala IDE, it shows only the import for
import com.websudos.phantom.example
I don't know why, if the jar was created for example, it was not created for the other packages.
I searched the internet, but every other option given is to add the dependency to the sbt build path, which I don't want to use.
Use sbt-assembly instead to create a fat jar.
https://github.com/sbt/sbt-assembly
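A minimal sketch of what that takes in the cloned phantom project (the plugin version is illustrative):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// then, from the project root, run: sbt assembly
// the fat jar, including transitive dependencies, ends up under target/scala-<version>/

Adding that single fat jar to the Scala IDE build path should then make com.websudos.phantom.connectors resolvable, since the dependencies it needs are bundled alongside it.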

Scala SBT: standalone jar

The answer Making stand-alone jar with Simple Build Tool seems like what I need, but it did not have enough information for me, so this is a follow-up.
(1) How do I adapt the answer to my need? I don't understand what would need to be changed.
(2) What command do I run to create the standalone jar?
(3) Where can I find the jar after it has been created?
What I've tried:
Pasting the code from the linked answer verbatim into my project/build/dsg.scala file. The file now has a
class ForkRun(info: ProjectInfo) extends DefaultProject(info)
(from before, used for running projects in a separate VM from SBT) and the new:
trait AssemblyProject extends BasicScalaProject
from the linked answer.
I also tried pasting the body (all the defs and the lazy val) of the AssemblyProject into the body of ForkRun.
To create a jar I ran package at the SBT prompt and got:
[info] Packaging ./target/scala_2.8.1/dsg_2.8.1-1.0.jar ...
[info] Packaging complete.
So I tried running the dsg_2.8.1-1.0.jar from the shell via:
java -jar dsg_2.8.1-1.0.jar
But I get:
Failed to load Main-Class manifest attribute from
dsg_2.8.1-1.0.jar
Could this be caused by having multiple entry points into my project? I select from a list when I execute run from the SBT prompt. Perhaps I need to specify the default when creating the package?
Here's a writeup I did on one way to make an executable jar with SBT:
http://janxspirit.blogspot.com/2011/01/create-executable-scala-jar-with-sbt.html
sbt-assembly is an sbt plugin that creates a standalone jar of a Scala sbt project with all of its dependencies.
Refer to this post for more details and an example.
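For a rough idea of the same thing with a more recent sbt layout (the question uses the old project/build/*.scala style, so this is only a sketch; the main class name and versions are placeholders):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt
name := "dsg"
version := "1.0"
scalaVersion := "2.12.18"
// with several entry points, pick the default explicitly so the manifest gets a Main-Class attribute
mainClass in assembly := Some("com.example.Main")

// run: sbt assembly
// then: java -jar target/scala-2.12/dsg-assembly-1.0.jar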