IntelliJ Scala maven build jar not generating class files

I have created a Spark Scala (version 2.11) application and am trying to build it with Maven (version 3) in IntelliJ. The first time, I was able to compile and build the jar with Maven successfully, and to test the Spark application on the cluster using that jar as well. The next time, I modified some of the existing Scala class code and built again; the code compiled and the jar was generated without any issues, but there are no Scala classes in the latest jar file. I would like to know why the Maven build is not generating class files when I build. Can you please let me know what the problem could be and how I can fix it?

The easiest way to build Scala applications for Spark is to use SBT and a fat-jar plugin. The details are already described here:
How to build an Uber JAR (Fat JAR) using SBT within IntelliJ IDEA?
Just don't forget to exclude the Spark jars from the fat jar by marking them as "provided".
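For illustration, a minimal build.sbt sketch, assuming sbt-assembly and a Spark 2.x dependency set (the version numbers are only examples):

    // build.sbt -- mark the Spark dependencies as "provided" so that
    // sbt-assembly leaves them out of the fat jar; the cluster supplies
    // them on the classpath at runtime.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
    )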

Related

Spark how to prefer class from packaged jar

I am using the sbt assembly plugin to create a fat jar. I need some jars which are part of the default Hadoop/Spark distribution, but in newer versions.
I want the Spark worker JVM to prefer the versions packaged in my fat jar over the ones in the default Hadoop/Spark distribution. How can I do this?
The solution is to set spark.{driver,executor}.userClassPathFirst in the configuration (the --conf option) when submitting the Spark application. This includes jars from the uber jar first, and only then from the Spark classpath.
Another solution is to use shading in sbt assembly: shade the jars in the uber jar whose older versions ship with Spark (see the sketch below).
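For the shading approach, a minimal sketch of sbt-assembly shade rules; the package being renamed (com.google.common, i.e. Guava) is only an illustrative example of a library that ships with Spark:

    // build.sbt -- requires the sbt-assembly plugin.
    // Rename the conflicting packages inside the uber jar so our newer
    // version cannot clash with the copy on the Spark classpath.
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.common.**" -> "shaded.guava.@1").inAll
    )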

Chisel: Compiling Chisel library on Windows

I have been using sbt on Windows with a custom build.sbt script, in conjunction with an import Chisel._ in the top-level file, to successfully generate Verilog from my Chisel source.
I'm trying to get an IDE working on Windows to expedite Chisel development. I've gone with the Eclipse-based Scala IDE: http://scala-ide.org/download/sdk.html/
I want to compile the Chisel library so that the import Chisel._ can be resolved locally, without having to go off, download the source from the repository, and recompile it each time. When I download the Chisel-master repo from Git and include the src\main folder in my Scala project in the Scala IDE, I get lots of syntax errors in the Chisel Scala files that prevent me from building the project.
Has anyone done anything like this before on Windows, or does anyone have experience with the Scala IDE? It may just be a case of undefined symbols in the project configuration.
I'm not sure exactly what you did in build.sbt with respect to recompilation (I think it downloads the source only the first time, then caches it for the future). But I'm using the Scala IDE for Chisel on Linux with the default build.sbt files; maybe you can try to get that working out of the box first, to help narrow down the issue.
Here are the steps I took to get the Scala IDE working with Chisel:
The latest Scala IDE uses Scala 2.11.8, while the current Chisel repository defaults to 2.11.7, so I had to change every scalaVersion reference in the build.sbt files from 2.11.7 to 2.11.8 (see the sketch after this list).
I used sbteclipse (https://github.com/typesafehub/sbteclipse) to create an importable workspace and set up the compilation dependencies.
The exception was chiselFrontend: for some reason this package is not added as a dependency, so I had to add chiselFrontend manually as a Java build path dependency (Properties > Java Build Path, under Projects) for my own projects.
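As a sketch, the version bump amounts to editing each of the build.sbt files along these lines:

    // build.sbt -- match the Scala version bundled with the Scala IDE
    scalaVersion := "2.11.8"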
To resolve undefined symbols, you can also add a JAR onto the project build path using Project Properties > Java Build Path > Libraries > Add External JARs...
If you are getting your JARs through Maven / SBT, they should be in:
C:\Users\<name>\.ivy2\cache\edu.berkeley.cs\chisel3_2.11\jars
If you are using publishLocal with chisel3, your JARs should be in:
C:\Users\<name>\.ivy2\local\edu.berkeley.cs\chisel3_2.11\jars
Note that chisel3 is compiled into one JAR, including the coreMacros and chiselFrontend sub-projects.
Of course, this is a quick-and-dirty solution compared to something that can parse the SBT files properly.

How to add external jar files to a spark scala project

I am trying to use an LSH implementation in Scala (https://github.com/marufaytekin/lsh-spark) in my Spark project. I cloned the repository and made some changes to the sbt file (added an organisation).
To use this implementation, I compiled it with sbt compile, moved the jar file to the "lib" folder of my project, and updated my project's sbt configuration file, which looks like this:
Now when I try to compile my project with sbt compile, it fails to load the external jar file, showing the error message "unresolved dependency: com.lendap.spark.lsh.LSH#lsh-scala_2.10;0.0.1-SNAPSHOT: not found".
Am I following the right steps for adding an external jar file?
How do I solve the dependency issue?
As an alternative, you can build the lsh-spark project and add the jar to your Spark application.
To add external jars, the addJar option can be used when executing the Spark application; refer to Running Spark applications on YARN.
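As a hedged sketch in Scala (the jar path and app name are hypothetical), the same thing can also be done programmatically via SparkContext.addJar:

    import org.apache.spark.{SparkConf, SparkContext}

    object LshDemo {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("lsh-demo")
        val sc   = new SparkContext(conf)
        // Ships the jar to the executors; adjust the path to wherever
        // the built lsh-spark jar actually lives.
        sc.addJar("/path/to/lsh-spark_2.10-0.0.1-SNAPSHOT.jar")
        // ... use the LSH classes here ...
        sc.stop()
      }
    }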
This issue isn't related to Spark but to the sbt configuration.
Make sure you followed the correct folder structure imposed by sbt and added your jar to the lib folder, as explained here: the lib folder should be at the same level as build.sbt (cf. this post).
You might also want to check out this SO post.
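For reference, nothing needs to be declared in build.sbt for unmanaged jars; sbt picks up anything under lib/ automatically. The setting below is the default, shown only to make the convention explicit:

    // build.sbt -- unmanaged jars are resolved from <project root>/lib by default
    unmanagedBase := baseDirectory.value / "lib"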

How to use Phantom in Scala IDE

I want to use phantom in my Scala IDE. For this I cloned the GitHub repository and created a .jar file of phantom using sbt -> compile -> package. I added this .jar file to the build path in my Scala IDE, but importing
import com.websudos.phantom.connectors._
still throws the error
object connector is not a member of com.websudos.phantom.
When I use the auto-complete function of the Scala IDE, it only offers the import
import com.websudos.phantom.example
I don't know why, if a jar entry got created for example, it was not created for the other packages.
I searched the internet, but all the other suggestions amount to adding the dependency to the sbt build path, which I don't want to use.
Use sbt-assembly instead to create a fat jar (most likely your jar only contains a single sub-project's classes, since sbt package builds a separate jar per module):
https://github.com/sbt/sbt-assembly
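A minimal sketch of the setup, assuming you add the plugin to the cloned phantom build (the plugin version shown is only an example):

    // project/plugins.sbt -- enables the assembly task
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

Running sbt assembly should then produce a single fat jar under target/ that bundles the compiled classes of the sub-projects together with their dependencies, which is what the Scala IDE build path needs.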

sbt eclipse command changed files and packages

I created a new Scala project in Eclipse, then added a package and a Scala object.
So far so good...
I wanted to add an external library, so I added a project folder with build.properties and plugins.sbt files, plus a build.sbt file in the project root.
In the terminal I compiled the project successfully with the sbt compile task.
The problem is that after the sbt eclipse command, the Eclipse project changed from a Scala project to something else: all the packages turned into plain folders and the Scala project is ruined.
Scala IDE: Build id 3.0.3-20140327-1716-Typesafe
Scala version: 2.10.4
sbt version: 0.13.0
You can see the result in the attached image.
Where did you get the eclipse command from? I'm guessing you're using sbteclipse.
"I created a new Scala project in eclipse then added a package and Scala object, So far so good ..."
If I understand correctly, this is exactly the opposite of what the plugin is intended to do. I think you're supposed to create a plain sbt project first, and then let the plugin generate the Eclipse project, as in the sketch below.
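A minimal sketch of the intended workflow, assuming sbteclipse (the plugin version is illustrative):

    // project/plugins.sbt -- let sbt generate the Eclipse project files
    addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")

Then run sbt eclipse once and, in Eclipse, use File > Import > Existing Projects into Workspace. The generated .project and .classpath files mark it as a Scala project, so the packages and source folders show up correctly.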