Importing Spark Source Code to Eclipse IDE

I want to modify an algorithm from the Spark source code. In Eclipse Luna, I tried to import the source code via File -> Import -> General -> Existing Projects into Workspace, but afterwards the src folder does not contain any files. How should I go about it?

The Spark project consists of multiple modules, declared in its top-level pom.xml:
<modules>
  <module>common/sketch</module>
  <module>common/network-common</module>
  <module>common/network-shuffle</module>
  <module>common/unsafe</module>
  <module>common/tags</module>
  <module>core</module>
  <module>graphx</module>
  <module>mllib</module>
  <module>tools</module>
  <module>streaming</module>
  <module>sql/catalyst</module>
  <module>sql/core</module>
  <module>sql/hive</module>
  <module>external/docker-integration-tests</module>
  <module>assembly</module>
  <module>examples</module>
  <module>repl</module>
  <module>launcher</module>
  <module>external/kafka</module>
  <module>external/kafka-assembly</module>
</modules>
If you want to import the complete Spark project, try this:
File -> Import -> Select -> Maven -> Existing Maven Projects -> (Select the root directory of the Spark project)
Note: Make sure you already have the Maven integration for Eclipse (m2e) plugin installed.

cd into the root directory of your Spark source code.
Run:
mvn eclipse:eclipse
This converts the Spark modules into Eclipse-importable Maven projects by generating the .project and .classpath files.
then:
File -> Import -> Select -> Maven -> Existing Maven Projects -> (Select the root directory of Spark project)

Related

How to export Scala Spark project to jar

I'm working on a Scala/Spark project, and I would like to export my project to a jar file and run it in Spark via spark-submit.
I tried this solution:
File -> Project Structure -> Artifacts -> + -> Jar -> From modules with dependencies -> selected the Main Class after browsing -> selected extract to the target jar -> Directory for META-INF automatically gets populated -> OK -> Apply -> OK -> Build -> Build Artifacts -> Build.
But I couldn't find my main class in the jar file, so I can't run it.
The basic idea you can follow:
Since you are working in Scala,
you can use sbt as your build tool to manage all of your project's dependencies.
You can use the sbt-assembly plugin to build a fat jar.
Export this fat jar to your cluster to submit your Spark jobs.
Please search the web for more details,
or start with this project https://github.com/khodeprasad/spark-scala-examples and integrate the sbt-assembly plugin to create fat jars by following its documentation https://github.com/sbt/sbt-assembly
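The setup described above can be sketched as a minimal sbt configuration. Everything here is an assumption about your project (plugin and Spark versions, project name, and the com.example.Main class are placeholders); adjust them to match your build:

```scala
// ---- project/plugins.sbt ----
// Registers the sbt-assembly plugin; the version shown is an assumption.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")

// ---- build.sbt ----
// Project name, versions, and main class are placeholders.
name := "my-spark-job"
version := "0.1"
scalaVersion := "2.12.15"

// Mark Spark as "provided" so it is not bundled into the fat jar;
// the cluster supplies Spark at runtime via spark-submit.
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.0" % "provided"

// The class spark-submit should run; must match an object with a main method.
Compile / mainClass := Some("com.example.Main")

// Resolve duplicate META-INF entries when merging dependency jars.
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _                        => MergeStrategy.first
}
```

Running `sbt assembly` then produces a fat jar under `target/scala-2.12/`, which you can submit with `spark-submit --class com.example.Main <path-to-assembly-jar>` (the class name is the placeholder above).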

Error While Running spark on Eclipse

(screenshot of the error omitted)
I'm getting this error while running Spark with Scala. Can anyone suggest how I can solve this issue?
You have not created this as a Maven project. What you need to do is delete this project from Eclipse, then:
File -> Import -> Maven -> Existing Maven Projects -> Select your folder.
This will load your project with its Maven dependencies included, and you will no longer face these errors.

Eclipse - Wrong project name when importing maven project

I'm using Eclipse Mars and I'm working on a multi-module Maven 2 project.
Until now it has always worked fine when importing the project into Eclipse.
My project structure is like
myproject
|-common
|-dao
|- ...
In each pom the artifactId is:
<artifactId>myproject-common</artifactId>
<artifactId>myproject-dao</artifactId>
...
What I always did till now
retrieve the project from svn
mvn clean install
mvn eclipse:eclipse
in Eclipse: Import > Existing Projects > browse to myproject, and it displays the correct project names: myproject-common, myproject-dao, ...
But since today the last step gives me the folder names in place of the artifact names: common, dao, ...
And after the import, there are lots of errors, e.g. Project 'dao' is missing required Java Project: 'myproject-common'.
I don't understand; it's the same version of Eclipse, Maven, and Java as a few months ago.
And after mvn eclipse:eclipse, the .project file contains the correct name:
<projectDescription>
<name>myproject-common</name>
How can I solve that?
I solved it by changing:
import > existing project
to:
Import -> Maven -> Existing Maven Projects

Error creating JUnit in Groovy project

I just downloaded the Groovy plugin for Eclipse 4.2 from http://dist.springsource.org/release/GRECLIPSE/e4.2/ . I don't have any other Groovy installation or library on my system, and I am able to run Groovy programs in Eclipse on my machine.
However, when I try to import org.junit.Test, I get the following errors:
Groovy:class org.junit.Test is not an annotation in #org.junit.Test
Groovy:unable to resolve class org.junit.Test
Can anyone tell me what might be the issue?
You must add the JUnit jars to your classpath. Select the project -> Build Path -> Configure Build Path... -> Libraries -> Add Library -> JUnit -> Next -> JUnit 4 -> OK.

Griffon doesn't create Eclipse .project and .classpath files

Griffon 0.9.2-beta-3
After executing "griffon create-app DemoConsole",
I can't find the .project file, so I can't import the project into Eclipse (STS).
But the docs say that "Griffon automatically creates Eclipse .project and .classpath files for you".
Did I miss some step?
solved by :
griffon integrate-with --eclipse
griffon install-plugin eclipse-support
griffon eclipse-update
eclipse --> import existing project
In STS 2.6 (and probably other Eclipse versions) you also need to add a classpath variable.
If you get errors relating to an 'unbound classpath variable' and the files concerned start with GRIFFON_HOME/...:
Select Eclipse/STS -> Preferences -> Java -> Build Path -> Classpath Variables
New... -> add 'GRIFFON_HOME' and navigate to the location of your Griffon installation.
Confirming this and rebuilding the project should clear the errors.
If you want to do this on a per-project basis, just right-click the project and follow the same procedure under Build Path -> Libraries.