java.lang.RuntimeException: You must run the `stage` task before deploying your app when running `sbt stage deployHeroku`

I am trying to deploy my application to Heroku using sbt-nativepackager and sbt-heroku.
My code is available at https://github.com/hhimanshu/sbt101/tree/m5 (branch is m5)
When I run sbt stage deployHeroku, the task fails as shown below:
➜ sbt101 git:(m5) ✗ sbt stage deployHeroku
[info] Loading global plugins from /Users/harit/.sbt/1.0/plugins
[info] Loading settings for project sbt101-build from plugins.sbt ...
[info] Loading project definition from /Users/harit/code/sc/sbt101/project
[info] Loading settings for project root from build.sbt ...
[info] Set current project to sbt101 (in build file:/Users/harit/code/sc/sbt101/)
[info] Packaging /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT-sources.jar ...
[info] Done packaging.
[info] Wrote /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT.pom
[info] Wrote /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/calculators_2.12-0.1.0-SNAPSHOT.pom
[info] Main Scala API documentation to /Users/harit/code/sc/sbt101/api/target/scala-2.12/api...
[info] Compiling 1 Scala source to /Users/harit/code/sc/sbt101/api/target/scala-2.12/classes ...
model contains 3 documentable templates
[info] Done compiling.
[info] Main Scala API documentation successful.
[info] Packaging /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT-javadoc.jar ...
[info] Packaging /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Main Scala API documentation to /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/api...
[info] Compiling 1 Scala source to /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/classes ...
[info] Done packaging.
[warn] there was one feature warning; re-run with -feature for details
model contains 5 documentable templates
[warn] one warning found
[info] Main Scala API documentation successful.
[info] Packaging /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/calculators_2.12-0.1.0-SNAPSHOT-javadoc.jar ...
[info] Done packaging.
[warn] there was one deprecation warning (since 2.11.0); re-run with -deprecation for details
[warn] there was one feature warning; re-run with -feature for details
[warn] two warnings found
[info] Done compiling.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/calculators_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 4 s, completed 10-May-2019 4:20:03 PM
[error] java.lang.RuntimeException: You must run the `stage` task before deploying your app!
[error] at com.heroku.sbt.SbtApp.packageType(SbtApp.scala:142)
[error] at com.heroku.sbt.SbtApp.prepare(SbtApp.scala:111)
[error] at com.heroku.sdk.deploy.App.deploy(App.java:60)
[error] at com.heroku.sbt.SbtApp.deploy(SbtApp.scala:98)
[error] at com.heroku.sbt.HerokuPlugin$autoImport$.$anonfun$baseHerokuSettings$1(HerokuPlugin.scala:53)
[error] at com.heroku.sbt.HerokuPlugin$autoImport$.$anonfun$baseHerokuSettings$1$adapted(HerokuPlugin.scala:26)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:40)
[error] at sbt.std.Transform$$anon$4.work(System.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:269)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:278)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:269)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
2019-05-10 16:20:03,606 Log4j2-TF-1-AsyncLogger[AsyncContext#cb644e]-1 ERROR Attempted to append to non-started appender heroku-logger
[error] (Compile / deployHeroku) You must run the `stage` task before deploying your app!
2019-05-10 16:20:03,607 Log4j2-TF-1-AsyncLogger[AsyncContext#cb644e]-1 ERROR Attempted to append to non-started appender heroku-logger
[error] Total time: 0 s, completed 10-May-2019 4:20:03 PM
However, using the Heroku Toolbelt on the command line, I can deploy the app successfully:
➜ sbt101 git:(m5) ✗ git push heroku m5:master
The app then runs at https://h2-sbt101.herokuapp.com/rates.
Can someone please help me understand what I may be missing?

I had the same problem. sbt deployHeroku looks for the directory target/universal/stage (see the source). However, it seems to look for it in the root project, which may not be the project that actually contains the staged directory. For example, in the OP's log there are several subprojects, api and calculators. In my case, the correct one (the one containing the server code) was server.
So sbt stage server/deployHeroku worked for me.
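For reference, here is a minimal multi-project sketch of scoping the Heroku settings to the subproject that actually gets staged. The layout, app name, and plugin versions are illustrative, not taken from the OP's repo:

// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.3.25") // version illustrative
addSbtPlugin("com.heroku" % "sbt-heroku" % "2.1.2")                 // version illustrative

// build.sbt
lazy val api = (project in file("api"))

lazy val server = (project in file("server"))
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging) // provides the stage task
  .settings(
    Compile / herokuAppName := "my-heroku-app" // hypothetical app name
  )

With a layout like this, sbt stage server/deployHeroku stages the build and then deploys the module that actually contains the server code.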

Related

Facing issue with sbt dependency

I am trying to use the phoenix-spark jar to load a Phoenix table into a Spark 2.2.3 DataFrame by adding this dependency:
libraryDependencies += "org.apache.phoenix" % "phoenix-spark2" % "4.7.0.2.6.5.1102-5"
I tested these two resolvers one by one:
resolvers += "Hortonworks Repository" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/groups/public/"
I got the following error:
[info] welcome to sbt 1.3.13 (Oracle Corporation Java 1.8.0_261)
[info] loading project definition from /home/ambac61n/IdeaProjects/phoenix_test/project
[info] loading settings for project phoenix_test from build.sbt ...
[info] set current project to phoenix_test (in build file:/home/my_user/IdeaProjects/phoenix_test/)
[info] sbt server started at local:///home/ambac61n/.sbt/1.0/server/0c2856c06fe3f2cf2706/sock
sbt:phoenix_test>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to phoenix_test (in build file:/home/ambac61n/IdeaProjects/phoenix_test/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /home/my_user/.local/share/JetBrains/IdeaIC2020.2/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to phoenix_test (in build file:/home/my_user/IdeaProjects/phoenix_test/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.phoenix:phoenix-spark2:4.7.0.2.6.5.1102-5
[error] Not found
[error] Not found
[error] not found: /home/ambac61n/.ivy2/local/org.apache.phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] not found: http://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.phoenix:phoenix-spark2:4.7.0.2.6.5.1102-5
[error] Not found
[error] Not found
[error] not found: /home/ambac61n/.ivy2/local/org.apache.phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] not found: http://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] Total time: 3 s, completed 22 août 2020 05:56:14
[info] shutting down sbt server
Do you have any idea?
After visiting those repositories, I noticed that the package is indeed missing.
In the first repository,
https://repo1.maven.org/maven2/org/apache/phoenix/
there is no phoenix-spark2 package at all, and in the second repository,
https://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/
there is no 4.7.0.2.6.5.1102-5 version.
Try one of the versions that are actually listed there.
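As a hedged sketch, the corrected build.sbt would pin a version the repository actually lists; the version string below is a placeholder to be filled in after browsing the directory listing:

// build.sbt
resolvers += "Hortonworks Releases" at "https://repo.hortonworks.com/content/groups/public/"
libraryDependencies += "org.apache.phoenix" % "phoenix-spark2" % "<version-from-repo-listing>"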

vaadin plugin for eclipse build failure on compile theme

I am getting a build failure every time I try to compile my themes. The compilation itself seems to work; I am just wondering about the console log, which says (in extracts):
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.503 s
[INFO] Finished at: 2017-01-19T17:15:48+01:00
[INFO] Final Memory: 19M/303M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal com.vaadin:vaadin-maven-plugin:7.6.6:compile-theme (default-cli) on project admintool: Compiling theme "VAADIN/themes/nxo-fr" failed: Command [[
[ERROR] /bin/sh -c /opt/jdk/jdk1.8.0_91/jre/bin/java -classpath /home/cia/dev/projects/trunk/trunk/admintool/src/main/webapp:/home/cia/dev/projects/trunk/trunk/admintool/target/classes:/home/cia/dev/projects/trunk/trunk/admintool/src/main/java:/home/cia/dev/projects/trunk/trunk/admintool/src/main/resources:/home/cia/.m2/repository/com/sharis/bl/admintool-widgetset/7.6.7/admintool-widgetset-7.6.7.jar:/home/cia/.m2/repository/com/vaadin/vaadin-server/7.6.7/vaadin-server-7.6.7.jar:/home/cia/.m2/repository/com/vaadin/vaadin-sass-compiler/0.9.13/vaadin-sass-compiler-0.9.13.jar:/home/cia/.m2/repository/org/w3c/css/sac/1.3/sac-....................................................................1.5.10.jar com.vaadin.sass.SassCompiler /home/cia/dev/projects/trunk/trunk/admintool/src/main/webapp/VAADIN/themes/nxo-fr/styles.scss /home/cia/dev/projects/trunk/trunk/admintool/src/main/webapp/VAADIN/themes/nxo-fr/styles.css
[ERROR] ]] failed with status 2
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
I found the issue: the styles.scss file was missing in my older theme folders (like runo, etc.). Every theme folder must contain a styles.scss for the compilation to succeed; see the sketch below.
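As an illustration (folder and theme names are hypothetical), a theme folder that the compiler accepts looks roughly like this:

src/main/webapp/VAADIN/themes/mytheme/
    styles.scss     (the entry point the theme compiler expects)
    mytheme.scss    (the actual theme rules, imported from styles.scss)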
Also, try to use a theme name that has no hyphens in it; the failing theme "VAADIN/themes/nxo-fr" contains one.

Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2

While compiling the Maven project, the following error occurred:
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.992s
[INFO] Finished at: Fri Apr 15 17:44:33 CEST 2016
[INFO] Final Memory: 25M/350M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed ->
[Help 1]
[ERROR]
I removed the property <useZincServer>true</useZincServer> from pom.xml, and still the Logging error persists.
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 ---
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.814s
[INFO] Finished at: Fri Apr 15 17:41:00 CEST 2016
[INFO] Final Memory: 25M/335M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed ->
[Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
I checked that PATH and JAVA_HOME are defined in ~/.bashrc as follows:
export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
The only issue that I noticed is that echo $JAVA_HOME gives an empty output, though I did source ~/.bashrc.
Any help is highly appreciated.
The problem could be this line: [INFO] Using incremental compilation.
In your pom.xml, try removing the line <recompileMode>incremental</recompileMode> from the scala-maven-plugin configuration
and then build again.
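For context, here is a trimmed sketch of where that line usually sits in the pom; the plugin version matches the OP's log, and the rest of the block is illustrative:

<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.2.2</version>
  <configuration>
    <recompileMode>incremental</recompileMode> <!-- remove this line -->
  </configuration>
</plugin>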
It is strange that echo $JAVA_HOME gives empty output. While compiling the Spark source, I imported the project into Eclipse after a successful mvn clean package and hit the same problem. I found the solution here:
How to solve “Plugin execution not covered by lifecycle configuration” for Spring Data Maven Builds
I think you were compiling Spark with Scala 2.10. If so, you should do as follows.
cd /path/to/Spark
./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package
Hope this helps.

Maven build issue with codehaus package

I am trying to build the ear using Maven and received the following error.
[INFO] SASC .............................................. SUCCESS [10.134s]
[INFO] SASC-war .......................................... SUCCESS [41.967s]
[INFO] SASC-ear .......................................... FAILURE [2.764s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 55.162s
[INFO] Finished at: Tue Dec 15 10:52:50 CST 2015
[INFO] Final Memory: 9M/93M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-ear-plugin:2.6:generate-application-xml (default-generate-application-xml) on project SASC-ear: Execution default-generate-application-xml of goal org.apache.maven.plugins:maven-ear-plugin:2.6:generate-application-xml failed: A required class was missing while executing org.apache.maven.plugins:maven-ear-plugin:2.6:generate-application-xml: org/codehaus/plexus/util/xml/XmlStreamWriter
[ERROR] -----------------------------------------------------
[ERROR] realm = plugin>org.apache.maven.plugins:maven-ear-plugin:2.6
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/C:/Users/xxx/.m2/repository/org/apache/maven/plugins/maven-ear-plugin/2.6/maven-ear-plugin-2.6.jar
[ERROR] urls[1] = file:/C:/Users/xxx/.m2/repository/org/codehaus/plexus/plexus-utils/1.1/plexus-utils-1.1.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[maven.api, parent: null]]
[ERROR]
[ERROR] -----------------------------------------------------: org.codehaus.plexus.util.xml.XmlStreamWriter
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :SASC-ear
Can anyone help me fix this? I already tried deleting the .m2/repository folder, but that did not solve the issue.

Using scala-eclipse for spark

Could someone please help me with how to use the Scala IDE for Eclipse with Spark?
I came across this link (http://syndeticlogic.net/?p=311) but am unable to follow it.
I entered the command mvn -Phadoop2 eclipse:clean eclipse:eclipse inside the Spark directory; after a long list of downloads it gave me an error. Please help. Thanks.
Below is the error I received:
Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [5:22.386s]
[INFO] Spark Project Core ................................ SUCCESS [17:20.807s]
[INFO] Spark Project Bagel ............................... FAILURE [2.159s]
[INFO] Spark Project GraphX .............................. SKIPPED
[INFO] Spark Project ML Library .......................... SKIPPED
[INFO] Spark Project Streaming ........................... SKIPPED
[INFO] Spark Project Tools ............................... SKIPPED
[INFO] Spark Project Catalyst ............................ SKIPPED
[INFO] Spark Project SQL ................................. SKIPPED
[INFO] Spark Project Hive ................................ SKIPPED
[INFO] Spark Project REPL ................................ SKIPPED
[INFO] Spark Project Assembly ............................ SKIPPED
[INFO] Spark Project External Twitter .................... SKIPPED
[INFO] Spark Project External Kafka ...................... SKIPPED
[INFO] Spark Project External Flume ...................... SKIPPED
[INFO] Spark Project External ZeroMQ ..................... SKIPPED
[INFO] Spark Project External MQTT ....................... SKIPPED
[INFO] Spark Project Examples ............................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:15.115s
[INFO] Finished at: Wed May 07 15:27:51 GMT+05:30 2014
[INFO] Final Memory: 22M/81M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop2" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project spark-bagel_2.10: Failed to resolve dependencies for one or more projects in the reactor. Reason: Missing:
[ERROR] ----------
[ERROR] 1) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] Try downloading the file manually from the project website.
[ERROR]
[ERROR] Then, install it using the command:
[ERROR] mvn install:install-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file
[ERROR]
[ERROR] Alternatively, if you host your own repository you can deploy the file there:
[ERROR] mvn deploy:deploy-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
[ERROR]
[ERROR] Path to dependency:
[ERROR] 1) org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR] 2) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] ----------
[ERROR] 1 required artifact is missing.
[ERROR]
[ERROR] for artifact:
[ERROR] org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] from the specified remote repositories:
[ERROR] maven-repo (http://repo.maven.apache.org/maven2, releases=true, snapshots=false),
[ERROR] apache-repo (https://repository.apache.org/content/repositories/releases, releases=true, snapshots=false),
[ERROR] jboss-repo (https://repository.jboss.org/nexus/content/repositories/releases, releases=true, snapshots=false),
[ERROR] mqtt-repo (https://repo.eclipse.org/content/repositories/paho-releases, releases=true, snapshots=false),
[ERROR] cloudera-repo (https://repository.cloudera.com/artifactory/cloudera-repos, releases=true, snapshots=false),
[ERROR] apache.snapshots (http://repository.apache.org/snapshots, releases=false, snapshots=true),
[ERROR] central (http://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-bagel_2.10
This is because there is no profile called hadoop2 in the pom.xml; the closest matches are hadoop-2.2, hadoop-2.3, etc. You can run the following:
mvn -Phadoop-2.2 eclipse:clean eclipse:eclipse
Or run mvn help:all-profiles to list all the available profiles and pick one from the list.
If you want to contribute to the Apache Spark project, then
Go to spark home and run sbt/sbt eclipse
In Scala IDE, Select File | Import | Existing Projects into Workspace.
Select root directory: MY_SPARK_HOME
Select Search for Nested Projects
Select the projects that you want
Do not select "Copy projects into workspace".
If you want to use the Spark libraries in an application of your own, you can create a jar using the sbt/sbt assembly command and then add that jar as a library to your application project, as sketched below.
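For example, a minimal sketch of wiring the assembled jar in as an unmanaged dependency (the path and jar name are illustrative):

// build.sbt
// assumes the jar produced by sbt/sbt assembly was copied into lib/
unmanagedJars in Compile += Attributed.blank(file("lib/spark-assembly.jar"))

Alternatively, simply dropping the jar into the project's lib/ directory is enough, since sbt picks up jars there automatically.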
Also refer to the Eclipse documentation at https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-Eclipse