Facing issue with sbt dependency - scala

I am trying to use the phoenix-spark jar to load a Phoenix table into a Spark 2.2.3 DataFrame by adding this dependency:
libraryDependencies += "org.apache.phoenix" % "phoenix-spark2" % "4.7.0.2.6.5.1102-5"
I tested these two resolvers, one at a time:
resolvers += "Hortonworks Repository" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/groups/public/"
and got the following error:
[info] welcome to sbt 1.3.13 (Oracle Corporation Java 1.8.0_261)
[info] loading project definition from /home/ambac61n/IdeaProjects/phoenix_test/project
[info] loading settings for project phoenix_test from build.sbt ...
[info] set current project to phoenix_test (in build file:/home/my_user/IdeaProjects/phoenix_test/)
[info] sbt server started at local:///home/ambac61n/.sbt/1.0/server/0c2856c06fe3f2cf2706/sock
sbt:phoenix_test>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to phoenix_test (in build file:/home/ambac61n/IdeaProjects/phoenix_test/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /home/my_user/.local/share/JetBrains/IdeaIC2020.2/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to phoenix_test (in build file:/home/my_user/IdeaProjects/phoenix_test/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.phoenix:phoenix-spark2:4.7.0.2.6.5.1102-5
[error] Not found
[error] Not found
[error] not found: /home/ambac61n/.ivy2/local/org.apache.phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] not found: http://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.phoenix:phoenix-spark2:4.7.0.2.6.5.1102-5
[error] Not found
[error] Not found
[error] not found: /home/ambac61n/.ivy2/local/org.apache.phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] not found: http://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] Total time: 3 s, completed 22 août 2020 05:56:14
[info] shutting down sbt server
Do you have any idea?

After visiting those repositories, I noticed that the package is indeed missing.
In the first repository,
https://repo1.maven.org/maven2/org/apache/phoenix/
there is no phoenix-spark2 package at all, and in the second repository,
https://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/
there is no 4.7.0.2.6.5.1102-5 version.
Try one of the other versions.
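For reference, a minimal build.sbt sketch of what the dependency block could look like once a published version is chosen; the version string below is a placeholder I have not verified against the listing, so replace it with one that actually appears there:
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/repositories/releases/"
// Placeholder version: pick one that is actually present in the repository listing above;
// the 4.7.0.2.6.5.1102-5 build from the question is not.
libraryDependencies += "org.apache.phoenix" % "phoenix-spark2" % "4.7.0.2.6.5.0-292"
If the exact HDP build you need is not published anywhere, another option is to copy the phoenix-spark2 jar from the cluster into the project's lib/ directory, which sbt treats as unmanaged dependencies.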

Related

SBT failing to resolve dependency [ResolveException]

I have the following in my build.sbt file:
lazy val root = (project in file(".")).
  settings(
    inThisBuild(List(
      scalaVersion := "2.13.4",
    )),
    name := "ScalaTest",
    resolvers += "spigot-repo" at "https://hub.spigotmc.org/nexus/content/repositories/snapshots/",
    libraryDependencies += "org.spigotmc" % "spigot-api" % "1.16.5-R0.1-SNAPSHOT" % "provided" intransitive(),
    version := "1.0"
  )
When using IntelliJ and trying to refresh/reimport the project, a few issues occur:
Dependencies won't resolve, despite working with Maven/Gradle.
The following error is produced:
/Users/harrydrummond/Library/Java/JavaVirtualMachines/openjdk-15.0.1/Contents/Home/bin/java -Djline.terminal=jline.UnsupportedTerminal -Dsbt.log.noformat=true -Dfile.encoding=UTF-8 -Didea.managed=true -Dfile.encoding=UTF-8 -jar /Users/harrydrummond/Library/Application Support/JetBrains/Toolbox/apps/IDEA-U/ch-0/203.7148.57/IntelliJ IDEA.app.plugins/Scala/launcher/sbt-launch.jar
[info] welcome to sbt 1.4.7 (Oracle Corporation Java 15.0.1)
[info] loading global plugins from /Users/harrydrummond/.sbt/1.0/plugins
[info] loading settings for project t-build from plugins.sbt ...
[info] loading project definition from /Users/harrydrummond/Documents/workspace/scala/t/project
[info] loading settings for project root from build.sbt ...
[info] set current project to ScalaTest (in build file:/Users/harrydrummond/Documents/workspace/scala/t/)
[info] sbt server started at local:///Users/harrydrummond/.sbt/1.0/server/26df0aea2f87fd17e7ab/sock
[info] started sbt server
sbt:ScalaTest>
;set _root_.scala.collection.Seq(historyPath := None,shellPrompt := { _ => "" }
,SettingKey[_root_.scala.Option[_root_.sbt.File]]
("sbtStructureOutputFile")
in _root_.sbt.Global := _root_.scala.Some(_root_.sbt.file("/private/var/folders/kf/q70qvrg10ls9qd7vvb8zlvgr0000gn/T/sbt-structure.xml"))
,SettingKey[_root_.java.lang.String]("sbtStructureOptions")
in _root_.sbt.Global := "download, resolveClassifiers")
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to ScalaTest (in build file:/Users/harrydrummond/Documents/workspace/scala/t/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /Users/harrydrummond/Library/Application Support/JetBrains/Toolbox/apps/IDEA-U/ch-0/203.7148.57/IntelliJ IDEA.app.plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2020.3/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to ScalaTest (in build file:/Users/harrydrummond/Documents/workspace/scala/t/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.spigotmc:spigot-api:1.16.5-R0.1-SNAPSHOT
[error] Not found
[error] Not found
[error] not found: /Users/harrydrummond/.ivy2/localorg.spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: https://hub.spigotmc.org/nexus/content/repositories/snapshots/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom (Server returned HTTP response code: 403 for URL: https://hub.spigotmc.org/nexus/content/repositories/snapshots/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom) while downloading https://hub.spigotmc.org/nexus/content/repositories/snapshots/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.spigotmc:spigot-api:1.16.5-R0.1-SNAPSHOT
[error] Not found
[error] Not found
[error] not found: /Users/harrydrummond/.ivy2/localorg.spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: https://hub.spigotmc.org/nexus/content/repositories/snapshots/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom (Server returned HTTP response code: 403 for URL: https://hub.spigotmc.org/nexus/content/repositories/snapshots/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom) while downloading https://hub.spigotmc.org/nexus/content/repositories/snapshots/org/spigotmc/spigot-api/1.16.5-R0.1-SNAPSHOT/spigot-api-1.16.5-R0.1-SNAPSHOT.pom
[error] Total time: 1 s, completed 21 Feb 2021, 16:54:13
[info] shutting down sbt server
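One avenue to investigate (a suggestion, not a confirmed fix for this setup): the Spigot project expects spigot-api to be installed into the local Maven repository by running BuildTools, after which sbt can resolve it locally with a sketch like this (coordinates are the ones from the question):
// Sketch: resolve spigot-api from the local Maven repository (~/.m2)
// after BuildTools has installed it there.
resolvers += Resolver.mavenLocal
libraryDependencies += "org.spigotmc" % "spigot-api" % "1.16.5-R0.1-SNAPSHOT" % "provided" intransitive()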

How to run an existing Scala project using VS Code and Metals?

I am brand new to Scala and I find that Scala IDE is very slow on my machine for basic things like searching the codebase and editing code. I am used to Visual Studio Code and was very happy to find this metals extension.
I was able to "import build" and fix issues like bumping up the Scala version in my projects, but I am not sure how to reproduce the Scala IDE step of setting up a run configuration and actually launching our app.
We have a parent folder which has a bunch of projects and a 'consoleapp' project which is the main entry point of our app - it imports the logic/routes of all other projects.
|____parent
| |____consoleapp
| |____project1
| |____project2
I tried sbt run and sbt runMain consoleapp from within the consoleapp folder and also from the parent folder, but they didn't work.
I am not sure what other information from our setup is relevant - happy to provide more info as needed.
Updated to add more details below:
consoleapp/build.sbt
name := "consoleapp"
version := "1.0"
scalaVersion := "2.12.10"
packMain := Map("consoleapp" -> "consoleapp")
libraryDependencies ++= Seq (...)
Output of the commands I ran (sbt run and sbt runMain):
Running from ~/scala/parent
> sbt run
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/project
[info] Loading settings for project consoleapp from build.sbt ...
...
Loading settings for all other projects in parent folder
...
[info] Loading settings for project parent from build.sbt ...
[info] Resolving key references (22435 settings) ...
[info] Set current project to parent (in build file:/Users/pradhyo/scala/parent/)
[error] java.lang.RuntimeException: No main class detected.
[error] at scala.sys.package$.error(package.scala:30)
[error] stack trace is suppressed; run last Compile / bgRun for the full output
[error] (Compile / bgRun) No main class detected.
[error] Total time: 1 s, completed 18-Dec-2019 1:41:25 PM
Running from ~/scala/parent
> sbt "runMain consoleapp.consoleapp" masterstate [0a8dab85] modified
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/project
[info] Loading settings for project consoleapp from build.sbt ...
...
Loading settings for all other projects in parent folder
...
[info] Loading settings for project parent from build.sbt ...
[info] Resolving key references (22435 settings) ...
[info] Set current project to parent (in build file:/Users/pradhyo/scala/parent/)
[info] running consoleapp.consoleapp
[error] (run-main-0) java.lang.ClassNotFoundException: consoleapp.consoleapp
[error] java.lang.ClassNotFoundException: consoleapp.consoleapp
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
[error] stack trace is suppressed; run last Compile / bgRunMain for the full output
[error] Nonzero exit code: 1
[error] (Compile / runMain) Nonzero exit code: 1
[error] Total time: 0 s, completed 18-Dec-2019 1:46:21 PM
Running from ~/scala/parent/consoleapp
> sbt run
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/consoleapp/project
[info] Loading settings for project consoleapp from build.sbt ...
[info] Set current project to consoleapp (in build file:/Users/pradhyo/scala/parent/consoleapp/)
[error] java.lang.RuntimeException: No main class detected.
[error] at scala.sys.package$.error(package.scala:30)
[error] stack trace is suppressed; run last Compile / bgRun for the full output
[error] (Compile / bgRun) No main class detected.
[error] Total time: 0 s, completed 18-Dec-2019 1:49:26 PM
Running from ~/scala/parent/consoleapp
> sbt "runMain consoleapp" masterstate [0a8dab85] modified
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/consoleapp/project
[info] Loading settings for project consoleapp from build.sbt ...
[info] Set current project to consoleapp (in build file:/Users/pradhyo/scala/parent/consoleapp/)
[info] running consoleapp
[error] (run-main-0) java.lang.ClassNotFoundException: consoleapp
[error] java.lang.ClassNotFoundException: consoleapp
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
[error] stack trace is suppressed; run last Compile / bgRunMain for the full output
[error] Nonzero exit code: 1
[error] (Compile / runMain) Nonzero exit code: 1
[error] Total time: 1 s, completed 18-Dec-2019 1:50:06 PM
After following the instructions in the Scala Metals VS Code README, use a launch configuration similar to the one below to get the equivalent of the Eclipse run configuration shown in the question.
.vscode/launch.json
{
  // Use IntelliSense to learn about possible attributes.
  // Hover to view descriptions of existing attributes.
  // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
  "version": "0.2.0",
  "configurations": [
    {
      "type": "scala",
      "name": "Debug consoleapp",
      "request": "launch",
      "mainClass": "consoleapp",
      "buildTarget": "consoleapp",
      "args": [],
      "jvmOptions": ["-J-Dconfig.file=/path/to/config/file"]
    }
  ]
}
I had trouble passing the config file for pureconfig correctly. Here's the GitHub issue with the correct jvmOptions line.
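For context, a mainClass of "consoleapp" (with no package prefix) corresponds to a top-level object in the default package. A minimal, hypothetical sketch of such an entry point, assuming the Typesafe/Lightbend Config library is on the classpath (the real application's entry point is not shown in the question):
import com.typesafe.config.ConfigFactory // Lightbend Config honours -Dconfig.file

// Hypothetical entry point matching "mainClass": "consoleapp" in launch.json above.
object consoleapp {
  def main(args: Array[String]): Unit = {
    // Picks up the file passed through jvmOptions as -Dconfig.file=/path/to/config/file.
    val config = ConfigFactory.load()
    println(s"Loaded ${config.entrySet().size()} config entries")
  }
}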
Metals supports these Scala versions 2.13.0, 2.13.1, 2.12.8, 2.12.9,
2.12.10, 2.12.7 and 2.11.12
This is something I noticed in their documentation, and it could pinpoint both your problem and mine, which I addressed in the comments.
Your scalaVersion := "2.12.3" is not listed there.
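If that is the cause, a one-line change in build.sbt to one of the listed versions should let Metals import the build, for example:
// Pick a Metals-supported Scala version from the list quoted above.
scalaVersion := "2.12.10"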

java.lang.RuntimeException: You must run the `stage` task before deploying your app when running `sbt stage deployHeroku`

I am trying to deploy my application to Heroku using sbt-native-packager and sbt-heroku.
My code is available at https://github.com/hhimanshu/sbt101/tree/m5 (branch is m5)
When I run sbt stage deployHeroku, the deployment fails as shown below:
➜ sbt101 git:(m5) ✗ sbt stage deployHeroku
[info] Loading global plugins from /Users/harit/.sbt/1.0/plugins
[info] Loading settings for project sbt101-build from plugins.sbt ...
[info] Loading project definition from /Users/harit/code/sc/sbt101/project
[info] Loading settings for project root from build.sbt ...
[info] Set current project to sbt101 (in build file:/Users/harit/code/sc/sbt101/)
[info] Packaging /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT-sources.jar ...
[info] Done packaging.
[info] Wrote /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT.pom
[info] Wrote /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/calculators_2.12-0.1.0-SNAPSHOT.pom
[info] Main Scala API documentation to /Users/harit/code/sc/sbt101/api/target/scala-2.12/api...
[info] Compiling 1 Scala source to /Users/harit/code/sc/sbt101/api/target/scala-2.12/classes ...
model contains 3 documentable templates
[info] Done compiling.
[info] Main Scala API documentation successful.
[info] Packaging /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT-javadoc.jar ...
[info] Packaging /Users/harit/code/sc/sbt101/api/target/scala-2.12/api_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Main Scala API documentation to /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/api...
[info] Compiling 1 Scala source to /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/classes ...
[info] Done packaging.
[warn] there was one feature warning; re-run with -feature for details
model contains 5 documentable templates
[warn] one warning found
[info] Main Scala API documentation successful.
[info] Packaging /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/calculators_2.12-0.1.0-SNAPSHOT-javadoc.jar ...
[info] Done packaging.
[warn] there was one deprecation warning (since 2.11.0); re-run with -deprecation for details
[warn] there was one feature warning; re-run with -feature for details
[warn] two warnings found
[info] Done compiling.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /Users/harit/code/sc/sbt101/calculators/target/scala-2.12/calculators_2.12-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 4 s, completed 10-May-2019 4:20:03 PM
[error] java.lang.RuntimeException: You must run the `stage` task before deploying your app!
[error] at com.heroku.sbt.SbtApp.packageType(SbtApp.scala:142)
[error] at com.heroku.sbt.SbtApp.prepare(SbtApp.scala:111)
[error] at com.heroku.sdk.deploy.App.deploy(App.java:60)
[error] at com.heroku.sbt.SbtApp.deploy(SbtApp.scala:98)
[error] at com.heroku.sbt.HerokuPlugin$autoImport$.$anonfun$baseHerokuSettings$1(HerokuPlugin.scala:53)
[error] at com.heroku.sbt.HerokuPlugin$autoImport$.$anonfun$baseHerokuSettings$1$adapted(HerokuPlugin.scala:26)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:40)
[error] at sbt.std.Transform$$anon$4.work(System.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:269)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:278)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:269)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
2019-05-10 16:20:03,606 Log4j2-TF-1-AsyncLogger[AsyncContext#cb644e]-1 ERROR Attempted to append to non-started appender heroku-logger
[error] (Compile / deployHeroku) You must run the `stage` task before deploying your app!
2019-05-10 16:20:03,607 Log4j2-TF-1-AsyncLogger[AsyncContext#cb644e]-1 ERROR Attempted to append to non-started appender heroku-logger
[error] Total time: 0 s, completed 10-May-2019 4:20:03 PM
However, using the Heroku toolbelt on the command line, I have been successful in deploying my app:
➜ sbt101 git:(m5) ✗ git push heroku m5:master
The app runs at https://h2-sbt101.herokuapp.com/rates
Can someone please help me understand what I may be missing?
I had the same problem. sbt deployHeroku looks for the directory target/universal/stage (see the source). However, it seems to look for it in the root project, which may not be the one with the staged directory. For example, in the OP's log, there appear to be several projects, called api and calculators. In my case, the correct one (containing the server code) was server.
So sbt stage server/deployHeroku worked for me.
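A sketch of what that multi-project layout might look like in build.sbt; the subproject name server and the app name are illustrative, and herokuAppName is the sbt-heroku setting:
// Illustrative multi-project sketch: the deployable subproject owns the staged output.
lazy val server = (project in file("server"))
  .enablePlugins(JavaAppPackaging)               // sbt-native-packager provides the stage task output
  .settings(
    Compile / herokuAppName := "your-heroku-app" // sbt-heroku setting; the app name is illustrative
  )
With that in place, sbt stage server/deployHeroku (as above) stages and deploys the server project.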

How to declare dependency on Scalding in sbt project?

I am trying to figure out how to create a build.sbt file for my own Scalding-based project.
The Scalding source tree has no build.sbt file; instead, it has a project/Build.scala build definition.
What would be the right way to integrate my own sbt project with Scalding, so that I could also import it later into Eclipse with the sbt-eclipse plugin?
Update:
For the following code:
import cascading.tuple.Fields
import com.twitter.scalding._

class Scan(args: Args) extends Job(args) {
  val output = TextLine("tmp/out.txt")
  val wordsList = List(
    ("john"),
    ("liza"),
    ("nina"),
    ("x"))
  val orderedPipe =
    IterableSource[(String)](wordsList, ('word))
      .debug
      .write(output)
}
With this build.sbt:
name := "Scan"
version := "1.0"
libraryDependencies := Seq("com.twitter" %% "scalding" % "0.11.1")
I get errors:
$ sbt
[info] Loading global plugins from /home/test/.sbt/0.13/plugins
[info] Set current project to Scan (in build file:/home/test/Cascading/Scala/Scan/)
> compile
[info] Updating {file:/home/test/Cascading/Scala/Scan/}scan...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] downloading http://repo1.maven.org/maven2/com/twitter/scalding_2.10/0.11.1/scalding_2.10-0.11.1.jar ...
[info] [SUCCESSFUL ] com.twitter#scalding_2.10;0.11.1!scalding_2.10.jar (641ms)
[info] Done updating.
[info] Compiling 1 Scala source to /home/test/Cascading/Scala/Scan/target/scala-2.10/classes...
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:1: not found: object cascading
[error] import cascading.tuple.Fields
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:2: object twitter is not a member of package com
[error] import com.twitter.scalding._
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:5: not found: type Job
[error] class Scan(args: Args) extends Job(args) {
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:5: not found: type Args
[error] class Scan(args: Args) extends Job(args) {
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:5: too many arguments for constructor Object: ()Object
[error] class Scan(args: Args) extends Job(args) {
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:6: not found: value TextLine
[error] val output = TextLine("tmp/out.txt")
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:15: not found: value IterableSource
[error] IterableSource[(String)](wordsList, ('word))
[error] ^
[error] 7 errors found
[error] (compile:compile) Compilation failed
Update 2
After doing git clone git@github.com:twitter/scalding.git and sbt publishLocal, I still had the same compilation errors.
BUT adding the two lines you suggested to build.sbt allowed me to compile my code. So the following build.sbt really works, thanks!
name := "BlockScan"
version := "1.0"
libraryDependencies := Seq("com.twitter" %% "scalding" % "0.11.1")
lazy val scaldingCore = ProjectRef(uri("https://github.com/twitter/scalding.git"), "scalding-core")
lazy val myProject = project in file(".") dependsOn scaldingCore
'sbt eclipse' creates an Eclipse project which does not compile under Eclipse and reports these errors:
Project 'Scan' is missing required Java project: 'scalding-core'
More than one scala library found in the build path (/home/test/usr/eclipse-scala-3.0.3/configuration/org.eclipse.osgi/bundles/290/1/.cp/lib/scala-library.jar, /home/test/wks/Cascading/Scala/scalding/target/scala-2.9.3/scalding-assembly-0.10.0.jar).At least one has an incompatible version. Please update the project build path so it contains only compatible scala libraries.
scalacheck_2.9.3-1.10.0.jar is cross-compiled with an incompatible version of Scala (2.9.3).
specs_2.9.3-1.6.9.jar is cross-compiled with an incompatible version of Scala (2.9.3).
Since they don't seem to publish their libraries to remote repositories where you could pull down the necessary dependencies, you'll have to declare a source dependency on the project's GitHub repository.
lazy val scaldingCore = ProjectRef(uri("https://github.com/twitter/scalding.git"), "scalding-core")
lazy val myProject = project in file(".") dependsOn scaldingCore
With the above build definition, sbt will git clone the Scalding repository and load its build.
➜ scalding xsbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
Cloning into '/Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding'...
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding/project
[info] Updating {file:/Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding/project/}scalding-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] downloading http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/com.eed3si9n/sbt-assembly/scala_2.10/sbt_0.13/0.10.2/jars/sbt-assembly.jar ...
[info] [SUCCESSFUL ] com.eed3si9n#sbt-assembly;0.10.2!sbt-assembly.jar (3600ms)
[info] Done updating.
[info] Compiling 3 Scala sources to /Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding/project/target/scala-2.10/sbt-0.13/classes...
[warn] there were 8 deprecation warning(s); re-run with -deprecation for details
[warn] there were 2 feature warning(s); re-run with -feature for details
[warn] two warnings found
[info] Set current project to myProject (in build file:/Users/jacek/sandbox/scalding/)
> projects
[info] In file:/Users/jacek/sandbox/scalding/
[info] * myProject
[info] In https://github.com/twitter/scalding.git
[info] maple
[info] scalding
[info] scalding-args
[info] scalding-avro
[info] scalding-commons
[info] scalding-core
[info] scalding-date
[info] scalding-hadoop-test
[info] scalding-jdbc
[info] scalding-json
[info] scalding-parquet
[info] scalding-repl
The build setup should give you access to scalding classes.
> console
[info] Starting scala interpreter...
[info]
Welcome to Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_60).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import com.twitter.scalding._
import com.twitter.scalding._
And the Scan class compiles fine; it's in the src/main/scala directory.
> show sources
[info] ArrayBuffer(/Users/jacek/sandbox/scalding/src/main/scala/Scan.scala)
[success] Total time: 0 s, completed Jul 15, 2014 12:21:14 AM
> compile
[info] Updating {file:/Users/jacek/sandbox/scalding/}myProject...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/jacek/sandbox/scalding/target/scala-2.10/classes...
[success] Total time: 4 s, completed Jul 15, 2014 12:21:20 AM
You could also git clone git@github.com:twitter/scalding.git and run sbt publishLocal to be able to declare a binary dependency in build.sbt as follows:
libraryDependencies := Seq("com.twitter" %% "scalding" % "0.11.1")
With the dependency in place (either way), execute sbt eclipse and be done with it!
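For completeness, the eclipse command comes from the sbteclipse plugin, which would be wired in through project/plugins.sbt with something like the line below (the version is illustrative; use a release matching your sbt version):
// project/plugins.sbt - sbteclipse provides the `eclipse` command; the version is illustrative.
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")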

Using scala-eclipse for spark

Could someone please help me with how to use the Scala Eclipse IDE for Spark?
I came across this link - http://syndeticlogic.net/?p=311 - but I am unable to follow it.
I entered the command mvn -Phadoop2 eclipse:clean eclipse:eclipse inside the Spark directory; after a long list of downloads it gave me some errors. Please help. Thanks.
Below is the error I received:
Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [5:22.386s]
[INFO] Spark Project Core ................................ SUCCESS [17:20.807s]
[INFO] Spark Project Bagel ............................... FAILURE [2.159s]
[INFO] Spark Project GraphX .............................. SKIPPED
[INFO] Spark Project ML Library .......................... SKIPPED
[INFO] Spark Project Streaming ........................... SKIPPED
[INFO] Spark Project Tools ............................... SKIPPED
[INFO] Spark Project Catalyst ............................ SKIPPED
[INFO] Spark Project SQL ................................. SKIPPED
[INFO] Spark Project Hive ................................ SKIPPED
[INFO] Spark Project REPL ................................ SKIPPED
[INFO] Spark Project Assembly ............................ SKIPPED
[INFO] Spark Project External Twitter .................... SKIPPED
[INFO] Spark Project External Kafka ...................... SKIPPED
[INFO] Spark Project External Flume ...................... SKIPPED
[INFO] Spark Project External ZeroMQ ..................... SKIPPED
[INFO] Spark Project External MQTT ....................... SKIPPED
[INFO] Spark Project Examples ............................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:15.115s
[INFO] Finished at: Wed May 07 15:27:51 GMT+05:30 2014
[INFO] Final Memory: 22M/81M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop2" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project spark-bagel_2.10: Failed to resolve dependencies for one or more projects in the reactor. Reason: Missing:
[ERROR] ----------
[ERROR] 1) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] Try downloading the file manually from the project website.
[ERROR]
[ERROR] Then, install it using the command:
[ERROR] mvn install:install-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file
[ERROR]
[ERROR] Alternatively, if you host your own repository you can deploy the file there:
[ERROR] mvn deploy:deploy-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
[ERROR]
[ERROR] Path to dependency:
[ERROR] 1) org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR] 2) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] ----------
[ERROR] 1 required artifact is missing.
[ERROR]
[ERROR] for artifact:
[ERROR] org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] from the specified remote repositories:
[ERROR] maven-repo (http://repo.maven.apache.org/maven2, releases=true, snapshots=false),
[ERROR] apache-repo (https://repository.apache.org/content/repositories/releases, releases=true, snapshots=false),
[ERROR] jboss-repo (https://repository.jboss.org/nexus/content/repositories/releases, releases=true, snapshots=false),
[ERROR] mqtt-repo (https://repo.eclipse.org/content/repositories/paho-releases, releases=true, snapshots=false),
[ERROR] cloudera-repo (https://repository.cloudera.com/artifactory/cloudera-repos, releases=true, snapshots=false),
[ERROR] apache.snapshots (http://repository.apache.org/snapshots, releases=false, snapshots=true),
[ERROR] central (http://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-bagel_2.10
This is because there is no profile called hadoop2 in the pom.xml. The closest matches are hadoop-2.2, hadoop-2.3, etc.
You can run the following:
mvn -Phadoop-2.2 eclipse:clean eclipse:eclipse
or you may run mvn help:all-profiles to list all the profiles and use one of them.
If you want to contribute to the Apache Spark project, then:
Go to the Spark home directory and run sbt/sbt eclipse
In Scala IDE, select File | Import | Existing Projects into Workspace.
Select root directory: MY_SPARK_HOME
Select Search for nested projects
Select the projects that you want
Do not select "Copy projects into workspace".
If you want to use the Spark libraries in an application that you are writing, then
- you can create a jar using the sbt/sbt assembly command and then add that jar as a library to your application project.
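Alternatively (a sketch that is not from the original answer): if you only consume Spark as a library rather than building it from source, you can declare it as an ordinary sbt dependency; the version below is illustrative and should match your cluster.
// build.sbt sketch: depend on a published Spark artifact instead of a locally assembled jar.
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided" // illustrative version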
Also refer to the Eclipse documentation here: https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-Eclipse