Build.scala:1: not found: object sbt - scala

I am using sbt version 1.0
$ sbt version
[info] Loading project definition from /Users/harit/code/learningScala/project
[info] Set current project to learningScala (in build file:/Users/harit/code/learningScala/)
[info] 1.0
I am using IntelliJ IDEA v14.1.3 for my project, and the structure looks like this:
As you can see, the project was not able to resolve Build. When I try to run sbt from the command line, I see:
$ sbt
[info] Loading project definition from /Users/harit/code/learningScala/project
[info] Set current project to learningScala (in build file:/Users/harit/code/learningScala/)
> compile
[info] Updating {file:/Users/harit/code/learningScala/}learningscala...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/harit/code/learningScala/target/scala-2.11/classes...
[error] /Users/harit/code/learningScala/Build.scala:1: not found: object sbt
[error] import sbt.Build
[error] ^
[error] /Users/harit/code/learningScala/Build.scala:3: not found: type Build
[error] object MyBuild extends Build {
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed May 28, 2015 8:10:37 PM
>
I am very new to Scala and sbt, so I have no idea what is going wrong here.

The MyBuild.scala file was at the root of the project; it should be inside the project folder. I made that change and now it works. Thanks to tpolecat on IRC, who helped me with this.
> compile
[success] Total time: 0 s, completed May 28, 2015 8:20:57 PM
> compile
[info] Updating {file:/Users/harit/code/learningScala/}learningscala...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[success] Total time: 0 s, completed May 28, 2015 8:21:22 PM
>
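For reference, a minimal sketch of what the relocated project/Build.scala can look like, assuming an sbt 0.13.x build (the Build trait this file relies on was deprecated in 0.13.12 and removed in sbt 1.x; the "[info] 1.0" printed by sbt version above is the project's own version setting, not the sbt version):
import sbt._
import Keys._

object MyBuild extends Build {
  // the root project; base = file(".") points at the build's root directory
  lazy val root = Project(id = "learningScala", base = file("."))
}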

I have a multi-project sbt build, and two of the projects were declaring two of the same vals in their build.sbt files. I moved the duplicated vals into the root project's Build.scala as defs, and the error stopped. Found the answer here: https://github.com/sbt/sbt/issues/1465
For example, val samza_gid = "org.apache.samza" in a build.sbt file became
def samza_gid = "org.apache.samza" in the Build.scala file.

Related

Facing issue with sbt dependency

I am trying to use the phoenix-spark jar to load a Phoenix table into a Spark 2.2.3 DataFrame by adding this dependency:
libraryDependencies += "org.apache.phoenix" % "phoenix-spark2" % "4.7.0.2.6.5.1102-5"
I tested these two resolvers, one at a time:
resolvers += "Hortonworks Repository" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/groups/public/"
I get the following error:
[info] welcome to sbt 1.3.13 (Oracle Corporation Java 1.8.0_261)
[info] loading project definition from /home/ambac61n/IdeaProjects/phoenix_test/project
[info] loading settings for project phoenix_test from build.sbt ...
[info] set current project to phoenix_test (in build file:/home/my_user/IdeaProjects/phoenix_test/)
[info] sbt server started at local:///home/ambac61n/.sbt/1.0/server/0c2856c06fe3f2cf2706/sock
sbt:phoenix_test>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to phoenix_test (in build file:/home/ambac61n/IdeaProjects/phoenix_test/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /home/my_user/.local/share/JetBrains/IdeaIC2020.2/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to phoenix_test (in build file:/home/my_user/IdeaProjects/phoenix_test/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.phoenix:phoenix-spark2:4.7.0.2.6.5.1102-5
[error] Not found
[error] Not found
[error] not found: /home/ambac61n/.ivy2/local/org.apache.phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] not found: http://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.phoenix:phoenix-spark2:4.7.0.2.6.5.1102-5
[error] Not found
[error] Not found
[error] not found: /home/ambac61n/.ivy2/local/org.apache.phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] not found: http://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/4.7.0.2.6.5.1102-5/phoenix-spark2-4.7.0.2.6.5.1102-5.pom
[error] Total time: 3 s, completed 22 août 2020 05:56:14
[info] shutting down sbt server
Do you have any idea?
After visiting those repositories, I noticed that there is indeed no such package.
For the first repository,
https://repo1.maven.org/maven2/org/apache/phoenix/
there is no phoenix-spark2 package,
and for the second repository,
https://repo.hortonworks.com/content/repositories/releases/org/apache/phoenix/phoenix-spark2/
there is no 4.7.0.2.6.5.1102-5 version.
Try one of the other versions.
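For completeness, the relevant build.sbt lines would look something like the sketch below; the version string is a placeholder that has to be replaced with one that actually appears in the repository listing:
// keep the Hortonworks resolver, then pick a phoenix-spark2 version that is
// really listed under the repository path above (the one below is only a placeholder)
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/groups/public/"

libraryDependencies += "org.apache.phoenix" % "phoenix-spark2" % "<version-from-the-repo-listing>"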

How to run an existing Scala project using VS Code and Metals?

I am brand new to Scala, and I find that Scala IDE is very slow on my machine for basic things like searching the codebase and editing code. I am used to Visual Studio Code and was very happy to find the Metals extension.
I was able to "import build" and fix issues like bumping up the Scala version in my projects, but I am not sure how to reproduce, in VS Code, the step of setting up a run configuration and actually launching our app the way I do in Scala IDE.
We have a parent folder which has a bunch of projects and a 'consoleapp' project which is the main entry point of our app - it imports the logic/routes of all other projects.
|____parent
| |____consoleapp
| |____project1
| |____project2
I tried sbt run and sbt runMain consoleapp from within the consoleapp folder and also the parent folder but they didn't work.
I am not sure what other information from our setup is relevant - happy to provide more info as needed.
Updated to add more details below:
consoleapp/build.sbt
name := "consoleapp"
version := "1.0"
scalaVersion := "2.12.10"
packMain := Map("consoleapp" -> "consoleapp")
libraryDependencies ++= Seq (...)
Output of commands I ran - sbt run and sbt runMain
Running from ~/scala/parent
> sbt run
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/project
[info] Loading settings for project consoleapp from build.sbt ...
...
Loading settings for all other projects in parent folder
...
[info] Loading settings for project parent from build.sbt ...
[info] Resolving key references (22435 settings) ...
[info] Set current project to parent (in build file:/Users/pradhyo/scala/parent/)
[error] java.lang.RuntimeException: No main class detected.
[error] at scala.sys.package$.error(package.scala:30)
[error] stack trace is suppressed; run last Compile / bgRun for the full output
[error] (Compile / bgRun) No main class detected.
[error] Total time: 1 s, completed 18-Dec-2019 1:41:25 PM
Running from ~/scala/parent
> sbt "runMain consoleapp.consoleapp" masterstate [0a8dab85] modified
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/project
[info] Loading settings for project consoleapp from build.sbt ...
...
Loading settings for all other projects in parent folder
...
[info] Loading settings for project parent from build.sbt ...
[info] Resolving key references (22435 settings) ...
[info] Set current project to parent (in build file:/Users/pradhyo/scala/parent/)
[info] running consoleapp.consoleapp
[error] (run-main-0) java.lang.ClassNotFoundException: consoleapp.consoleapp
[error] java.lang.ClassNotFoundException: consoleapp.consoleapp
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
[error] stack trace is suppressed; run last Compile / bgRunMain for the full output
[error] Nonzero exit code: 1
[error] (Compile / runMain) Nonzero exit code: 1
[error] Total time: 0 s, completed 18-Dec-2019 1:46:21 PM
Running from ~/scala/parent/consoleapp
> sbt run
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/consoleapp/project
[info] Loading settings for project consoleapp from build.sbt ...
[info] Set current project to consoleapp (in build file:/Users/pradhyo/scala/parent/consoleapp/)
[error] java.lang.RuntimeException: No main class detected.
[error] at scala.sys.package$.error(package.scala:30)
[error] stack trace is suppressed; run last Compile / bgRun for the full output
[error] (Compile / bgRun) No main class detected.
[error] Total time: 0 s, completed 18-Dec-2019 1:49:26 PM
Running from ~/scala/parent/consoleapp
> sbt "runMain consoleapp" masterstate [0a8dab85] modified
[info] Loading settings for project global-plugins from metals.sbt,build.sbt ...
[info] Loading global plugins from /Users/pradhyo/.sbt/1.0/plugins
[info] Loading project definition from /Users/pradhyo/scala/parent/consoleapp/project
[info] Loading settings for project consoleapp from build.sbt ...
[info] Set current project to consoleapp (in build file:/Users/pradhyo/scala/parent/consoleapp/)
[info] running consoleapp
[error] (run-main-0) java.lang.ClassNotFoundException: consoleapp
[error] java.lang.ClassNotFoundException: consoleapp
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
[error] stack trace is suppressed; run last Compile / bgRunMain for the full output
[error] Nonzero exit code: 1
[error] (Compile / runMain) Nonzero exit code: 1
[error] Total time: 1 s, completed 18-Dec-2019 1:50:06 PM
After following the instructions in the Scala Metals VS Code README, I use a launch configuration similar to the one below to replicate the Eclipse run configuration shown in the question's screenshots.
.vscode/launch.json
{
  // Use IntelliSense to learn about possible attributes.
  // Hover to view descriptions of existing attributes.
  // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
  "version": "0.2.0",
  "configurations": [
    {
      "type": "scala",
      "name": "Debug consoleapp",
      "request": "launch",
      "mainClass": "consoleapp",
      "buildTarget": "consoleapp",
      "args": [],
      "jvmOptions": ["-J-Dconfig.file=/path/to/config/file"]
    }
  ]
}
I had trouble passing the config file for pureconfig correctly. Here's the GitHub issue with the correct jvmOptions line.
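For running outside the debugger, a rough sbt equivalent from the parent folder would be to scope the task to the consoleapp sub-project (this assumes, as the launch configuration above does, that the main class is simply consoleapp in the default package):
> sbt "consoleapp/runMain consoleapp"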
Metals supports these Scala versions: 2.13.0, 2.13.1, 2.12.8, 2.12.9,
2.12.10, 2.12.7 and 2.11.12
This is something I noticed in their documentation (doc), and it could explain both your problem and mine, which I raised in the comments.
Your scalaVersion := "2.12.3" is not in that list.

Scala SBT compile failing

I am trying to follow the code from the link below:
http://spark.apache.org/docs/latest/quick-start.html
But when I try to create the package it is failing. I want to know two things:
obviously, why it is failing
why it is showing an older version of Scala, when I specified 2.11
Below is the error message.
[info] Set current project to default-0464ce (in build file:/home/ubuntu/simple_sbt/)
[info] Updating {file:/home/ubuntu/simple_sbt/}default-0464ce...
[info] Resolving org.scala-lang#scala-library;2.9.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/ubuntu/simple_sbt/target/scala-2.9.1/classes...
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:1: object apache is not a member of package org
[error] import org.apache.spark.SparkContext
[error] ^
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:2: object apache is not a member of package org
[error] import org.apache.spark.SparkContext._
[error] ^
[error] two errors found
[error] {file:/home/ubuntu/simple_sbt/}default-0464ce/compile:compile: Compilation failed
[error] Total time: 2 s, completed Aug 30, 2016 3:19:18 AM
When you run sbt package, it sometimes fails because the dependencies needed by the imported files have not been downloaded and resolved yet.
Try running sbt run first and then sbt package; sbt run should bring in all the dependencies, on top of which packaging and compiling will be possible.
If the above does not solve your problem, you need to share your sbt build file and the environment that you are using. The directory from which you run these commands also plays a role.
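As for the second question: the default project name (default-0464ce) and the scala-2.9.1 path in the log suggest sbt never found a build definition and fell back to its default Scala version. A minimal build.sbt in the project root along these lines should pin Scala 2.11 and pull in spark-core (the versions below are only examples in the spirit of the quick-start guide, not a guaranteed match for your setup; the blank lines between settings matter on older sbt releases):
name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"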

Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2

While compiling the Maven project, the following error occurred:
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.992s
[INFO] Finished at: Fri Apr 15 17:44:33 CEST 2016
[INFO] Final Memory: 25M/350M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed ->
[Help 1]
[ERROR]
I removed the property <useZincServer>true</useZincServer> from pom.xml, and still the Logging error persists.
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 ---
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.814s
[INFO] Finished at: Fri Apr 15 17:41:00 CEST 2016
[INFO] Final Memory: 25M/335M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed ->
[Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
I checked that PATH and JAVA_HOME are defined in ~/.bashrc as follows:
export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
The only issue that I noticed is that echo $JAVA_HOME gives an empty output, though I did source ~/.bashrc.
Any help is highly appreciated.
The problem could be this [INFO] Using incremental compilation
In your pom.xml try to remove the line <recompileMode>incremental</recompileMode>
and then try again.
It is strange that echo $JAVA_HOME gives empty output. While compiling the Spark source, I imported the project (which had built successfully with mvn clean package) into Eclipse and hit the same problem. I found the solution here:
How to solve “Plugin execution not covered by lifecycle configuration” for Spring Data Maven Builds
I think you were compiling Spark with Scala 2.10. If so, you should do as follows.
cd /path/to/Spark
./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package
Hope this helps.

How to declare dependency on Scalding in sbt project?

I am trying to figure out how to create a build.sbt file for my own Scalding-based project.
The Scalding source tree has no build.sbt file; instead it has a project/Build.scala build definition.
What would be the right way to integrate my own sbt project with Scalding, so I could also import it later in Eclipse with sbt-eclipse plugin?
Update:
For the following code:
import cascading.tuple.Fields
import com.twitter.scalding._

class Scan(args: Args) extends Job(args) {
  val output = TextLine("tmp/out.txt")
  val wordsList = List(
    ("john"),
    ("liza"),
    ("nina"),
    ("x"))
  val orderedPipe =
    IterableSource[(String)](wordsList, ('word))
      .debug
      .write(output)
}
With this build.sbt:
name := "Scan"
version := "1.0"
libraryDependencies := Seq("com.twitter" %% "scalding" % "0.11.1")
I get errors:
$ sbt
[info] Loading global plugins from /home/test/.sbt/0.13/plugins
[info] Set current project to Scan (in build file:/home/test/Cascading/Scala/Scan/)
> compile
[info] Updating {file:/home/test/Cascading/Scala/Scan/}scan...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] downloading http://repo1.maven.org/maven2/com/twitter/scalding_2.10/0.11.1/scalding_2.10-0.11.1.jar ...
[info] [SUCCESSFUL ] com.twitter#scalding_2.10;0.11.1!scalding_2.10.jar (641ms)
[info] Done updating.
[info] Compiling 1 Scala source to /home/test/Cascading/Scala/Scan/target/scala-2.10/classes...
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:1: not found: object cascading
[error] import cascading.tuple.Fields
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:2: object twitter is not a member of package com
[error] import com.twitter.scalding._
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:5: not found: type Job
[error] class Scan(args: Args) extends Job(args) {
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:5: not found: type Args
[error] class Scan(args: Args) extends Job(args) {
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:5: too many arguments for constructor Object: ()Object
[error] class Scan(args: Args) extends Job(args) {
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:6: not found: value TextLine
[error] val output = TextLine("tmp/out.txt")
[error] ^
[error] /home/test/Cascading/Scala/Scan/src/main/scala/Scan.scala:15: not found: value IterableSource
[error] IterableSource[(String)](wordsList, ('word))
[error] ^
[error] 7 errors found
[error] (compile:compile) Compilation failed
Update 2
After cloning their repository (git clone git@github.com:twitter/scalding.git) and running sbt publishLocal, I still get the same compilation errors.
BUT adding the two lines that you suggested to build.sbt allowed me to compile my code. So the following build.sbt really works, thanks!
name := "BlockScan"
version := "1.0"
libraryDependencies := Seq("com.twitter" %% "scalding" % "0.11.1")
lazy val scaldingCore = ProjectRef(uri("https://github.com/twitter/scalding.git"), "scalding-core")
lazy val myProject = project in file(".") dependsOn scaldingCore
'sbt eclipse' creates an Eclipse project which does not compile under Eclipse and reports these errors:
Project 'Scan' is missing required Java project: 'scalding-core'
More than one scala library found in the build path (/home/test/usr/eclipse-scala-3.0.3/configuration/org.eclipse.osgi/bundles/290/1/.cp/lib/scala-library.jar, /home/test/wks/Cascading/Scala/scalding/target/scala-2.9.3/scalding-assembly-0.10.0.jar).At least one has an incompatible version. Please update the project build path so it contains only compatible scala libraries.
scalacheck_2.9.3-1.10.0.jar is cross-compiled with an incompatible version of Scala (2.9.3).
specs_2.9.3-1.6.9.jar is cross-compiled with an incompatible version of Scala (2.9.3).
Since they don't seem to publish their libraries to remote repositories where you could pull down the necessary dependencies, you'll have to declare the source dependency on the GitHub repository for the project.
lazy val scaldingCore = ProjectRef(uri("https://github.com/twitter/scalding.git"), "scalding-core")
lazy val myProject = project in file(".") dependsOn scaldingCore
With the above build definition, sbt will git clone the RootProject and load the build.
➜ scalding xsbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
Cloning into '/Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding'...
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding/project
[info] Updating {file:/Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding/project/}scalding-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] downloading http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/com.eed3si9n/sbt-assembly/scala_2.10/sbt_0.13/0.10.2/jars/sbt-assembly.jar ...
[info] [SUCCESSFUL ] com.eed3si9n#sbt-assembly;0.10.2!sbt-assembly.jar (3600ms)
[info] Done updating.
[info] Compiling 3 Scala sources to /Users/jacek/.sbt/0.13/staging/e1da2accb95841ffb1df/scalding/project/target/scala-2.10/sbt-0.13/classes...
[warn] there were 8 deprecation warning(s); re-run with -deprecation for details
[warn] there were 2 feature warning(s); re-run with -feature for details
[warn] two warnings found
[info] Set current project to myProject (in build file:/Users/jacek/sandbox/scalding/)
> projects
[info] In file:/Users/jacek/sandbox/scalding/
[info] * myProject
[info] In https://github.com/twitter/scalding.git
[info] maple
[info] scalding
[info] scalding-args
[info] scalding-avro
[info] scalding-commons
[info] scalding-core
[info] scalding-date
[info] scalding-hadoop-test
[info] scalding-jdbc
[info] scalding-json
[info] scalding-parquet
[info] scalding-repl
The build setup should give you access to scalding classes.
> console
[info] Starting scala interpreter...
[info]
Welcome to Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_60).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import com.twitter.scalding._
import com.twitter.scalding._
And the Scan class compiles fine - it's in src/main/scala directory.
> show sources
[info] ArrayBuffer(/Users/jacek/sandbox/scalding/src/main/scala/Scan.scala)
[success] Total time: 0 s, completed Jul 15, 2014 12:21:14 AM
> compile
[info] Updating {file:/Users/jacek/sandbox/scalding/}myProject...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/jacek/sandbox/scalding/target/scala-2.10/classes...
[success] Total time: 4 s, completed Jul 15, 2014 12:21:20 AM
You could also git clone their repository (git@github.com:twitter/scalding.git) and run sbt publishLocal, to be able to declare a binary dependency in build.sbt as follows:
libraryDependencies := Seq("com.twitter" %% "scalding" % "0.11.1")
With the dependency in place (either way), execute sbt eclipse and be done with it!
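Putting the pieces together, a complete build.sbt for the source-dependency route could look like the sketch below (the settings mirror the snippets above; scalaVersion matches the Scala version shown in the console transcript):
name := "Scan"

version := "1.0"

scalaVersion := "2.10.4"

// source dependency: sbt clones the Scalding repository and builds scalding-core itself
lazy val scaldingCore = ProjectRef(uri("https://github.com/twitter/scalding.git"), "scalding-core")

lazy val myProject = project in file(".") dependsOn scaldingCore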