Can't find Spark libraries when using Eclipse - eclipse

I used to write Scala in the terminal, but now I'm trying the Scala IDE for Eclipse.
However, I've got one big problem:
error: not found: value sc
I've tried to add these imports:
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
But then it displays:
object apache is not a member of package org
So I don't know what to do....
In my IntelliJ project my build.sbt is pretty empty:
name := "test"
version := "1.0"
scalaVersion := "2.11.7"

Is it an sbt project? Make sure you have the Eclipse plugin for Scala/sbt, and import it as an sbt project.
Also, add the Spark dependency to your build.sbt.
Nevertheless, I prefer IntelliJ :)
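For example, a minimal build.sbt declaring the Spark dependency might look like the sketch below; the Spark and Scala versions here are assumptions, so match them to your installation (the blank lines between settings are required by older sbt versions):

```
name := "test"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
```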

Related

How to correctly import SBT Build.scala in scala-ide

As an sbt build can be written in Scala and is itself a Scala project, I would like to import it into Scala IDE as a Scala project, for example with the following code.
Build.scala
import sbt._
import Keys._
object TestBuild extends Build {
  lazy val root = Project(
    id = "test",
    base = file("."),
    settings = Seq(
      organization := "com.tomahna",
      name := "demo",
      scalaVersion := "2.11.8"))
}
plugins.sbt
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
This build works fine with sbt; however, Build.scala is not compiled by Eclipse, so I get neither compilation errors nor auto-completion.
I can add the project folder to the source folders, but then import sbt._ and import Keys._ will fail, because the Eclipse project is not set up to provide these dependencies.
Is there a way to set up the sbt project so that it interacts nicely with Scala IDE?
From the sbteclipse manual:
If you want to get Eclipse support for the sbt build definition, e.g. for your Build.scala file, follow these steps:
If you are not using sbteclipse as a global plugin (which is the recommended way) but as a local plugin for your project, you first have to add sbteclipse as a plugin (addSbtPlugin(...)) to the build definition project, i.e. to project/project/plugins.sbt
In the sbt session, execute reload plugins
Set the name of the build definition project to something meaningful: set name := "sbt-build"
Execute eclipse and then reload return
Import the build definition project into Eclipse and add the root directory to the build path
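Put together, the sbt session for those steps looks roughly like this (sbt-build is just the name the manual suggests):

```
> reload plugins
> set name := "sbt-build"
> eclipse
> reload return
```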

Eclipse Scala IDE code not compiling

I downloaded the Eclipse Scala IDE from the scala-ide.org site and am trying to compile my first Scala word-count program. But it gives the error "object apache is not a member of package org" on the following imports:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
After some research I found that I need to add the jar file spark-assembly-1.0.0-hadoop2.2.0.jar to overcome this issue.
But after doing a lot of research I could not locate this jar. Can anyone help here?
Install the SBT Scala build+dependency tool
Create an empty directory. Name it spark-test or whatever you want to name your project.
Put your source code in the sub-directory src/main/scala. If you have Main.scala in package scalatest, it should be src/main/scala/scalatest/Main.scala
Make a build.sbt file with the following contents
name := """sparktest"""
version := "1.0-SNAPSHOT"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0"
)
Configure the SBT Eclipse plugin. Create ~/.sbt/0.13/plugins/plugins.sbt, with:
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
Generate an Eclipse project with sbt eclipse
Import your project into eclipse.
Run :)
Scala is not a simple language/env to learn. It is important you learn how scala works and then move into spark.
There are tons of material available on web. A proper learning path will be to learn
SBT > SCALA > Use Scala for Spark
The dependency that you mentioned can be put in sbt's build.sbt. You can also use Maven, but I recommend learning sbt as a way of learning Scala. Once you have resolved the dependency using sbt, your simple code should work fine. Still, I recommend doing a "hello world" before doing a "word count" :-)
And to answer your question: in your sbt build you should be adding the following library,
libraryDependencies += "org.apache.spark" % "spark-assembly_2.10" % "1.1.1"
This is the Spark assembly 1.1.1 artifact built for Scala 2.10. If you need a different version, you can find the proper one at
Maven Repo details for spark/hadoop
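Before reaching for the Spark API at all, the same flatMap/map/reduce shape can be tried on plain Scala collections; this is only a local sketch to check the word-count logic (the object name LocalWordCount is made up for illustration), not Spark code:

```scala
// Local word count on plain Scala collections; no Spark required.
object LocalWordCount {
  def count(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))                       // split each line into words
      .map(word => (word, 1))                      // pair each word with a count of 1
      .groupBy(_._1)                               // group the pairs by word
      .map { case (w, ps) => (w, ps.map(_._2).sum) } // sum the counts per word
}
```

Once this looks right, the Spark version is the same pipeline with groupBy/sum replaced by reduceByKey.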
Here's the pure Eclipse solution (I had to download and set up Eclipse just to answer this question):
Get the Scala IDE (it comes with built-in Scala compiler versions 2.10.5 and 2.11.6)
Create a new project via Scala Wizard > Scala Project
Right-click "src" in the Scala project and select "Scala Object"; give it a name - I chose WordCount
Right-click the project > Configure > Convert to Maven Project
In the body of the object (I named it WordCount), paste the text from the Apache Spark example, which finally looks like
```
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object WordCount {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("SampleTest")
    val spark = new SparkContext(sparkConf)
    val textFile = spark.textFile("hdfs://...")
    val counts = textFile.flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.saveAsTextFile("hdfs://...")
  }
}
```
Add the following dependency to your pom.xml:
```
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
```
Right-click "Scala Library Container", select "Latest 2.10 bundle", and click OK.
Once I was done with these steps, no error message was displayed in Eclipse's "Problems" view, indicating that the project compiled as expected.
Obviously, this example won't run, as I haven't provided enough information for it to run... but this was just to answer the question of how to get the code compiled.
Hope this helps.

Using a custom sbt plugin

I've created a new scala project and written an AutoPlugin underneath it in a src/main/scala/com/company/plugin directory and a corresponding namespace. The plugin code is a cut and paste of HelloPlugin3 from [1], however I have changed the names.
Then, in a second sbt project I've updated the project/plugins.sbt file to include my new Hello World plugin. This second project has other 'business code' in. When I run sbt in that second project, the plugin is resolved and I've tested that by deleting the jar from underneath the ~/.ivy/local/... and then reloading the project and witnessing sbt complain that it can't find the plugin. When I publishLocal my plugin project again, that error goes away.
So I'm happy that the plugin is resolved and the jar file is not empty because I have checked its contents.
However, when I do an sbt> about my custom plugin isn't listed and the command I was expecting to be available isn't. ("[error] Not a valid command: hello"). But the other plugin I list in plugins.sbt (io.spray sbt-revolver) does appear in the output.
Both the plugin project and the second project have scalaVersion := "2.10.3" specified in their build.sbt files.
I'm using sbt 0.13.6. Interestingly, and perhaps relatedly, the sbt command plugins is apparently not valid in this project either, although it works just fine in the plugin project.
What extra step am I missing to make the command available to my second project? How do I check to see if I've got some particularly messed up sbt config happening?
For convenience, the plugin code is below, but as mentioned, it's a copy from the link underneath it.
package com.company.plugin

import sbt._
import Keys._

object HelloPlugin extends AutoPlugin {
  object autoImport {
    val greeting = settingKey[String]("greeting")
  }
  import autoImport._

  override def trigger = allRequirements

  override lazy val buildSettings = Seq(
    greeting := "Hi",
    commands += helloCommand)

  lazy val helloCommand =
    Command.command("hello") { (state: State) =>
      println("fred")
      state
    }
}
Edit:
The build.sbt for the plugin project is as follows:
sbtPlugin := true
scalaVersion := "2.10.3"
organization := "com.company"
name := "name"
version := "0.0.1-SNAPSHOT"
The only other file I've created in this project is the .scala file for the plugin itself.
[1] http://www.scala-sbt.org/release/docs/Plugins.html
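For reference, the second project pulls the plugin in through its project/plugins.sbt, using the coordinates from the build.sbt above:

```
addSbtPlugin("com.company" % "name" % "0.0.1-SNAPSHOT")
```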
The problem ended up being with the project/build.properties of the project trying to use the plugin. This file set the sbt version to 0.13.1, which for some reason caused both my plugin and the sbt plugins command to stop working.
Changing the value to 0.13.6 made all the problems go away.
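In other words, the fix was a one-line change in project/build.properties:

```
sbt.version=0.13.6
```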

Playframework samples didn't build

On OS X Mavericks, I have the following issue:
I cloned the Play Framework repository and wanted to build the samples. Unfortunately it doesn't work. I have Play on my path, play-2.2.3.
I tried importing it into IntelliJ IDEA, but got the same error.
localhost:helloworld radimpavlicek$ play
[info] Loading project definition from /Users/radimpavlicek/Documents/playframework/samples/scala/helloworld/project
/Users/radimpavlicek/Documents/playframework/samples/scala/helloworld/build.sbt:5: error: not found: value PlayScala
lazy val root = (project in file(".")).enablePlugins(PlayScala)
^
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
I just had the same problem on Ubuntu 14.04 trying to build the zentask sample app. I eliminated the offending line in build.sbt, added play.Project.playScalaSettings at the end, and was able to compile and run from the Play console. This is my current zentask build.sbt (the empty lines are needed):
name := "zentask"

version := "1.0"

libraryDependencies ++= Seq(jdbc, anorm)

scalaVersion := Option(System.getProperty("scala.version")).getOrElse("2.10.4")

play.Project.playScalaSettings
The application build has changed from Play 2.2.x to 2.3.x, and your build.sbt appears to be in the 2.3 format. If you've checked out the helloworld project from GitHub, make sure you're on the 2.2.x branch, or otherwise upgrade Play to 2.3.0-RC1 (the latest as of this writing). In 2.2, the build.sbt for the helloworld sample consists of this in its entirety:
import play.Project._

name := "helloworld"

version := "1.0"

playScalaSettings
enablePlugins isn't available in sbt.Project until sbt 0.13.5.
You can update your sbt version by following this link: http://www.scala-sbt.org/download.html. Instead of install, use update/upgrade (depending on your package manager).

Package visible in Scala REPL but not in Eclipse in SBT project?

I have the following line in my build.sbt:
libraryDependencies += "org.bouncycastle" % "bcprov-jdk16" % "1.46"
When I go to REPL and launch my project there, the following works:
scala> import org.bouncycastle.jce.provider.BouncyCastleProvider
import org.bouncycastle.jce.provider.BouncyCastleProvider
scala> val a = new BouncyCastleProvider
a: org.bouncycastle.jce.provider.BouncyCastleProvider = BC version 1.46
But when I try to import the same package in Eclipse I get an error:
import org.bouncycastle.jce.provider.BouncyCastleProvider
// object bouncycastle is not a member of package org
Why is this happening?
Have you tried running sbt eclipse? That should create the Eclipse project files, .classpath among them, which contain the paths to your dependencies.
Unless you use a version of Eclipse with support for sbt-managed dependencies, you'll have to execute sbt eclipse every time you change them.