Package visible in Scala REPL but not in Eclipse in SBT project? - eclipse

I have the following line in my build.sbt:
libraryDependencies += "org.bouncycastle" % "bcprov-jdk16" % "1.46"
When I go to REPL and launch my project there, the following works:
scala> import org.bouncycastle.jce.provider.BouncyCastleProvider
import org.bouncycastle.jce.provider.BouncyCastleProvider
scala> val a = new BouncyCastleProvider
a: org.bouncycastle.jce.provider.BouncyCastleProvider = BC version 1.46
But when I try to import the same package in Eclipse I get an error:
import org.bouncycastle.jce.provider.BouncyCastleProvider
// object bouncycastle is not a member of package org
Why is this happening?

Have you tried running sbt eclipse? That should generate the Eclipse project files, among them .classpath, which contains the paths to your dependencies.
Unless you use a version of Eclipse with built-in support for sbt dependencies, you'll have to run sbt eclipse every time you change them.
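The sbt eclipse command comes from the sbteclipse plugin, which is wired in through a plugins file. A minimal sketch (the version number here is an assumption; check the plugin's page for the current release):

```
// project/plugins.sbt — version is illustrative, use the latest sbteclipse release
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
```

After adding it, run sbt eclipse and then refresh (or re-import) the project in Eclipse so the regenerated .classpath is picked up.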

Related

NullPointerException on XML.loadFile()

I am trying to load an XML file using scala-xml_2.12-1.0.6.jar, but it gives me a NullPointerException while loading.
Following is my line of code to load xml
import scala.xml.XML
val xml = XML.loadFile("sample.xml")
I have decompiled this jar and the method is present in it, but for some reason it cannot be found from my code.
I have Scala 2.13.1 on my system, but for this project I am using Scala 2.12.1, and it is mentioned in my build.sbt:
scalaVersion := "2.12.1"
I have the following dependency in my build.sbt for this XML package:
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
If I copy and paste the same code into the Scala interactive shell (Scala 2.13.1), I get the following error:
import scala.xml.XML
^
error: object xml is not a member of package scala
did you mean Nil?
Can anyone please identify what I am doing wrong?
Thanks in advance.
I'm not sure how you are loading up the Scala REPL, but as mentioned in "How to use third party libraries with Scala REPL?", you should be launching the REPL from sbt with sbt console. Your .sbt files will also need to be in scope.
The standalone Scala REPL does not read .sbt files at all; they are two different tools.
Alternatively you could also install Ammonite-REPL which supports Magic Imports. You will be able to import Maven dependencies with import $ivy.
Scala 2.12 comes with scala-xml, so you can remove that dependency as long as you run the REPL with sbt console and not your native Scala REPL, which is already on Scala 2.13. Otherwise you can also switch your sbt project to Scala 2.13.
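If you do switch the project to Scala 2.13, note that scala-xml 1.0.6 was not published for 2.13, so the Scala version and the module version have to move together. A sketch (the 1.2.0 version is an assumption; check Maven Central for the releases published for your Scala version):

```
// build.sbt — versions are illustrative
scalaVersion := "2.13.1"

libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.2.0"
```

The %% operator appends the Scala binary version to the artifact name, which is exactly the mismatch the compile error is complaining about.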

import not found error for sbt.build file

I am in the learning process for Scala and sbt. I have an existing project I am starting with.
On the command line under top project folder, as soon as I enter the sbt command I get:
C:\Projects\aproject\build.sbt:1: error: not found: object scalariform
import scalariform.formatter.preferences._
C:\Projects\aProject\build.sbt:2: error: object sbt is not a member of package com.typesafe
import com.typesafe.sbt.SbtScalariform
I can't find an online reference for this specific error. I assume the second error is caused by the first.
The build.sbt file has these three imports:
import scalariform.formatter.preferences._
import com.typesafe.sbt.SbtScalariform
import com.typesafe.sbt.SbtScalariform.ScalariformKeys
I have scala version 2.13.0
and sbt version 1.2.8
Add to project/plugins.sbt
addSbtPlugin("org.scalariform" % "sbt-scalariform" % "1.8.3")
and refresh the project.
https://github.com/sbt/sbt-scalariform
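Once the plugin resolves, the imports in build.sbt compile and you can configure formatting preferences. A minimal sketch for the sbt-scalariform 1.8.x line (the key and preference names are assumptions and may differ between plugin versions):

```
// build.sbt — hypothetical preferences, adjust to taste
import scalariform.formatter.preferences._

scalariformPreferences := scalariformPreferences.value
  .setPreference(AlignSingleLineCaseStatements, true)
  .setPreference(DanglingCloseParenthesis, Preserve)
```

The key point is that addSbtPlugin belongs in project/plugins.sbt, while the preference settings live in build.sbt.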

How to correctly import SBT Build.scala in scala-ide

Since an sbt build can be written in Scala and is itself a Scala project, I would like to import it into Scala IDE as a Scala project, for example with the following code.
Build.scala
import sbt._
import Keys._
object TestBuild extends Build {
  lazy val root = Project(
    id = "test",
    base = file("."),
    settings = Seq(
      organization := "com.tomahna",
      name := "demo",
      scalaVersion := "2.11.8"))
}
plugins.sbt
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
This build works fine with sbt; however, Build.scala is not compiled by Eclipse, so I get neither compilation errors nor auto-completion.
I can add the project folder to the source folders, but then import sbt._ and import Keys._ fail because the Eclipse project is not set up to provide these dependencies.
Is there a way to set up the sbt project so that it interacts nicely with Scala IDE?
From the sbteclipse manual: link
If you want to get Eclipse support for the sbt build definition, e.g. for your Build.scala file, follow these steps:
If you are not using sbteclipse as a global plugin, which is the recommended way, but as a local plugin for your project, you first have to add sbteclipse as a plugin (addSbtPlugin(...)) to the build definition project, i.e. to project/project/plugins.sbt
In the sbt session, execute reload plugins
Set the name of the build definition project to something meaningful: set name := "sbt-build"
Execute eclipse and then reload return
Import the build definition project into Eclipse and add the root directory to the build path
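Concretely, the local-plugin variant of the first step means adding a plugins file one level deeper than usual, so that the build definition project itself can run the eclipse command (version as in the question's plugins.sbt):

```
// project/project/plugins.sbt — makes sbteclipse available to the build definition itself
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
```

Then, in the sbt shell, run reload plugins, set name := "sbt-build", eclipse, and finally reload return, as the steps above describe.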

Can't find Spark libraries when using Eclipse

I used to code Scala in the terminal, but now I'm trying the Scala IDE for Eclipse.
However, I've got one big problem:
error: not found: value sc
I've tried to add those libraries
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
But then it displays:
object apache is not a member of package org
So I don't know what to do....
In my IntelliJ project my build.sbt is pretty empty:
name := "test"
version := "1.0"
scalaVersion := "2.11.7"
Is it an sbt project? Make sure you have the Eclipse plugin for Scala/sbt, and import it as an sbt project.
Also, add the dependency in build.sbt.
Nevertheless, I prefer IntelliJ :)
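As a sketch, a build.sbt that makes the Spark imports resolve might look like this (the Spark version is an assumption chosen to match the question's Scala 2.11.7):

```
name := "test"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
```

Note also that the sc value only exists automatically in spark-shell; in your own code you have to create it yourself, e.g. val sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local[*]")).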

Eclipse Scala IDE code not compiling

I downloaded the Eclipse Scala IDE from the scala-ide.org site and am trying to compile my first Scala word count program. But it gives the error "object not a member of package org" on the following import commands:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
After some research I found that I need to add the jar file spark-assembly-1.0.0-hadoop2.2.0.jar to overcome this issue.
But after a lot of searching I could not locate this jar. Can anyone help here?
Install SBT, the Scala build and dependency tool.
Create an empty directory. Name it spark-test or whatever you want to name your project.
Put your source code in the sub-directory src/main/scala. If you have Main.scala in package scalatest, it should be src/main/scala/scalatest/Main.scala
Make a build.sbt file with the following contents
name := """sparktest"""
version := "1.0-SNAPSHOT"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "1.4.0"
)
Configure the SBT Eclipse plugin. Create ~/.sbt/0.13/plugins/plugins.sbt, with:
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
Generate an Eclipse project with sbt eclipse
Import your project into eclipse.
Run :)
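A hypothetical minimal Main.scala matching the layout above, just to verify that the setup compiles and runs (the package declaration from the step above is omitted here for brevity):

```scala
// src/main/scala/scalatest/Main.scala — minimal smoke test for the sbt setup
object Main {
  // Kept as a value so it can be checked independently of the side effect
  def greeting: String = "hello from sbt"

  def main(args: Array[String]): Unit =
    println(greeting)
}
```

Running sbt run should print the greeting; once that works, swap in the Spark code.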
Scala is not a simple language/environment to learn. It is important that you learn how Scala works and then move on to Spark.
There are tons of material available on the web. A proper learning path is:
SBT > Scala > use Scala for Spark
The dependency you mentioned can be put in sbt's build.sbt. You can also use Maven, but I recommend learning sbt as a way of learning Scala. Once you have resolved the dependency using SBT, your simple code should work fine. Still, I recommend doing a "hello world" first rather than a "word count" :-)
And to answer your question, in your sbt build you should add the following library:
libraryDependencies += "org.apache.spark" % "spark-assembly_2.10" % "1.1.1"
This is spark-assembly 1.1.1 built for Scala 2.10. If you need a different version, you can find the proper one at
Maven Repo details for spark/hadoop
Here's the pure Eclipse solution (I had to download and set up Eclipse just to answer this question):
Get Scala IDE (it comes with inbuilt Scala Compiler version 2.10.5 and 2.11.6)
Create a new project Scala Wizard > Scala Project
Right click "src" in the Scala project, select "Scala Object", and give it a name - I used WordCount
Right click project > Configure > Convert to Maven Project
In the body of the word count object (I named the object WordCount) paste the text from the Apache Spark example, which finally looks like:
```
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  val sparkConf = new SparkConf().setAppName("SampleTest")
  val spark = new SparkContext(sparkConf)
  val textFile = spark.textFile("hdfs://...")
  val counts = textFile.flatMap(line => line.split(" "))
    .map(word => (word, 1))
    .reduceByKey(_ + _)
  counts.saveAsTextFile("hdfs://...")
}
```
6. Add the following to your Maven pom.xml:
```
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
```
Right click on "Scala Library Container", select "Latest 2.10 bundle", and click OK.
Once I was done with these steps, no error message was displayed in Eclipse's "Problems" list, indicating that it compiled as expected.
Obviously, this example won't run as-is, since I haven't provided enough information for it to run, but this was just to answer the question of how to get the code compiled.
Hope this helps.
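The word count logic itself is worth understanding separately from Spark. A hypothetical local analogue using plain Scala collections, whose flatMap/map/sum-per-key steps mirror the RDD pipeline in the example above:

```scala
object LocalWordCount {
  // Same shape as the Spark pipeline: split into words, pair each with 1,
  // then sum the counts per distinct word.
  def count(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))
      .filter(_.nonEmpty)
      .map(word => (word, 1))
      .groupBy(_._1)
      .map { case (word, pairs) => word -> pairs.map(_._2).sum }
}
```

Here groupBy plus the per-key sum plays the role of Spark's reduceByKey(_ + _), but everything runs in one JVM with no cluster or hdfs:// paths involved.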