How to add Java dependencies to a Scala project's sbt file - scala

I have a Spark Streaming Scala project which uses the Apache NiFi receiver. The project runs fine under Eclipse/Scala IDE, and now I want to package it for deployment.
When I add it as
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
sbt assumes it's a Scala library and tries to resolve it.
How do I add the NiFi receiver and all its dependencies to the project's sbt file?
Also, is it possible to point dependencies to local directories instead of having sbt resolve them?
Thanks in advance.
Here are the contents of my sbt file:
name := "NiFi Spark Test"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"

libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
The double %% adds the Scala version as a suffix to the Maven artifact name. It is required because different Scala compiler versions produce incompatible bytecode. If you would like to use a Java library from Maven, then you should use a single % character:
libraryDependencies += "org.apache.nifi" % "nifi-spark-receiver" % "0.3.0"

I also found that I can put the libraries the project depends on into the lib folder, and they will be picked up during assembly.
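For reference, this works because sbt treats jars under lib/ as unmanaged dependencies and adds them to the classpath automatically. A minimal sketch for pointing at a different directory instead (the custom_lib path is purely illustrative):
// In build.sbt: override where sbt looks for unmanaged jars
unmanagedBase := baseDirectory.value / "custom_lib"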

Related

Scala Play HTTP and gRPC

I have an HTTP backend with Scala Play, which works fine.
Now I want to set up a gRPC API on top of it (in theory this should work).
To set up gRPC I basically followed the akka-quickstart.
I can run sbt compile and get my generated Scala classes in the target/../ directory.
But when I try sbt run I get:
--- (Running the application, auto-reloading is enabled) ---
[warn] a.u.ManifestInfo - You are using version 2.6.5 of Akka, but it appears you (perhaps indirectly) also depend on older versions of related artifacts. You can solve this by adding an explicit dependency on version 2.6.5 of the [akka-discovery] artifacts to your project. See also: https://doc.akka.io/docs/akka/current/common/binary-compatibility-rules.html#mixed-versioning-is-not-allowed
[error] java.lang.IllegalStateException: You are using version 2.6.5 of Akka, but it appears you (perhaps indirectly) also depend on older versions of related artifacts. You can solve this by adding an explicit dependency on version 2.6.5 of the [akka-discovery] artifacts to your project. See also: https://doc.akka.io/docs/akka/current/common/binary-compatibility-rules.html#mixed-versioning-is-not-allowed
So I understand that some libs I use are too old for Akka 2.6.5, but I don't understand how to set my service to a lower Akka version.
My plugins.sbt:
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.8.2")
addSbtPlugin("org.foundweekends.giter8" % "sbt-giter8-scaffold" % "0.11.0")
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.0.0-M1")
resolvers += Resolver.bintrayRepo("playframework", "maven")
libraryDependencies += "com.lightbend.play" %% "play-grpc-generators" % "0.8.2"
My build.sbt:
name := "smartmarkt"
version := "1.0-SNAPSHOT"
scalaVersion := "2.13.2"
lazy val root = (project in file("."))
.enablePlugins(PlayScala, PlayAkkaHttp2Support, AkkaGrpcPlugin)
import play.grpc.gen.scaladsl.PlayScalaServerCodeGenerator
akkaGrpcExtraGenerators += PlayScalaServerCodeGenerator
libraryDependencies += "com.lightbend.play" %% "play-grpc-runtime" % "0.8.2"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "5.0.0" % Test
Looking at your direct dependencies:
"com.lightbend.play" %% "play-grpc-runtime" % "0.8.2" depends on akka-discovery 2.6.4.
You are using Play 2.8.2 which depends on Akka version 2.6.5.
Just add the dependency on akka-discovery 2.6.5 to your dependencies:
libraryDependencies += "com.typesafe.akka" %% "akka-discovery" % "2.6.5"

Spark with IntelliJ or Eclipse

I am trying to set up IntelliJ for Spark with Scala 2.11, but it is very daunting, and after days I have not been able to compile even a simple instruction such as spark.read.format, which is not found in the main core and SQL Spark libraries.
I have seen a few posts on the subject, but none of them resolved it. Does anyone have some experience, or perhaps a working sample program I can start with?
Could it be that it would be easier with Eclipse?
Many thanks in advance for your answers,
EZ
Build the project in IntelliJ with Scala 2.11 and sbt 0.13. First, ensure that your plugins.sbt contains the following:
logLevel := Level.Warn
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
Then your build.sbt must contain the following:
scalaVersion := "2.11.8"
val sparkVersion = "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion %"provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion %"provided"
Then write your code, open the Terminal in IntelliJ, and type sbt assembly. You can ship the resulting jar to a remote cluster, or run it locally from IntelliJ. Let me know how it goes.
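Since the question asks for a working sample to start with, here is a minimal sketch (the object name and input path are illustrative, not from the original post); note that spark.read.format comes from the spark-sql module declared above:
import org.apache.spark.sql.SparkSession

object SparkTest {
  def main(args: Array[String]): Unit = {
    // local[*] runs on all local cores for a quick check;
    // drop .master(...) when submitting to a cluster.
    val spark = SparkSession.builder()
      .appName("SparkTest")
      .master("local[*]")
      .getOrCreate()
    // spark.read.format is provided by spark-sql:
    val df = spark.read.format("json").load("example.json") // hypothetical input file
    df.show()
    spark.stop()
  }
}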

IntelliJ: scalac bad symbolic reference

In my build.sbt file I have this for my project:
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.10" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.3.1"
I just let it download all the libraries automatically. I'm adding graphx, spark-core, and the Scala SDK to one of my project modules, but when I try to compile I'm getting:
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term hadoop
in package org.apache which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term io
in value org.apache.hadoop which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term compress
in value org.apache.io which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
The weird thing is that if I download graphx/mllib directly from the Maven repositories, it seems to compile. Any ideas?
Another possible source of error is an incorrect scalac version setting in the project. Right-click the project -> Open Module Settings -> Global Libraries, then change/add the scala-sdk version appropriate to your project.
Please add the Hadoop dependencies. Note the single % rather than %%, since Hadoop artifacts are plain Java libraries with no Scala-version suffix. Something like:
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.1"
You may need to add other hadoop modules depending on your app.
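As a hedged alternative sketch, the aggregate hadoop-client artifact pulls in the usual client-side modules in one line (match the version to your cluster):
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.1"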

How to add dependency files to Scala?

I'm new to Scala and Spark and started writing a simple Apache Spark program in Scala IDE (in Eclipse). I added the dependency jar files to my project as I usually do in my Java projects, but it can't recognize them and gives me the following error message: object apache is not a member of package org. How should I add the dependency jar files?
The jar files I'm adding are the ones that exist under the 'lib' directory where Spark is installed.
For Scala you use sbt as a dependency manager and build tool.
More information on how to set it up is here:
http://www.scala-sbt.org/release/tutorial/Setup.html
Your build file will look something like this:
name := "Test"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"

Can't install Scaladoc with SBT and Intellij

I am new to Scala and am currently trying to set up IntelliJ IDEA 13.1 with the Scala plugin, which has support for SBT. I have simply followed the basic tutorial for creating a new SBT project here: http://confluence.jetbrains.com/display/IntelliJIDEA/Getting+Started+with+SBT
Currently my build.sbt file is:
name := "scalasandpit"
version := "1.0"
scalaVersion := "2.10"
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.1.0" % "test"
autoAPIMappings := true
This pulls down various jar binaries, but no sources and no javadoc. Is there a way to have both sources and javadoc work with IntelliJ and SBT? I think I'm missing something.
There seem to be two issues: getting sbt to pull down sources and docs, and then getting IDEA to show them to you. To solve the former, see the sbt documentation -- about halfway down there's a section called "Download Sources" which tells you what to add to your build.sbt:
libraryDependencies +=
  "org.scalatest" % "scalatest_2.10" % "2.1.0" % "test" withSources() withJavadoc()