I'm developing a Spark process in Scala (Eclipse IDE). It runs fine on my local cluster, but when I try to compile it with the SBT I installed on my PC I get an error (see picture).
My first question is why SBT tries to compile with Scala 2.12 when I explicitly set scalaVersion to 2.11.11 in my build.sbt. I tried other SBT versions, and other PCs as well, with the same result. I need help fixing this.
scala_version (Spark): 2.11.11
sbt_version: 1.0.2
spark: 2.2
build.sbt
name := "Comple"
version := "1.0"
organization := "com.antonio.spark"
scalaVersion := "2.11.11"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
"org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
)
assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.0.2")
Error:
ResolveException: unresolved dependency: sbt_assembly;1.0.2: not found
Try changing your assembly.sbt file to:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
as stated in the documentation here: https://github.com/sbt/sbt-assembly
I recently used that with spark-core_2.11 version 2.2.0 and it worked.
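On the original question of why Scala 2.12 shows up at all: sbt 1.x itself runs on Scala 2.12, so plugins declared under project/ are resolved against Scala 2.12 / sbt 1.0 no matter what scalaVersion you set for your code; the unresolved artifact in the error is the plugin, not your project. A minimal sketch of the setup that should resolve (0.14.5 is built for sbt 1.x, and your own sources are still compiled with 2.11.11):
// project/assembly.sbt -- the plugin is resolved against sbt 1.x / Scala 2.12
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
// build.sbt -- your project code keeps the scalaVersion you set
scalaVersion := "2.11.11"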
Related
I'm trying to run a Scala project from here involving Azure Event Hubs on a single-node Cloudera VM installed locally. I'm using CDH 5.10. I built the jar file with sbt 0.13.15, which uses OpenJDK 1.8.0. Oracle JDK 1.8 is also installed in my VM, and I believe that is what Spark 2 uses when running the jar. The VM didn't have Spark 2 initially; I upgraded to it using Cloudera Manager 5.11.
I'm getting the following error after the project is run:
java.lang.UnsupportedClassVersionError: com/microsoft/azure/eventhubs/EventData : Unsupported major.minor version 52.0
The error appears in the console after the jobs are submitted, I think, and then the code hangs.
I enforced JVM version 1.8 while building the jar. My complete build.sbt:
name := "AzureGeoLogProject"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.scala-lang" % "scala-library" % "2.11.8"
libraryDependencies += "com.microsoft.azure" % "spark-streaming-eventhubs_2.11" % "2.0.3"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.2"
libraryDependencies += "org.apache.httpcomponents" % "httpclient" % "4.2.5"
libraryDependencies += "com.typesafe" % "config" % "1.3.1"
scalacOptions += "-target:jvm-1.8"
I googled the error but found nothing. I don't know how to proceed from here. Any suggestion would be greatly appreciated.
sudo alternatives --config java
When prompted, choose Java (JRE) 1.8 and try again.
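For background, "Unsupported major.minor version 52.0" means the EventData class was compiled for Java 8, so the JVM that actually runs the job must be 1.8 or newer; switching the system Java with alternatives addresses the runtime side. If you also want the build itself to fail fast when sbt is launched on an older JDK, a small guard like this sketch can go in build.sbt (optional, not required for the fix above):
// build.sbt -- abort early if sbt is not running on Java 8 or newer
initialize := {
  val _ = initialize.value // keep the default initialization
  val spec = sys.props("java.specification.version") // e.g. "1.7", "1.8", "11"
  require(spec == "1.8" || spec.takeWhile(_ != '.').toInt >= 9,
    s"Java 8 or newer is required to build this project, found $spec")
}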
I am trying to set up IntelliJ for Spark with Scala 2.11, but it is very daunting; after days I have not been able to compile a simple instruction such as "spark.read.format", which is not found in the main Spark core and SQL libraries.
I have seen a few posts on the subject, but none of them were resolved. Does anyone have experience with this, or perhaps a working sample program I can start with?
Could it be easier with Eclipse?
Many thanks in advance for your answers,
EZ
Build the project in IntelliJ with Scala 2.11 and sbt 0.13, then ensure that your plugins.sbt contains the following:
logLevel := Level.Warn
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
Then your build.sbt must contain the following:
scalaVersion := "2.11.8"
val sparkVersion = "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion %"provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion %"provided"
Then write your code, open the Terminal in IntelliJ, and type sbt assembly. You can ship the resulting jar to a remote cluster, or otherwise run it locally from IntelliJ. Let me know how it goes.
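Once spark-sql is on the classpath as above, spark.read.format resolves. A minimal sketch of a runnable entry point (the CSV path is just a placeholder):
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // local session for development inside the IDE
    val spark = SparkSession.builder()
      .appName("spark-read-example")
      .master("local[*]")
      .getOrCreate()

    // spark.read.format comes from spark-sql, not spark-core
    val df = spark.read.format("csv")
      .option("header", "true")
      .load("data/example.csv") // placeholder path

    df.show()
    spark.stop()
  }
}
Note that with Spark marked "provided" as above, running directly from IntelliJ can fail at runtime because the provided jars are not on the run classpath; for local runs it is easiest to drop "provided" temporarily.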
I added the following dependency in my build.sbt file for Apache Spark with Scala 2.11.
name := "Project1"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-compiler" % "2.11.8",
"org.scala-lang" % "scala-reflect" % "2.11.8",
"org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4",
"org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4"
)
However, IntelliJ could not resolve the spark-core_2.11 dependency. I tried multiple times but could not succeed. Thanks in advance.
I had the same problem in IntelliJ 2016.3.2 with almost the same Scala/Spark versions:
name := "some-project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
To get it to work I had to manually add the spark-core jar to my project libraries, i.e.:
Right click on the project -> Open Module Settings
Under Project Settings -> Libraries click + and select the 'Java' option.
Browse for the jar. I found it in my Ivy cache - I assume it got there because I had run the 'update' task from the sbt console previously.
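For reference, sbt's default Ivy cache is ~/.ivy2/cache, and it is the update task that populates it, so re-running that from the sbt shell is the quickest way to make the jar appear:
sbt> update
The jar should then sit under the usual Ivy layout, roughly ~/.ivy2/cache/org.apache.spark/spark-core_2.11/jars/spark-core_2.11-2.1.0.jar (path assumed, not verified on your machine).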
I added a Spark dependency that worked in my first project. But when I try to make a new project with Spark, SBT does not import the external jars of org.apache.spark, so IntelliJ IDEA gives the error "cannot resolve symbol".
I already tried making a new project from scratch and using auto-import, but neither works. When I try to compile I get the message "object apache is not a member of package org". My build.sbt looks like this:
name := "hello"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-parent_2.10" % "1.4.1"
I have the impression that something might be wrong with my SBT settings, although it already worked once. And apart from the external libraries, everything is the same...
I also tried to import the pom.xml file of my Spark dependency, but that doesn't work either.
Thank you in advance!
This worked for me:
name := "ProjectName"
version := "0.1"
scalaVersion := "2.11.11"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.2.0",
"org.apache.spark" % "spark-sql_2.11" % "2.2.0",
"org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
I use
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
in my build.sbt and it works for me.
I had a similar problem. It seems the reason was that the build.sbt file was specifying the wrong version of Scala.
If you run spark-shell, it will print the Scala version used by Spark, e.g.
Using Scala version 2.11.8
Then I edited the line in the build.sbt file to point to that version and it worked.
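In other words, take the version printed by spark-shell and mirror it in the build, then let %% pick the matching _2.11 artifacts. A sketch of what the aligned build.sbt ends up looking like (Spark 2.1.0 is assumed here; use your cluster's version):
// must match the "Using Scala version ..." line printed by spark-shell
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.1.0" % "provided"
)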
Currently spark-cassandra-connector is compatible with Scala 2.10 and 2.11.
It worked for me when I updated the scala version of my project like below:
ThisBuild / scalaVersion := "2.11.12"
and I updated my dependency like:
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0",
If you use "%%", sbt will add your project’s binary Scala version to the artifact name.
From sbt run:
sbt> reload
sbt> compile
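To make the %% behaviour above concrete, the two declarations below resolve to exactly the same artifact when scalaVersion is 2.11.x:
// %% appends the binary Scala version (_2.11) for you
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0"
// equivalent explicit form
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.4.0"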
Your library dependency conflicts with the Scala version you're using; you need to use 2.11 for it to work. The correct dependency would be:
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1"
Note that you need to change spark-parent to spark-core.
name := "SparkLearning"
version := "0.1"
scalaVersion := "2.12.3"
// additional libraries
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
I am using sbt to build my scala project.
This is my build.sbt:
name := "My Spark App"
version := "1.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.2.0" % "provided"
I am running sbt assembly to create an assembly jar, but I found a scala directory in the jar containing the Scala library class files.
Is it possible to treat the Scala library as a provided dependency, since the runtime environment already contains Scala?
From the docs, this might help:
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
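Putting it together, a sketch of what the whole build.sbt could look like with Spark kept as provided and the Scala library excluded from the assembly (sbt-assembly 0.14.x syntax, as in the line above; the scalaVersion is an assumption, match your runtime):
name := "My Spark App"
version := "1.0"
scalaVersion := "2.10.4" // assumed; use the Scala version your Spark 1.2 runtime was built with

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.2.0" % "provided"

// don't bundle the Scala library; the Spark runtime already provides it
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)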