Error: not found: value SparkSession - scala

I just followed the Spark getting-started page and tried to run the SimpleApp example.
My Spark version: 2.3
Scala version in the REPL: 2.11.8
This is the build.sbt file:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
When I run sbt package, I first get some conflict warnings, and then this error:
not found: value SparkSession
[error] val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
What could I be missing? Also, the target folder contains a scala-2.12 directory, and I wonder why.
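For what it's worth, SparkSession lives in org.apache.spark.sql and must be imported explicitly; a missing import is a common cause of this exact error. A minimal sketch of the quick-start app, assuming that is what happened here:

// SimpleApp.scala — sketch following the Spark quick-start guide;
// the assumption is that the import below was missing.
import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val logFile = "YOUR_SPARK_HOME/README.md" // placeholder path from the guide
    val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
    val logData = spark.read.textFile(logFile).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    spark.stop()
  }
}

As for the scala-2.12 directory under target: sbt 1.x defaults to Scala 2.12 when no scalaVersion is set, so its presence suggests a build may have run before the scalaVersion line in build.sbt was picked up.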

Related

Extracting structure failed: Build status: Error - spark scala

I installed IntelliJ IDEA Community Edition 2022.3.1. When I try to compile a simple Scala Spark program, I get an "Extracting structure failed: Build status: Error" error.
Below is my build.sbt:
ThisBuild / version := "0.1.0-SNAPSHOT"
//ThisBuild / scalaVersion := "2.13.5"
ThisBuild / scalaVersion := "2.10.1"
lazy val root = (project in file("."))
  .settings(
    name := "untitled"
  )
libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.11" % "2.3.4")
Any help is most appreciated.
If you use a dependency with the _2.11 suffix, then your Scala version should be 2.11.x.
If your Scala version is 2.13.x (or 2.10.x), then you should use a _2.13 (or, correspondingly, _2.10) dependency.
https://mvnrepository.com/artifact/org.apache.spark/spark-core
"org.apache.spark" %% ... instead of "org.apache.spark" % ... automatically adds proper suffix _2.13, _2.12, _2.11, _2.10 ...
The current Spark is 3.3.1 (published for Scala 2.13.x and 2.12.x), and the current Scala is 2.13.10 (the latest earlier lines are 2.12.17, 2.11.12, 2.10.7).
The older Spark 2.3.4 exists only for Scala 2.11.x, not for 2.13.x or 2.10.x.
The IntelliJ version is irrelevant here.
Try
ThisBuild / scalaVersion := "2.13.10"
libraryDependencies ++= Seq( "org.apache.spark" %% "spark-core" % "3.3.1")
// libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.13" % "3.3.1")

Unable to find PlayScala (Heroku tutorial)

I'm new to Scala and Heroku and I'm following the Heroku getting-started guide.
I'm using a Mac (10.14.6) and followed the instructions here to install sbt and Play.
I am now on the "Declare app dependencies" step, but when I run the sbt compile stage command I get the following error:
$ sbt compile stage
[info] welcome to sbt 1.3.13 (AdoptOpenJDK Java 11.0.8)
[info] loading project definition from /Users/jack/scala-getting-started/project
/Users/jack/scala-getting-started/build.sbt:5: error: not found: value PlayScala
lazy val root = (project in file(".")).enablePlugins(PlayScala)
^
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
The build.sbt file (provided automatically) is
name := """play-getting-started"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  jdbc,
  cache,
  "org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
  ws
)
libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-compiler" % _ )
How can I fix this error?
Thank you.
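For reference, PlayScala is an identifier contributed by the Play sbt plugin, so this error usually means the plugin is not declared. A minimal sketch of project/plugins.sbt, assuming that is the cause; the version shown is an assumption and must match your sbt version (Play 2.4-era plugins require sbt 0.13.x, while sbt 1.3.13 needs Play 2.6 or later):

// project/plugins.sbt — plugin version is an assumption; match it to your Play/sbt setup
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.8.8")

Note that the generated build.sbt targets the Play 2.4 era (the cache dependency, the <+= operator), so moving to sbt 1.3.13 with a newer Play plugin would require updating those lines as well.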

Unauthorized sbt download dependencies

I'm trying to download dependencies from our artifact repository (JFrog) using sbt, but I'm getting unauthorized errors. My machine is using sbt 1.3.3 and Scala 2.13.1.
name := """backend"""
version := "1.0-SNAPSHOT"
topLevelDirectory := Some(packageName.value)
scalaVersion := "2.12.9"
val anormVersion = "2.5.3"
val silhouetteVersion = "6.0.0-RC1"
val silencerVersion = "1.3.2"
libraryDependencies ++= Seq(
  "com.jayway.jsonpath" % "json-path" % "2.4.0"
)
I tried logging into JFrog in the browser with the same credentials and it works, so I'm not sure why sbt is having permission issues. I am supplying the credentials to sbt via environment variables rather than a credentials file.
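For reference, sbt takes repository credentials from the credentials setting, which can be populated from environment variables directly in build.sbt. A minimal sketch, where the realm string, host, and variable names are all assumptions to adapt to your Artifactory instance:

// The realm must match the WWW-Authenticate realm the server returns
// ("Artifactory Realm" is the Artifactory default), and the host must
// be bare, with no https:// prefix.
credentials += Credentials(
  "Artifactory Realm",
  "mycompany.jfrog.io",
  sys.env.getOrElse("ARTIFACTORY_USER", ""),
  sys.env.getOrElse("ARTIFACTORY_PASS", "")
)

A mismatched realm, or a host entry that includes the protocol, commonly yields 401 Unauthorized even when the credentials themselves are valid.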

SparkSession version trouble in Scala

I've read this article and copy-pasted the Sort-Merge Join example, but when I try to build the project I get the following error:
object SparkSession is not a member of package org.apache.spark.sql
I've seen many questions about this error, and the answers said it came from using an old version of Spark. However, my build.sbt specifies Spark 2.1, the same version used in the example on that website.
Here is my build.sbt:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
What am I missing?
The Spark SQL dependency is missing: SparkSession lives in the spark-sql module, not in spark-core.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"

Unresolved dependencies: Spark library

I followed this link : http://blog.miz.space/tutorial/2016/08/30/how-to-integrate-spark-intellij-idea-and-scala-install-setup-ubuntu-windows-mac/
When I try to compile my project with IntelliJ, sbt complains about unresolved dependencies:
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core/2.1.1/spark-core-2.1.1.pom
[warn] Unresolved dependencies path: org.apache.spark:spark-core:2.1.1
My Scala version is 2.12.2 and my Spark version is 2.1.1.
Here is what my build.sbt looks like:
name := "test" version := "1.0" scalaVersion := "2.12.2"
val sparkVersion = "2.1.1"
libraryDependencies ++= Seq("org.apache.spark" % "spark-core" & sparkVersion)`
thank you
Your last line should be
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % sparkVersion
or
libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.11" % sparkVersion)
Also, as mentioned by #Nonontb and #Cyrelle, it's better not to use scalaVersion := "2.12.2" here. Downgrade to a Scala version that this Spark release supports (2.11.x) to avoid unresolved dependencies and binary-compatibility errors.
The following line is from the Spark docs:
For the Scala API, Spark 2.1.1 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
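Putting the pieces of this answer together, a corrected build.sbt might look like this (Scala 2.11.8 is one arbitrary choice of 2.11.x release):

name := "test"
version := "1.0"
scalaVersion := "2.11.8" // any 2.11.x release, per the Spark 2.1.1 docs quoted above

val sparkVersion = "2.1.1"

// %% appends the _2.11 suffix automatically, matching scalaVersion
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion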