I installed IntelliJ Community Edition 2022.3.1. When I try to compile a simple Scala Spark program, I get the error "Extracting structure failed: Build status: Error".
Below is my build.sbt
ThisBuild / version := "0.1.0-SNAPSHOT"
//ThisBuild / scalaVersion := "2.13.5"
ThisBuild / scalaVersion := "2.10.1"
lazy val root = (project in file("."))
  .settings(
    name := "untitled"
  )

libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.11" % "2.3.4")
Any help is most appreciated.
If you use a dependency with the _2.11 suffix, then your Scala version should be 2.11.x.
If your Scala version is 2.13.x (or 2.10.x), then you should use a dependency with the _2.13 (or, correspondingly, _2.10) suffix.
https://mvnrepository.com/artifact/org.apache.spark/spark-core
"org.apache.spark" %% ... instead of "org.apache.spark" % ... automatically adds proper suffix _2.13, _2.12, _2.11, _2.10 ...
Current Spark is 3.3.1 (it exists for Scala 2.13.x or 2.12.x), current Scala is 2.13.10 (former are 2.12.17, 2.11.12, 2.10.7).
Former Spark 2.3.4 exists only for Scala 2.11.x, not 2.13.x or 2.10.x.
Version of IntelliJ is now irrelevant.
Try:
ThisBuild / scalaVersion := "2.13.10"
libraryDependencies ++= Seq( "org.apache.spark" %% "spark-core" % "3.3.1")
// libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.13" % "3.3.1")
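Once the versions line up, a minimal program like the following should compile and run; the object name and the local master are just placeholder choices for a quick smoke test.

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // Local master so the check runs without a cluster
    val conf = new SparkConf().setAppName("untitled").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // A trivial job to confirm the Spark dependency resolved correctly
    val sum = sc.parallelize(1 to 10).reduce(_ + _)
    println(s"sum = $sum")
    sc.stop()
  }
}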
Related
When I try to use Spark with Scala in the sbt build system to read a JSON file, I get an error.
My sbt file is:
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.10"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.1"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.1"
// https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector
libraryDependencies += "org.mongodb.spark" % "mongo-spark-connector" % "10.0.5"
I tried changing all the "3.3.1" and "10.0.5" versions to "3.0.1", and it is still the same problem.
You should use a 2.12 version of Scala, since those dependencies do not all have 2.13 builds.
ThisBuild / scalaVersion := "2.12.11"
Check those artifacts on mvnrepository.com: a 2.13 build isn't listed for those versions (or at all).
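Putting the answer together, a corrected build.sbt might look like this sketch; the assumption (worth verifying on mvnrepository.com) is that mongo-spark-connector 10.0.5 is published for Scala 2.12, so %% can pick the _2.12 artifact.

ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.11"

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.1"
// %% now resolves mongo-spark-connector_2.12 instead of an unsuffixed artifact
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "10.0.5"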
Writing Scala tests, I am encountering an error:
Symbol 'type org.scalatest.compatible.Assertion' is missing from the classpath.
This symbol is required by 'type org.scalatest.Assertion'.
Make sure that type Assertion is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'package.class' was compiled against an incompatible version of org.scalatest.compatible.
My build.sbt is very simple:
import sbt.Keys.libraryDependencies
ThisBuild / scalaVersion := "2.13.2"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
lazy val root = (project in file("."))
  .settings(
    name := "myproj",
    libraryDependencies += "org.scalactic" %% "scalactic" % "3.2.9",
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.9" % "test",
    libraryDependencies += "org.scalatest" %% "scalatest-funsuite" % "3.2.9" % "test",
    libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.2",
  )
This problem only occurs when I build the project within IntelliJ; using sbt itself, it compiles fine.
sbt version in this project: 1.3.10
sbt script version: 1.3.6
Any idea of how to fix it in IntelliJ?
Most problems caused by IntelliJ getting out of sync I resolve like this:
1. Run sbt clean compile in a terminal in the project root.
2. Open View -> Tool Windows -> sbt, then right-click the project in that tab and choose Reload SBT project.
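Separately, if the error persists, the compiler flag mentioned in the message can be turned on to see which classpath entries conflict; a one-line sketch for build.sbt:

// Log classpath details, as the error message itself suggests
scalacOptions += "-Ylog-classpath"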
Hello, I have an sbt file like this:
name := "Myexample"
version := "1.0"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "2.4.4",
  "org.apache.spark" %% "spark-mllib" % "2.4.4"
)
I want to make a universal sbt file. If another computer has a different Scala or Spark version, can I make the sbt file dynamic, not declaring my own scalaVersion at all, but letting every computer supply its own Scala and Spark versions? (One possible approach is sketched below.)
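A possible sketch, not from the original thread: a build.sbt is ordinary Scala code, so the versions can be read from environment variables with defaults. The variable names here are made up, and this cannot make the build fully portable, because each Spark release is only published for particular Scala lines.

// Hypothetical: let each machine override the versions via environment variables
scalaVersion := sys.env.getOrElse("SCALA_VERSION", "2.11.12")

val sparkVersion = sys.env.getOrElse("SPARK_VERSION", "2.4.4")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion
)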
I followed this tutorial: http://blog.miz.space/tutorial/2016/08/30/how-to-integrate-spark-intellij-idea-and-scala-install-setup-ubuntu-windows-mac/
When I try to compile my project with IntelliJ, sbt complains about unresolved dependencies:
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core/2.1.1/spark-core-2.1.1.pom
[warn] Unresolved dependencies path: org.apache.spark:spark-core:2.1.1
My Scala version is 2.12.2 and my sparkVersion is 2.1.1.
Here is what my build.sbt looks like:
name := "test" version := "1.0" scalaVersion := "2.12.2"
val sparkVersion = "2.1.1"
libraryDependencies ++= Seq("org.apache.spark" % "spark-core" & sparkVersion)`
Thank you.
Your last line should be
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % sparkVersion
or
libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.11" % sparkVersion)
And it's better not to use scalaVersion := "2.12.2", as mentioned by #Nonontb and #Cyrelle. Drop to a Scala version that this Spark release supports, to avoid unexpected binary-compatibility errors.
The following line is from the Spark docs:
For the Scala API, Spark 2.1.1 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
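Putting the pieces together, a build consistent with that requirement would look roughly like the sketch below; 2.11.8 is just a representative 2.11.x patch release.

name := "test"
version := "1.0"
scalaVersion := "2.11.8"

val sparkVersion = "2.1.1"

// With scalaVersion on 2.11.x, %% appends the matching _2.11 suffix
libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % sparkVersion)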
I have been having an sbt dependency issue when I try to build my Apache Spark project. I have Apache Spark 1.3.1.
My .sbt file is this:
name := "Transaction"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
resolvers ++= Seq(
  "Akka Repository" at "http://repo.akka.io/releases/",
  "Spray Repository" at "http://repo.spray.cc/")
And I keep getting this error:
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.3.1: not found
I have looked all over and this seems to be a persistent issue, but no one has really solved it.
Thanks for your help!
I used
"org.apache.spark" % "spark-core_2.10" % "1.3.1"
instead of
"org.apache.spark" %% "spark-core" % "1.3.1"
and it worked!
EDIT:
However, I was able to get the latter statement to work after specifically setting my scalaVersion to 2.10, so:
scalaVersion := "2.10"
Probably because it was trying to look for a specific 2.10.4 jar, which doesn't exist.
I actually figured it out. You just need to put "provided" after the Spark version:
name := "Transaction Fraud"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"
resolvers ++= Seq(
  "Akka Repository" at "http://repo.akka.io/releases/",
  "Spray Repository" at "http://repo.spray.cc/")