I have a Scala project using Spark libraries, and it works fine most of the time (using IntelliJ). But sometimes it starts giving errors on IntelliJ launch:
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] Total time: 1 s, completed 17 Sep 2022, 15:13:49
[info] shutting down sbt server
build.sbt is:
/*ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.8"
lazy val root = (project in file("."))
.settings(
name := "spark-learning"
)*/
// Name of the package
name := "spark-learning"
// Version of our package
version := "1.0"
// Version of Scala
scalaVersion := "2.12.14"
// Spark library dependencies
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "3.0.0-preview2",
"org.apache.spark" %% "spark-sql" % "3.0.0-preview2"
)
What causes these issues all of a sudden? And how can I get rid of them?
One weird thing in the error messages is the path where it looks for dependencies: ...ivy2\localorg.apache.spark\....
There should be a \ after local.
Any chance your sbt repositories configuration could be messed up? I'm not sure where it is on Windows; it's typically /etc/sbt/repositories on Linux, but maybe IntelliJ has its own setting.
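For reference, the global sbt repositories override file (typically ~/.sbt/repositories, or %USERPROFILE%\.sbt\repositories on Windows, applied with -Dsbt.override.build.repos=true) is plain text in this format; a minimal healthy one looks roughly like:

[repositories]
  local
  maven-central

A malformed ivy pattern for the local resolver in such a file is one way to end up with a broken path like the localorg.apache.spark one above, so it's worth checking.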
Related
I installed sbt-1.3.4.msi and when trying to build a sample SparkPi.scala app, I'm getting the following error:
C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("Spark Pi")
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error] val spark = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed
The SparkPi.scala file is in C:\myapps\sbt\sparksample\project\src\main\scala (as shown in the error messages above).
What am I missing here?
The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.12.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
The C:\myapps\sbt\sparksample\project\src\main\scala directory has the SparkPi.scala file
That's the problem. You've got the Scala file(s) under the project directory, which is owned by sbt itself (not by your sbt-managed Scala project).
Move the SparkPi.scala and other Scala files to C:\myapps\sbt\sparksample\src\main\scala.
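For reference, the layout sbt expects is (paths from the question; the project folder holds sbt's own build definition, not application code):

C:\myapps\sbt\sparksample\
    sparksample.sbt          <- build definition
    project\                 <- sbt's meta-build (plugins etc.), no application sources
    src\main\scala\          <- SparkPi.scala belongs here

Anything under project\ is compiled as part of the build definition itself, against the build's own classpath; that's why the log shows project\target\scala-2.12\sbt-1.0\classes and why spark-core from your libraryDependencies isn't visible there.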
I am trying to integrate Ignite into Scala code and run the application using sbt. I cannot use any IDE for this.
Scala version - 2.11.0
Spark version - 2.3.0
Ignite version - 2.8.0
sbt version - 1.3.3
I have tried adding the basic library dependency in build.sbt:
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
My complete build.sbt is:
scalaVersion := "2.11.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
but it is still not working; I am getting the following error:
$ sbt package
[info] Loading project definition from /home/testing/project
[info] Loading settings for project testing from build.sbt ...
[info] Set current project to testing (in build file:/home/testing/)
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:245)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$34(CoursierDependencyResolution.scala:214)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:214)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:52)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:102)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:69)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:115)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:115)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:96)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:150)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error] at sbt.Classpaths$.$anonfun$updateTask0$5(Defaults.scala:2946)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] Total time: 5 s, completed May 28, 2020 6:24:04 AM
I am working on the ubuntu:latest Docker image.
I am sure that Spark and Ignite themselves work, as I am also running PySpark and it works just fine. Please help me out; I think that, being a newbie, I am making some minor mistake which is becoming a major issue here.
It seems that it tries to pull "ignite-spark_2.11", not just "ignite-spark":
"not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom"
use
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.8.0"
instead of
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
%% -> fetch the dependency with the Scala binary version appended to the artifact name
% -> fetch the dependency with the artifact name as-is, no Scala version appended
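For example, with scalaVersion := "2.11.0" these two lines request exactly the same artifact; the second just spells out the suffix that %% appends:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"     // resolves spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.0"

Per this answer, ignite-spark 2.8.0 is published without a Scala suffix, which is why the plain % form resolves while the %% form asks for a non-existent ignite-spark_2.11.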
I'm trying to install Kafka in my sbt project, but when I click on "Import Changes" I'm getting an error:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading net.cakesolutions:scala-kafka-client_2.13:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\macca\.ivy2\local\net.cakesolutions\scala-kafka-client_2.13\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/net/cakesolutions/scala-kafka-client_2.13/2.3.1/scala-kafka-client_2.13-2.3.1.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading net.cakesolutions:scala-kafka-client_2.13:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\macca\.ivy2\local\net.cakesolutions\scala-kafka-client_2.13\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/net/cakesolutions/scala-kafka-client_2.13/2.3.1/scala-kafka-client_2.13-2.3.1.pom
[error] Total time: 1 s, completed 19:56:34 26/04/2020
[info] shutting down sbt server
build.sbt:
name := "KafkaProducer"
version := "0.1"
scalaVersion := "2.13.0"
libraryDependencies ++= Seq(
"io.circe" %% "circe-parser" % "0.12.3",
"net.cakesolutions" %% "scala-kafka-client" % "2.3.1"
)
Per the GitHub page for scala-kafka-client, you'll need to add a bintray resolver to your build.sbt:
resolvers += Resolver.bintrayRepo("cakesolutions", "maven")
As of today, Scala is still not binary compatible between versions and has a tendency toward serious breaking changes between "minor" releases (2.10 -> 2.11 -> 2.12 -> 2.13).
This leads to a situation where library maintainers are relatively slow to adopt new versions.
e.g. Apache Spark has only just started supporting 2.12 in its latest stable version, and has only just made it the default.
So if I want to run this with 2.13 I have three options:
sbt publish-local
Using the standard Java client instead
Nagging the maintainer of the Scala package to publish artifacts
But I've decided to solve it by just downgrading Scala to 2.12.
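A minimal sketch of that downgrade, keeping the bintray resolver from the answer above (2.12.10 is an assumed patch release; any 2.12.x for which the artifact is published would do):

name := "KafkaProducer"
version := "0.1"
scalaVersion := "2.12.10" // downgraded from 2.13.0

// bintray resolver for cakesolutions artifacts
resolvers += Resolver.bintrayRepo("cakesolutions", "maven")

libraryDependencies ++= Seq(
  "io.circe" %% "circe-parser" % "0.12.3",
  "net.cakesolutions" %% "scala-kafka-client" % "2.3.1"
)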
In PROJECT/plugins.sbt:
addSbtPlugin("com.typesafe.sbteclipse" % "sbtsclipse-plugin" % "4.0.0")
In PROJECT/build.sbt:
name := "FileSearcher"
version := "0.1"
scalaVersion := "2.11.8"
sbt version : 0.13.1
Below is an error I am getting:
[error] (*:update) sbt.ResolveException: unresolved dependency:
com.typesafe.sbteclipse#sbtsclipse-plugin;4.0.0: not found
[error] Could not create Eclipse project files:
[error] Error evaluating task 'scalacOptions': error
[error] Error evaluating task 'externalDependencyClasspath': error
The error was due to a typo: the artifact name should be sbteclipse-plugin, not sbtsclipse-plugin:
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
I'm trying to execute JUnit tests with
sbt test
sbt clean test
The tests fail with this error:
[info] Updating {file:/D:/sbt_projects/hello-word-ec/}hello-word-ec...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 2 Scala sources to D:\sbt_projects\hello-word-ec\target\scala-2.10\test-classes...
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test1.scala:1: object junit is not a member of package org
[error] import org.junit.Test
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test1.scala:3: not found: type Test
[error] @Test
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:1: object junit is not a member of package org
[error] import org.junit.Test
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:2: object junit is not a member of package org
[error] import org.junit.BeforeClass
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:6: not found: type BeforeClass
[error] @BeforeClass
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:12: not found: type Test
[error] @Test
[error] ^
[error] 6 errors found
[error] (test:compileIncremental) Compilation failed
[error] Total time: 1 s, completed Jul 28, 2016 6:10:04 PM
My build.sbt file content:
name := "sbt junit test project"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += "com.novocode" % "junit-interface" % "0.8" % "test->default"
libraryDependencies += "junit" % "junit" % "4.12" % Test
EclipseKeys.withSource := true
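For reference, a minimal JUnit 4 test that these two dependencies are meant to support might look like this (class and method names are illustrative, not from the question):

import org.junit.Test
import org.junit.Assert.assertEquals

class Test1 {
  // junit-interface is what lets sbt discover and run JUnit 4 tests like this
  @Test
  def additionWorks(): Unit = {
    assertEquals(4, 2 + 2)
  }
}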
The file %USERDIR%\.sbt\0.13\plugins\build.sbt looks like:
resolvers += Classpaths.typesafeResolver
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
Put build.sbt in the root project folder, otherwise sbt clean cannot see the build.sbt and download the dependencies. Originally build.sbt was located in the <project root>\project\ folder.
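In other words, the layout should be (file names taken from the error messages):

<project root>\build.sbt                  <- moved here, out of <project root>\project\
<project root>\project\                   <- sbt build definition only
<project root>\src\test\scala\Test1.scala
<project root>\src\test\scala\Test2.scala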