I've been getting the error below quite a lot recently. It happens when I try to add library dependencies in sbt. In this instance I tried to add:
"org.tpolecat" %% "doobie-postgres" % "0.8.8",
"org.tpolecat" %% "doobie-postgres-circe" % "0.8.8",
I'm using Java 1.8 SDK and scala-sdk-2.13.3
I recently got a new MacBook and downloaded IntelliJ, so perhaps I haven't set it up right. I also tried changing the version number of all the doobie dependencies to 0.8.8, but that didn't help either.
name := "job-board"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= {
  lazy val doobieVersion = "0.5.4"
  lazy val http4sVersion = "0.20.8"
  lazy val circeVersion = "0.9.1"
  Seq(
    "org.tpolecat" %% "doobie-postgres" % "0.8.8",
    "org.tpolecat" %% "doobie-postgres-circe" % "0.8.8",
    "org.tpolecat" %% "doobie-core" % doobieVersion,
    "org.tpolecat" %% "doobie-h2" % doobieVersion,
    "org.tpolecat" %% "doobie-hikari" % doobieVersion,
    "org.tpolecat" %% "doobie-specs2" % doobieVersion,
    "org.http4s" %% "http4s-blaze-server" % http4sVersion,
    "org.http4s" %% "http4s-circe" % http4sVersion,
    "org.http4s" %% "http4s-dsl" % http4sVersion,
    "io.circe" %% "circe-core" % circeVersion,
    "io.circe" %% "circe-generic" % circeVersion,
    "io.circe" %% "circe-config" % "0.6.1",
    "mysql" % "mysql-connector-java" % "5.1.34",
    "org.slf4j" % "slf4j-api" % "1.7.5",
    "ch.qos.logback" % "logback-classic" % "1.0.9"
  )
}
resolvers ++= Seq(
  "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)
/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/bin/java -Djline.terminal=jline.UnsupportedTerminal -Dsbt.log.noformat=true -Dfile.encoding=UTF-8 -Didea.managed=true -Dfile.encoding=UTF-8 -jar /Users/ryanmcavoy/Library/Application Support/JetBrains/Toolbox/apps/IDEA-C/ch-0/202.6397.94/IntelliJ IDEA CE.app.plugins/Scala/launcher/sbt-launch.jar
[info] welcome to sbt 1.3.13 (AdoptOpenJDK Java 1.8.0_262)
[info] loading global plugins from /Users/ryanmcavoy/.sbt/1.0/plugins
[info] loading project definition from /Users/ryanmcavoy/Code/job-board/project
[info] loading settings for project job-board from build.sbt ...
[info] set current project to job-board (in build file:/Users/ryanmcavoy/Code/job-board/)
[warn] sbt server could not start because there's another instance of sbt running on this build.
[warn] Running multiple instances is unsupported
sbt:job-board>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to job-board (in build file:/Users/ryanmcavoy/Code/job-board/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /Users/ryanmcavoy/Library/Application Support/JetBrains/Toolbox/apps/IDEA-C/ch-0/202.6397.94/IntelliJ IDEA CE.app.plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to job-board (in build file:/Users/ryanmcavoy/Code/job-board/)
[warn] insecure HTTP request is deprecated 'http://repo.typesafe.com/typesafe/releases/'; switch to HTTPS or opt-in as ("Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/").withAllowInsecureProtocol(true)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.tpolecat:doobie-postgres_2.11:0.8.8
[error] Not found
[error] Not found
[error] not found: /Users/ryanmcavoy/.ivy2/local/org.tpolecat/doobie-postgres_2.11/0.8.8/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/tpolecat/doobie-postgres_2.11/0.8.8/doobie-postgres_2.11-0.8.8.pom
[error] download error: Caught java.net.ConnectException: Connection refused (Connection refused) (Connection refused (Connection refused)) while downloading http://repo.typesafe.com/typesafe/releases/org/tpolecat/doobie-postgres_2.11/0.8.8/doobie-postgres_2.11-0.8.8.pom
[error] Error downloading org.tpolecat:doobie-postgres-circe_2.11:0.8.8
[error] Not found
[error] Not found
[error] not found: /Users/ryanmcavoy/.ivy2/local/org.tpolecat/doobie-postgres-circe_2.11/0.8.8/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/tpolecat/doobie-postgres-circe_2.11/0.8.8/doobie-postgres-circe_2.11-0.8.8.pom
[error] download error: Caught java.net.ConnectException: Connection refused (Connection refused) (Connection refused (Connection refused)) while downloading http://repo.typesafe.com/typesafe/releases/org/tpolecat/doobie-postgres-circe_2.11/0.8.8/doobie-postgres-circe_2.11-0.8.8.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.tpolecat:doobie-postgres_2.11:0.8.8
[error] Not found
[error] Not found
[error] not found: /Users/ryanmcavoy/.ivy2/local/org.tpolecat/doobie-postgres_2.11/0.8.8/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/tpolecat/doobie-postgres_2.11/0.8.8/doobie-postgres_2.11-0.8.8.pom
[error] download error: Caught java.net.ConnectException: Connection refused (Connection refused) (Connection refused (Connection refused)) while downloading http://repo.typesafe.com/typesafe/releases/org/tpolecat/doobie-postgres_2.11/0.8.8/doobie-postgres_2.11-0.8.8.pom
[error] Error downloading org.tpolecat:doobie-postgres-circe_2.11:0.8.8
[error] Not found
[error] Not found
[error] not found: /Users/ryanmcavoy/.ivy2/local/org.tpolecat/doobie-postgres-circe_2.11/0.8.8/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/tpolecat/doobie-postgres-circe_2.11/0.8.8/doobie-postgres-circe_2.11-0.8.8.pom
[error] download error: Caught java.net.ConnectException: Connection refused (Connection refused) (Connection refused (Connection refused)) while downloading http://repo.typesafe.com/typesafe/releases/org/tpolecat/doobie-postgres-circe_2.11/0.8.8/doobie-postgres-circe_2.11-0.8.8.pom
[error] Total time: 1 s, completed Aug 10, 2020 1:48:01 PM
I'm using Java 1.8 SDK and scala-sdk-2.13.3
You are not using Scala 2.13.3, because your build.sbt says:
scalaVersion := "2.11.7"
There is no doobie-postgres 0.8.8 for Scala 2.11; see:
https://mvnrepository.com/artifact/org.tpolecat/doobie-postgres
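A minimal sketch of the fix, assuming you want to keep doobie 0.8.8 (which is published for Scala 2.12 and 2.13): bump scalaVersion to match the scala-sdk-2.13.3 you installed, and use one doobieVersion for every doobie module so nothing is left on 0.5.4. The http4s and circe libraries from your build may also need newer versions for 2.13, so check them on mvnrepository before relying on this:

// assumption: doobie 0.8.8 artifacts exist for Scala 2.13 (they are on Maven Central)
scalaVersion := "2.13.3"

libraryDependencies ++= {
  // a single version for all doobie modules avoids mixing 0.5.4 and 0.8.8
  val doobieVersion = "0.8.8"
  Seq(
    "org.tpolecat" %% "doobie-core"           % doobieVersion,
    "org.tpolecat" %% "doobie-postgres"       % doobieVersion,
    "org.tpolecat" %% "doobie-postgres-circe" % doobieVersion,
    "org.tpolecat" %% "doobie-h2"             % doobieVersion,
    "org.tpolecat" %% "doobie-hikari"         % doobieVersion,
    "org.tpolecat" %% "doobie-specs2"         % doobieVersion
  )
}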
Related
I have newly installed and created a Spark, Scala, sbt development environment in IntelliJ, but when I try to compile with sbt I get an unresolved-dependencies error.
Below is my sbt file:
name := "xxxxxxxxxxxxxxxxxxxx"
version := "0.1"
scalaVersion := "2.11.8"
val sparkVersion = "2.3.1"
val jacksonCore = "2.6.7"
val publishMavenStyle = true
resolvers ++= Seq(
  "Artifactory" at "https://binrepo.xxxxxx.com/artifactory/xyz/",
  "Artifactory Common" at "https://binrepo.xxxxxx.com/artifactory/data-engineering-gscl-abc/",
  //"ArtifactorySnapShots" at "https://binrepo.xxxxxx.com/artifactory/xyz/SNAPSHOTS/"
  ("Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven").withAllowInsecureProtocol(true)
)
libraryDependencies ++= Seq(
  "org.scalamock" %% "scalamock" % "4.4.0" % Test,
  // dependency of xyz-core
  "com.tgt.dsc.xyz.datapipeline" % "xyz-core_2.11" % "1.8.1",
  // dependency of fields-performance-specific common functions
  "abc-common" % "abc-common_2.11" % "3.0.0",
  // dependency for reading configuration
  "com.typesafe" % "config" % "1.3.3",
  // Spark core libraries; in production, or for spark-submit locally, mark them
  // Provided so the dependent jars are not part of the assembly jar,
  // e.g. "org.apache.spark" %% "spark-core" % sparkVersion % Provided
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-hive" % sparkVersion % Provided,
  "org.scala-lang" % "scala-library" % scalaVersion.value,
  // logging library
  "org.slf4j" % "slf4j-api" % "1.7.29",
  // for testing
  "org.scalatest" %% "scalatest" % "3.1.0" % Test,
  "MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11",
  "mrpowers" % "spark-daria" % "0.35.0-s_2.11"
)
enablePlugins(GitVersioning)
assemblyJarName in assembly := s"${name.value}_${scalaVersion.value}-${version.value}.jar"
assemblyMergeStrategy in assembly := {
  //case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
//get the token from secrets
val token = sys.env.getOrElse("SONAR_TOKEN", "")
//configurations for sonar integration
sonarProperties ++= Map(
  "sonar.host.url" -> "http://sonarqube.xxxxxx.com:9000",
  "sonar.scala.version" -> "2.11",
  "sonar.projectName" -> "xyz-starter",
  "sonar.projectKey" -> "xyz-starter",
  "sonar.sources" -> "src/main/scala",
  "sonar.tests" -> "src/test/scala",
  "sonar.scala.coverage.reportPaths" -> "xxxxxx/scala-2.11/coverage-report/cobertura.xml,xxxxxx/scala-2.11/scapegoat-report/scapegoat.xml",
  "sonar.login" -> token,
  "sonar.buildbreaker.skip" -> "false"
)
//how much code coverage is needed
coverageMinimum := 80
//fail the build if code coverage is not met
coverageFailOnMinimum := true
The entire sbt file is showing in red, including the name, version, and scalaVersion.
When I compile, the following is the error I get:
/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java -Djline.terminal=jline.UnsupportedTerminal -Dsbt.log.noformat=true -Dfile.encoding=UTF-8 -Didea.managed=true -Dfile.encoding=UTF-8 -jar /Users/xxxxxxx/Library/Application Support/JetBrains/IdeaIC2020.2/plugins/Scala/launcher/sbt-launch.jar
[info] welcome to sbt 1.4.5 (AdoptOpenJDK Java 11.0.9)
[info] loading global plugins from /Users/xxxxxxx/.sbt/1.0/plugins
[info] loading settings for project xxxxxxxxxxxxxxxxxxxxxxxxxx-build from assembly.sbt ...
[info] loading project definition from /Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/project
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] loading settings for project xxxxxxxxxxxxxxxxxxxxxxxxxx from build.sbt ...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[info] sbt server started at local:///Users/xxxxxxx/.sbt/1.0/server/5231834612cf96406db7/sock
[info] started sbt server
sbt:xxxxxxxxxxxxxxxxxxxxxxxxxx>
;set _root_.scala.collection.Seq(historyPath := None,shellPrompt := { _ => "" }
,SettingKey[_root_.scala.Option[_root_.sbt.File]]("sbtStructureOutputFile")
in _root_.sbt.Global := _root_.scala.Some(_root_.sbt.file("/private/var/folders/6p/qvsthwj11q38nxlpn72hv6fr0000gq/T/sbt-structure.xml"))
,SettingKey[_root_.java.lang.String]
("sbtStructureOptions")
in _root_.sbt.Global := "download, resolveClassifiers")
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /Users/xxxxxxx/Library/Application Support/JetBrains/IdeaIC2020.2/plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading MrPowers:spark-fast-tests:0.20.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localMrPowers/spark-fast-tests/0.20.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] Error downloading mrpowers:spark-daria:0.35.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localmrpowers/spark-daria/0.35.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading MrPowers:spark-fast-tests:0.20.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localMrPowers/spark-fast-tests/0.20.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] Error downloading mrpowers:spark-daria:0.35.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localmrpowers/spark-daria/0.35.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] Total time: 7 s, completed 19-May-2021, 6:44:44 PM
[info] shutting down sbt server
Any idea how to resolve it?
Adding a screenshot of how my console looks:
The entire sbt file is showing in red, including the name, version, and scalaVersion.
This is likely caused by some missing configuration in IntelliJ; you should see some kind of popup that asks you to "Configure Scala SDK". If not, you can go to your module settings and add the Scala SDK.
When I compile, the following is the error I get:
If you look closely to the error, you should notice this message:
Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
The dependencies you are looking for ("MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11" and "mrpowers" % "spark-daria" % "0.35.0-s_2.11") are only available in the repository http://dl.bintray.com/spark-packages/maven/, which seems to require some authentication, as the HTTP error code 403 suggests.
Either configure authentication, or use more recent versions of these libraries, which are published on the public Maven Central repository:
"com.github.mrpowers" %% "spark-daria" % "0.39.0"
"com.github.mrpowers" %% "spark-fast-tests" % "0.23.0"
EDIT: how did I find that? I used https://mvnrepository.com/search?q=spark-daria to search for your dependencies and found the newer versions flagged as being in the Central repository.
Also note there are a few things you may want to change in your build.sbt (a corrected sketch follows this list):
Use the "sbt scheme" to pull in Scala dependencies without setting the Scala version manually:
use "com.github.mrpowers" %% "spark-daria" % "0.39.0"
instead of "com.github.mrpowers" % "spark-daria" % "0.39.0_2.11" (notice the double %% and the absence of the _2.11 suffix).
Do not declare the Scala library dependency "org.scala-lang" % "scala-library" % scalaVersion.value; it is provided implicitly.
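Putting those points together, a sketch of the corrected lines (the coordinates and versions are the Maven Central ones found above; scoping spark-fast-tests to Test is my assumption, since it is a testing library):

libraryDependencies ++= Seq(
  // %% appends the Scala binary version (_2.11 here) to the artifact name
  "com.github.mrpowers" %% "spark-daria"      % "0.39.0",
  "com.github.mrpowers" %% "spark-fast-tests" % "0.23.0" % Test
)
// and remove this line; sbt provides the Scala library implicitly:
// "org.scala-lang" % "scala-library" % scalaVersion.value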
The issue was related to the missing dependencies, i.e.:
"MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11",
"mrpowers" % "spark-daria" % "0.35.0-s_2.11"
After removing these from my build, I found they were also being pulled in by another dependency:
//dependency of fields performance specific common function
"abc-common" % "abc-common_2.11" % "3.0.0",
Removing them from there as well solved the issue.
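If you cannot change that other dependency itself, a possible alternative is an exclusion rule on it (a sketch using the anonymized abc-common coordinates from this build; excludeAll and ExclusionRule are standard sbt library-management API):

libraryDependencies += ("abc-common" % "abc-common_2.11" % "3.0.0")
  .excludeAll(
    // drop the unresolvable transitive artifacts from the old Bintray repo
    ExclusionRule(organization = "MrPowers"),
    ExclusionRule(organization = "mrpowers")
  )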
I am new to Scala and I am trying to import the following libraries in my build.sbt. When IntelliJ does an auto-update I get the following error:
Error while importing sbt project:
List([info] welcome to sbt 1.3.13 (Oracle Corporation Java 1.8.0_251)
[info] loading global plugins from C:\Users\diego\.sbt\1.0\plugins
[info] loading project definition from C:\Users\diego\development\Meetup\Stream-Processing\project
[info] loading settings for project stream-processing from build.sbt ...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] sbt server started at local:sbt-server-80d70f9339b81b4d026a
sbt:Stream-Processing>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from C:/Users/diego/.IntelliJIdea2019.3/config/plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] Total time: 2 s, completed Jun 28, 2020 12:11:24 PM
[info] shutting down sbt server)
This is my build.sbt file:
name := "Stream-Processing"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql-kafka-0-10_2.12
libraryDependencies += "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" %% "kafka-clients" % "2.3.1"
// https://mvnrepository.com/artifact/mysql/mysql-connector-java
libraryDependencies += "mysql" % "mysql-connector-java" % "8.0.18"
// https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1"
I made a Scala project just to make sure Spark works, and my Python project using Kafka works as well, so I am sure it's not a Spark/Kafka problem. Any reason why I am getting that error?
Try removing one % before "kafka-clients":
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.3.1"
The semantics of %% in sbt are that it appends the Scala version being used to the artifact name, so it becomes org.apache.kafka:kafka-clients_2.11:2.3.1, as the error message shows. Note the _2.11 suffix.
This is a nice shorthand for Scala libraries, but it can be confusing for beginners when used with Java libraries.
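To make the difference concrete, a small sketch (the spark-sql line is the one already in the question's build.sbt):

// Scala library: %% appends the Scala binary version,
// so on Scala 2.11 this resolves org.apache.spark:spark-sql_2.11:2.4.4
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4"

// Java library: a single % keeps the artifact name as-is,
// resolving org.apache.kafka:kafka-clients:2.3.1
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.3.1"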
I am new to Scala, so any help would be really appreciated.
I am using IntelliJ IDEA 2020.1, sbt.version=1.2.8, JDK 1.8.0_251, and Scala 2.12.8.
When I try to compile a Coursera project, I get the following error:
compile
[error] stack trace is suppressed; run 'last coursierResolutions' for the full output
[error] (coursierResolutions) java.lang.NoSuchMethodError: lmcoursier.definitions.ToCoursier$.project(Llmcoursier/definitions/Project;)Lcoursier/core/Project;
[error] Total time: 0 s, completed 23-Apr-2020 23:46:04
[IJ]sbt:bigdata-wikipedia> last coursierResolutions
[error] java.lang.NoSuchMethodError: lmcoursier.definitions.ToCoursier$.project(Llmcoursier/definitions/Project;)Lcoursier/core/Project;
[error] at coursier.sbtcoursier.ResolutionTasks$.$anonfun$resolutionsTask$3(ResolutionTasks.scala:43)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
This is my plugins.sbt, including the sbt-coursier plugin:
addSbtPlugin("io.get-coursier" % "sbt-coursier" % "2.0.0-RC3-5")
addSbtPlugin("ch.epfl.scala" % "sbt-scalafix" % "0.9.7")
Removing the coursier plugin and upgrading to sbt 1.3.13 worked for me.
build.properties
sbt.version=1.3.13
plugins.sbt
//addSbtPlugin("io.get-coursier" % "sbt-coursier" % "2.0.0-RC3-5")
addSbtPlugin("ch.epfl.scala" % "sbt-scalafix" % "0.9.7")
build.sbt
course := "bigdata"
assignment := "wikipedia"
scalaVersion := "2.12.11"
scalacOptions ++= Seq("-language:implicitConversions", "-deprecation")
libraryDependencies ++= Seq(
  "com.novocode" % "junit-interface" % "0.11" % Test,
  ("org.apache.spark" %% "spark-core" % "2.4.3"),
  ("org.apache.spark" %% "spark-sql" % "2.4.3")
)
dependencyOverrides ++= Seq(
  ("com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7")
)
testOptions in Test += Tests.Argument(TestFrameworks.JUnit, "-a", "-v", "-s")
Credit to @JOHN for this answer.
I used the sbt binary in the terminal directly with version 1.2.8 and it saved me (the sbt version of the sbt shell in IDEA was 1.3.8).
I was experiencing the same error while taking the course Functional Programming Principles in Scala by École Polytechnique Fédérale de Lausanne on Coursera (I assume you were taking that course too). For me the solution was using the following settings, where my changes were in the "sbt projects" section at the bottom:
After that, see how everything worked correctly for me from the sbt shell:
"C:\Program Files\AdoptOpenJDK\jdk-11.0.9.11-hotspot\bin\java.exe" -agentlib:jdwp=transport=dt_socket,address=localhost:63820,suspend=n,server=y -Xdebug -server -Xmx1536M -Dsbt.supershell=false -Didea.managed=true -Dfile.encoding=UTF-8 -Dsbt.log.noformat=true -jar C:\Users\jaime\AppData\Roaming\JetBrains\IdeaIC2020.2\plugins\Scala\launcher\sbt-launch.jar early(addPluginSbtFile=\"\"\"C:\Users\jaime\AppData\Local\Temp\idea1.sbt\"\"\") "; set ideaPort in Global := 63650 ; idea-shell"
Listening for transport dt_socket at address: 63820
[info] Loading settings for project global-plugins from idea1.sbt ...
[info] Loading global plugins from C:\Users\jaime\.sbt\1.0\plugins
[info] Updating ProjectRef(uri("file:/C:/Users/jaime/.sbt/1.0/plugins/"), "global-plugins")...
[info] downloading https://repo1.maven.org/maven2/io/github/sugakandrey/scala-compiler-indices-protocol_2.12/0.1.1/scala-compiler-indices-protocol_2.12-0.1.1.jar ...
[info] downloading https://repo1.maven.org/maven2/io/spray/spray-json_2.12/1.3.4/spray-json_2.12-1.3.4.jar ...
[info] downloading https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar ...
[info] [SUCCESSFUL ] io.github.sugakandrey#scala-compiler-indices-protocol_2.12;0.1.1!scala-compiler-indices-protocol_2.12.jar (380ms)
[info] [SUCCESSFUL ] io.spray#spray-json_2.12;1.3.4!spray-json_2.12.jar(bundle) (444ms)
[info] downloading https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.jetbrains/sbt-idea-shell/scala_2.12/sbt_1.0/2018.3/jars/sbt-idea-shell.jar ...
[info] downloading https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.jetbrains/sbt-idea-compiler-indices/scala_2.12/sbt_1.0/0.1.3/jars/sbt-idea-compiler-indices.jar ...
[info] [SUCCESSFUL ] org.jetbrains#sbt-structure-extractor;2018.2.1+4-88400d3f!sbt-structure-extractor.jar (1711ms)
[info] [SUCCESSFUL ] org.jetbrains#sbt-idea-shell;2018.3!sbt-idea-shell.jar (1790ms)
[info] [SUCCESSFUL ] org.jetbrains#sbt-idea-compiler-indices;0.1.3!sbt-idea-compiler-indices.jar (2273ms)
[info] Done updating.
[info] Loading settings for project example-build from buildSettings.sbt,plugins.sbt ...
[info] Loading project definition from C:\Users\jaime\Downloads\example\project
[info] Updating ProjectRef(uri("file:/C:/Users/jaime/Downloads/example/project/"), "example-build")...
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Compiling 2 Scala sources to C:\Users\jaime\Downloads\example\project\target\scala-2.12\sbt-1.0\classes ...
[info] Done compiling.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.google.protobuf.UnsafeUtil (file:/C:/Users/jaime/.sbt/boot/scala-2.12.7/org.scala-sbt/sbt/1.2.8/protobuf-java-3.3.1.jar) to field java.nio.Buffer.address
WARNING: Please consider reporting this to the maintainers of com.google.protobuf.UnsafeUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[info] Loading settings for project example from assignment.sbt,build.sbt ...
[info] Set current project to progfun1-example (in build file:/C:/Users/jaime/Downloads/example/)
[info] Defining Global / ideaPort
[info] The new value will be used by Compile / compile, Test / compile
[info] Reapplying settings...
[info] Set current project to progfun1-example (in build file:/C:/Users/jaime/Downloads/example/)
[IJ]sbt:progfun1-example> console
[info] Compiling 1 Scala source to C:\Users\jaime\Downloads\example\target\scala-2.13\classes ...
[info] Non-compiled module 'compiler-bridge_2.13' for Scala 2.13.0. Compiling...
[info] Compilation completed in 7.345s.
[info] Done compiling.
[info] Starting scala interpreter...
Welcome to Scala 2.13.0 (OpenJDK 64-Bit Server VM, Java 11.0.9).
Type in expressions for evaluation. Or try :help.
scala>
Yes, that solution worked for me partially, but I also had to change the build.sbt as follows:
course := "bigdata"
assignment := "wikipedia"
scalaVersion := "2.12.8"
scalacOptions ++= Seq("-language:implicitConversions", "-deprecation")
libraryDependencies ++= Seq(
  "com.novocode" % "junit-interface" % "0.11" % Test,
  ("org.apache.spark" %% "spark-core" % "2.4.3"),
  ("org.apache.spark" %% "spark-sql" % "2.4.3")
)
dependencyOverrides ++= Seq(
  ("com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7")
)
testOptions in Test += Tests.Argument(TestFrameworks.JUnit, "-a", "-v", "-s")
That worked for a relatively new Scala version and the recent Spark libraries built for it.
I'm using the following plugins and versions:
scalaVersion := "2.11.6"
sbt.version=1.2.7
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.7.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "5.0.1")
I'm getting the following stack trace:
java.lang.ArrayIndexOutOfBoundsException: 65791
[error] at scala.tools.asm.ClassWriter.findItemByIndex(ClassWriter.java:1755)
[error] at scala.tools.asm.MethodWriter.getSize(MethodWriter.java:2045)
[error] at scala.tools.asm.ClassWriter.toByteArray(ClassWriter.java:827)
[error] at scala.tools.nsc.backend.jvm.GenASM$JBuilder.writeIfNotTooBig(GenASM.scala:529)
[error] at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1343)
[error] at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.emitFor$1(GenASM.scala:197)
[error] at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:203)
[error] at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1500)
[error] at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1487)
[error] at scala.tools.nsc.Global$Run.compileSources(Global.scala:1482)
[error] at scala.tools.nsc.Global$Run.compile(Global.scala:1580)
[error] at xsbt.CachedCompiler0.run(CompilerInterface.scala:130)...
scalaVersion := "2.11.6"
sbt.version=1.2.7
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.7.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "5.0.1")
sbt-play-ebean 5.0.1 supports Scala versions 2.13.0-M5, 2.12, and 2.11.
So maybe you can try using Java 8, because Java 9 and up have a lot of incompatibilities.
Hope it helps.
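One way to confirm which JVM sbt is actually compiling with is a small throwaway task in build.sbt (a sketch; printJavaVersion is a made-up name, and sys.props is standard Scala):

// diagnostic task: run `sbt printJavaVersion` to see the JVM sbt runs on
lazy val printJavaVersion = taskKey[Unit]("Print the JVM version sbt runs on")
printJavaVersion := {
  println(s"java.version = ${sys.props("java.version")}")
  println(s"java.home    = ${sys.props("java.home")}")
}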
I have a Spark Ignite program. When I use sbt to build the program, it reports an error.
This is a very simple program: an empty project with just an sbt file.
build.sbt:
name := "testsss"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"
libraryDependencies += "org.apache.ignite" % "ignite-core" % "1.9.0"
libraryDependencies += "org.apache.ignite" % "ignite-indexing" % "1.9.0"
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "1.9.0"
The error is
Error: Error while importing SBT project: ...
[info] Resolving org.scala-sbt#testing;0.13.8 ...
[info] Resolving org.scala-sbt#test-agent;0.13.8 ...
[info] Resolving org.scala-sbt#test-interface;1.0 ...
[info] Resolving org.scala-sbt#main-settings;0.13.8 ...
[info] Resolving org.scala-sbt#apply-macro;0.13.8 ...
[info] Resolving org.scala-sbt#command;0.13.8 ...
[info] Resolving org.scala-sbt#logic;0.13.8 ...
[info] Resolving org.scala-sbt#precompiled-2_8_2;0.13.8 ...
[info] Resolving org.scala-sbt#precompiled-2_9_2;0.13.8 ...
[info] Resolving org.scala-sbt#precompiled-2_9_3;0.13.8 ...
[error] Modules were resolved with conflicting cross-version suffixes in {file:/Users/lzhan71/Documents/IgniteMaster/}ignitemaster:
[error]    org.scalatest:scalatest _2.11, _2.10
[error]    com.twitter:chill _2.11, _2.10
[error]    org.apache.spark:spark-unsafe _2.11, _2.10
[error]    org.apache.spark:spark-tags _2.11, _2.10
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[trace] Stack trace suppressed: run 'last *:ssExtractDependencies' for the full output.
[error] (*:update) Conflicting cross-version suffixes in: org.scalatest:scalatest, com.twitter:chill, org.apache.spark:spark-unsafe, org.apache.spark:spark-tags
[error] (*:ssExtractDependencies) Conflicting cross-version suffixes in: org.scalatest:scalatest, com.twitter:chill, org.apache.spark:spark-unsafe, org.apache.spark:spark-tags
[error] Total time: 17 s, completed Apr 28, 2017 4:41:00 PM
See complete log in /Users/lzhan71/Library/Logs/IdeaIC2016.2/sbt.last.log
I checked, and it seems that "ignite-spark" % "1.9.0" uses different versions of those conflicting dependencies than spark-core and spark-sql. How can I solve this?
This seems to be a bug in ignite-spark. It depends simultaneously on spark-unsafe_2.10 and scala-library 2.11.8, which should never happen. Report it and wait until a new version is released, or work around it by getting its source code, fixing the spark-unsafe_2.10 dependency, and recompiling (quite likely no source code fixes are needed).
You can also try fixing it like this:
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.2.0" excludeAll(ExclusionRule("org.apache.spark"))