Sbt dependency error with Spark Catalyst and Scala parser library - scala

I recently started learning and using Spark and Scala, and ran into the following dependency conflict:
Scala version : 2.13.8
SBT version : 1.6.2
Spark version required : 3.2.1
[error] (update) found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[error]
[error] * org.scala-lang.modules:scala-parser-combinators_2.13:2.1.1 (early-semver) is selected over 1.1.2
[error] +- ch.epfl.scala:hello-world_2.13:1.0 (depends on 2.1.1)
[error] +- org.apache.spark:spark-catalyst_2.13:3.2.1 (depends on 1.1.2)
[error]

Add this dependencyOverrides setting to your build.sbt file:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0"
... other libraries ...
dependencyOverrides += "org.scala-lang.modules" %% "scala-parser-combinators" % "2.1.1"
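For context, a minimal build.sbt sketch (the project name is taken from the error output; everything else is an assumption about the project layout) showing where the override goes:

```scala
// Minimal build.sbt sketch; versions follow the answer above.
name := "hello-world"
scalaVersion := "2.13.8"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0"

// Force one scala-parser-combinators version so sbt's eviction check
// (early-semver) no longer flags the 2.1.1 vs 1.1.2 conflict.
dependencyOverrides += "org.scala-lang.modules" %% "scala-parser-combinators" % "2.1.1"
```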

Related

SBT binary incompatible issue scala-parser-combinators

I am new to the Scala env and trying to build a test project using play with 'scalikejdbc' for SQL integration. Here is my build.sbt following the documents at http://scalikejdbc.org/documentation/playframework-support.html
name := """run"""
organization := "com.example"
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.13.10"
libraryDependencies ++= Seq(
guice,
"org.scalatestplus.play" %% "scalatestplus-play" % "5.0.0" % Test,
"com.h2database" % "h2" % "1.4.200", // your jdbc driver here
"org.scalikejdbc" %% "scalikejdbc" % "4.0.0",
"org.scalikejdbc" %% "scalikejdbc-config" % "4.0.0",
"org.scalikejdbc" %% "scalikejdbc-play-initializer" % "2.8.0-scalikejdbc-3.5",
"com.typesafe.play" %% "play-ws" % "2.3.1"
)
dependencyOverrides += "org.fluentlenium" % "fluentlenium-core" % "0.10.3"
// Adds additional packages into Twirl
//TwirlKeys.templateImports += "com.example.controllers._"
// Adds additional packages into conf/routes
// play.sbt.routes.RoutesKeys.routesImport += "com.example.binders._"
I have also added the following to plugins.sbt:
resolvers += Resolver.url("SBT Plugins", url("https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)
However, while trying to update the application before running it, I am getting the error below:
[error] (update) found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[error]
[error] * org.scala-lang.modules:scala-parser-combinators_2.13:2.1.0 (early-semver) is selected over {1.1.2}
[error] +- org.scalikejdbc:scalikejdbc-core_2.13:4.0.0 (depends on 2.1.0)
[error] +- com.typesafe.play:play_2.13:2.8.18 (depends on 1.1.2)
[error] +- com.typesafe:ssl-config-core_2.13:0.4.3 (depends on 1.1.2)
[error]
The relevant versions are:
sbt script version: 1.8.0
Play version: 2.8.18
Scala code runner version: 3.2.1, but using scalaVersion := "2.13.10" in sbt.
version conflict(s) in library dependencies; some are suspected to be binary incompatible
This means that at least two of your dependencies require a third one, but at two different versions that are suspected to be binary incompatible.
Since only one version can be included and used by your code, sbt is telling you: "I am going to pick one version, but there is a risk it won't work, because some of your code may rely on the other, possibly incompatible, version."
You have two options:
force a version (of scala-parser-combinators) with dependencyOverrides, for instance if you know that there is actually no incompatibility, at least in the way you use the libraries
upgrade or downgrade the other libraries (scalikejdbc or Play) so that they all depend on the same version of the conflicting one (scala-parser-combinators)
In this case, I would downgrade scalikejdbc, because there is no newer Play version (as of today).
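Option 1 as a sketch (assuming, as discussed, that the incompatibility does not actually affect how you use the libraries): pin the newer version that scalikejdbc already pulls in:

```scala
// build.sbt: force the version chosen by scalikejdbc 4.0.0, overriding
// the transitive 1.1.2 that Play 2.8.18 declares.
dependencyOverrides += "org.scala-lang.modules" %% "scala-parser-combinators" % "2.1.0"
```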

Playframework: cannot override sbt dependency

Here is a minimal reproducible example:
echo '
libraryDependencies += "org.apache.jena" % "apache-jena-libs" % "3.17.0" ;
lazy val testPlay = (project in file(".")) .enablePlugins(PlayScala)
' > build.sbt ;
mkdir project ;
echo 'addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.8.5")'
> project/plugins.sbt ;
sbt run
I have a dependency on a Java module (Apache Jena) that pulls in jackson-core 2.11.3, while the latest Play is at 2.10.4. Play does not start in dev mode:
JsonMappingException: Scala module 2.10.4 requires Jackson Databind version >= 2.10.0 and < 2.11.0.
I tried this:
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.10.4"
but it is not taken into account, as verified with show fullClasspath and dependencyTree.
I also tried excludeDependencies in two ways, specifying 2.10.4 or 2.11.3. My understanding is that:
a Java dependency cannot be changed by sbt's excludeDependencies
the function play.runsupport.Reloader.startDevMode() is not affected by the main sbt configuration, because it is part of the Play plugin; indeed, the class Reloader is not in play-server_2.12-2.8.5.jar
So what kind of hack could I try?
The sbt config:
https://github.com/jmvanel/semantic_forms/blob/master/scala/project/Common.scala#L44
The stack trace:
[error] com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.10.4 requires Jackson Databind version >= 2.10.0 and < 2.11.0
[error] at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
[error] at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:46)
[error] at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:17)
[error] at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:819)
[error] at akka.serialization.jackson.JacksonObjectMapperProvider$.$anonfun$configureObjectMapperModules$4(JacksonObjectMapperProvider.scala:223)
[error] at akka.serialization.jackson.JacksonObjectMapperProvider$.$anonfun$configureObjectMapperModules$4$adapted(JacksonObjectMapperProvider.scala:222)
[error] at scala.collection.immutable.List.foreach(List.scala:431)
[error] at akka.serialization.jackson.JacksonObjectMapperProvider$.configureObjectMapperModules(JacksonObjectMapperProvider.scala:222)
...
[error] at akka.actor.ActorSystem$.apply(ActorSystem.scala:290)
[error] at play.core.server.DevServerStart$.$anonfun$mainDev$1(DevServerStart.scala:248)
[error] at play.utils.Threads$.withContextClassLoader(Threads.scala:22)
[error] at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:76)
You can force a dependency with:
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.10.4" force()
Another option is to exclude the transitive dependency:
libraryDependencies += "org.apache.jena" % "apache-jena-libs" % "3.17.0" exclude("com.fasterxml.jackson.core", "jackson-databind")
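A variant worth trying (a sketch, not tested against this project): exclude jackson-databind only from the Jena dependency and declare the Play-compatible version directly, so a single 2.10.4 jar ends up on the classpath:

```scala
// build.sbt: drop Jena's transitive jackson-databind 2.11.3 and pin
// the 2.10.4 version that the Play/Akka Jackson module accepts.
libraryDependencies += ("org.apache.jena" % "apache-jena-libs" % "3.17.0")
  .exclude("com.fasterxml.jackson.core", "jackson-databind")
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.10.4"
```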

Scala IntelliJ library import errors

I am new to Scala and I am trying to import the following libraries in my build.sbt. When IntelliJ does an auto-update, I get the following error:
Error while importing sbt project:
List([info] welcome to sbt 1.3.13 (Oracle Corporation Java 1.8.0_251)
[info] loading global plugins from C:\Users\diego\.sbt\1.0\plugins
[info] loading project definition from C:\Users\diego\development\Meetup\Stream-Processing\project
[info] loading settings for project stream-processing from build.sbt ...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] sbt server started at local:sbt-server-80d70f9339b81b4d026a
sbt:Stream-Processing>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from C:/Users/diego/.IntelliJIdea2019.3/config/plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] Total time: 2 s, completed Jun 28, 2020 12:11:24 PM
[info] shutting down sbt server)
This is my build.sbt file:
name := "Stream-Processing"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql-kafka-0-10_2.12
libraryDependencies += "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" %% "kafka-clients" % "2.3.1"
// https://mvnrepository.com/artifact/mysql/mysql-connector-java
libraryDependencies += "mysql" % "mysql-connector-java" % "8.0.18"
// https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1"
I made a Scala project just to make sure Spark works, and my Python project using Kafka works as well, so I am sure it's not a Spark/Kafka problem. Any idea why I am getting that error?
Try removing one % before "kafka-clients":
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.3.1"
The semantics of %% in SBT is that it appends the Scala version being used to the artifact name, so it becomes org.apache.kafka:kafka-clients_2.11:2.3.1 as the error message shows as well. Note the _2.11 suffix.
This is a nice shorthand for Scala libraries, but it can get confusing for beginners when used with Java libs.
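To illustrate (with a hypothetical helper, not sbt API), the name mangling that %% performs can be sketched in plain Scala:

```scala
object CrossVersionDemo extends App {
  // The binary Scala version sbt would use for scalaVersion := "2.11.8".
  val scalaBinaryVersion = "2.11"

  // Hypothetical helper mimicking what %% does: append the binary
  // Scala version to the artifact name before resolving it.
  def crossName(artifact: String): String =
    s"${artifact}_$scalaBinaryVersion"

  // %% turns "kafka-clients" into "kafka-clients_2.11", which does not
  // exist on Maven Central because kafka-clients is a plain Java jar.
  println(crossName("kafka-clients")) // kafka-clients_2.11
}
```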

Compiling a Scala program fails due to dependencies not found

I have installed Flink, Scala and sbt
Flink Version: 1.9.1
Scala Version: 2.10.6
Sbt Version: 1.3.7
I made the relevant changes in build.sbt, but the compile command is failing.
Here is the relevant information; any help is greatly appreciated.
** Versions Information
[osboxes@osboxes local]$ scala -version
Scala code runner version 2.10.6 -- Copyright 2002-2013, LAMP/EPFL
[osboxes@osboxes local]$ flink --version
Version: 1.9.1, Commit ID: 4d56de8
[osboxes@osboxes readcsvfile]$ sbt -version
sbt version in this project: 1.3.7
sbt script version: 1.3.7
** build.sbt changes
val flinkVersion = "1.9.1"
val flinkDependencies = Seq(
"org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
"org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided")
** Compile Errors
sbt:readCsvfile> compile
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.flink:flink-streaming-scala_2.13:1.9.1
[error] Not found
[error] Not found
[error] not found: /home/osboxes/.ivy2/local/org.apache.flink/flink-streaming-scala_2.13/1.9.1/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/flink/flink-streaming-scala_2.13/1.9.1/flink-streaming-scala_2.13-1.9.1.pom
[error] Error downloading org.apache.flink:flink-scala_2.13:1.9.1
[error] Not found
[error] Not found
[error] not found: /home/osboxes/.ivy2/local/org.apache.flink/flink-scala_2.13/1.9.1/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/flink/flink-scala_2.13/1.9.1/flink-scala_2.13-1.9.1.pom
[error] Total time: 4 s, completed Jan 30, 2020 3:59:12 PM
sbt:readCsvfile>
A few points I want to mention regarding the sbt dependency issues:
Add scalaVersion := "2.12.11" to your build.sbt file like this; %% then appends the Scala version to your dependencies' artifact names automatically.
name := "flink-streaming-demo"
scalaVersion := "2.12.11"
val flinkVersion = "1.10.0"
libraryDependencies += "org.apache.flink" %% "flink-scala" % flinkVersion % "provided"
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided"
If you want to pin the Scala version of a dependency explicitly, use % like this:
libraryDependencies += "org.apache.flink" % "flink-scala_2.12" % flinkVersion % "provided"
libraryDependencies += "org.apache.flink" % "flink-streaming-scala_2.12" % flinkVersion % "provided"
In the worst case, if none of this works, simply delete or rename the hidden .sbt and .ivy2 folders in your home directory, where all your dependencies and plugins are stored after being downloaded from Maven Central, and then refresh/build the sbt project.
SBT dependency format
libraryDependencies += groupID % artifactID % revision % configuration
Meaning of % and %%
%: A method used to construct an Ivy Module ID from the strings you supply.
%%: When used after the groupID, it automatically adds your project’s Scala version (such as _2.12) to the end of the artifact name.
Summing up the comments, since it is perhaps a bit hard to know what you should do:
In general, if you get an "Unresolved dependencies" error, look at mvnrepository.com, search for your artifact:
https://mvnrepository.com/artifact/org.apache.flink/flink-scala
This tells you (second column) which Scala versions are supported by it. In this case, the library is available for 2.11.x and 2.12.x.
Thus, you have to use a Scala version compatible with that in your build, in build.sbt:
ThisBuild / scalaVersion := "2.12.10"
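Putting that together, a build.sbt sketch (versions as in this answer):

```scala
// With a 2.12 scalaVersion, %% resolves the Flink artifacts to the
// _2.12 variants that actually exist on Maven Central.
ThisBuild / scalaVersion := "2.12.10"

val flinkVersion = "1.9.1"
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"           % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided"
)
```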

DataStax Cassandra - Scala Spark Application - SBT Build failure

I have a simple demo Scala application that reads from a file and outputs to the screen. I am trying to build this application using sbt and submit it to DataStax Spark. The sbt instructions in the DataStax documentation seem to be incomplete. https://docs.datastax.com/en/dse/6.0/dse-dev/datastax_enterprise/spark/sparkJavaApi.html
Using this as-is did not work because of the missing link to the DataStax repo.
After searching around for a bit, found a sample build.sbt file from https://github.com/datastax/SparkBuildExamples/blob/master/scala/sbt/dse/build.sbt which went the furthest.
This one is failing here:
[error] unresolved dependency: org.apache.directory.api#api-ldap-codec-standalone;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-extras-codec;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-net-mina;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-codec-core;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-extras-aci;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-extras-codec-api;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-model;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-asn1-ber;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-util;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-asn1-api;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-i18n;1.0.0.2.dse: not found
The key sections of build.sbt are:
scalaVersion := "2.11.8"
resolvers += Resolver.mavenLocal // for testing
resolvers += "DataStax Repo" at "https://repo.datastax.com/public-repos/"
val dseVersion = "6.0.0"
libraryDependencies += "com.datastax.dse" % "dse-spark-dependencies" % dseVersion % "provided" exclude(
"org.slf4j", "log4j-over-slf4j", "org.apache.directory.api")
libraryDependencies ++= Seq(
"junit" % "junit" % "4.12" % "test"
).map(_.excludeAll(
ExclusionRule("org.slf4j","log4j-over-slf4j"),
ExclusionRule("org.slf4j","slf4j-log4j12"))
) // Excluded to allow for Cassandra to run embedded
It seems to be a broken dependency. Can you please advise?
Please try with the following dependency:
scalaVersion := "2.11.8"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.9"
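A minimal surrounding build.sbt sketch (the Spark version and the provided scope are assumptions; match them to what your DSE cluster ships):

```scala
// build.sbt sketch: the connector is bundled with the app, while Spark
// itself is provided by the DSE cluster at runtime.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-sql"                 % "2.2.3" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.9"
)
```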