sbt unable to import apache library - scala

I am trying to set up a simple Kafka consumer app that consumes messages from a secure (SSL) Kafka cluster.
Here is my sbt build file:
version := "0.1"
//libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.2.0" % "provided"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka
libraryDependencies += "org.apache.kafka" %% "kafka" % "6.1.0-ccs"
scalaVersion := "2.13.6"
My actual consumer code is:
package main.scala.kafka

import java.time.Duration
import java.util
import java.util.Properties
import org.apache.kafka.clients.consumer.KafkaConsumer
import scala.jdk.CollectionConverters._ // needed for records.asScala on Scala 2.13

object consumer extends App {
  val TOPIC = "amg-dev-time"

  val props = new Properties()
  props.put("bootstrap.servers", "kafka-localhost.net:9093")
  props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
  props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
  props.put("group.id", "amlng-dev-realtime")

  val consumer = new KafkaConsumer[String, String](props)
  consumer.subscribe(util.Collections.singletonList(TOPIC))

  while (true) {
    // poll(long) is deprecated; the Duration overload behaves the same here
    val records = consumer.poll(Duration.ofMillis(100))
    for (record <- records.asScala) {
      println(record)
    }
  }
}
When I run the above setup I am getting:
object apache is not a member of package org
import org.apache.kafka.clients.consumer.KafkaConsumer
Help me fix this, and help me with an example of how to connect to a secure Kafka cluster to consume messages in Scala.
Upon refreshing the sbt build file, I am getting:
[error] Not found
[error] Not found
[error] not found: /Users/h0j020h/.ivy2/localorg.apache.kafka/kafka_2.13/6.1.0-ccs/ivys/ivy.xml
[error] download error: Caught javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target (PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target) while downloading https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.13/6.1.0-ccs/kafka_2.13-6.1.0-ccs.pom
[error] not found: https://repository.com/content/repositories/pangaea_releases/org/apache/kafka/kafka_2.13/6.1.0-ccs/kafka_2.13-6.1.0-ccs.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka_2.13:6.1.0-ccs
[error] Not found
[error] Not found
[error] not found: /Users/h0j020h/.ivy2/localorg.apache.kafka/kafka_2.13/6.1.0-ccs/ivys/ivy.xml
[error] download error: Caught javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target (PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target) while downloading https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.13/6.1.0-ccs/kafka_2.13-6.1.0-ccs.pom
[error] not found: https://repository.com/content/repositories/pangaea_releases/org/apache/kafka/kafka_2.13/6.1.0-ccs/kafka_2.13-6.1.0-ccs.pom
[error] Total time: 30 s, completed 19-Oct-2021, 6:21:47 pm
[info] shutting down sbt server

Kafka 6.1.0-ccs is hosted on Confluent.
Try to add:
resolvers += "confluent" at "https://packages.confluent.io/maven/"
It should solve the problem (example on scastie)
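As for the second part of the question (connecting to a secure cluster), here is a minimal sketch of the SSL-related consumer properties, assuming the brokers expose an SSL listener on port 9093 and that JKS trust/key stores are used; the paths and passwords are placeholders, not values from the original post:
// Hypothetical SSL settings, added to the Properties object before building the consumer.
props.put("security.protocol", "SSL")
props.put("ssl.truststore.location", "/path/to/client.truststore.jks")
props.put("ssl.truststore.password", "changeit")
// Only needed if the brokers require client (mutual TLS) authentication:
props.put("ssl.keystore.location", "/path/to/client.keystore.jks")
props.put("ssl.keystore.password", "changeit")
props.put("ssl.key.password", "changeit")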

Related

intellij: errors with scala, spark

I have a Scala project using Spark libraries, and it works fine most of the time (using IntelliJ). But sometimes it starts giving errors on IntelliJ launch:
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] Total time: 1 s, completed 17 Sep 2022, 15:13:49
[info] shutting down sbt server
build.sbt is:
/*ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.8"
lazy val root = (project in file("."))
.settings(
name := "spark-learning"
)*/
// Name of the package
name := "spark-learning"
// Version of our package
version := "1.0"
// Version of Scala
scalaVersion := "2.12.14"
// Spark library dependencies
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "3.0.0-preview2",
"org.apache.spark" %% "spark-sql" % "3.0.0-preview2"
)
What causes these issues all of a sudden? And how can I get rid of them?
One weird thing in the error messages is the path where it looks for dependencies: ...ivy2\localorg.apache.spark\....
There should be a \ after local.
Any chance your sbt repositories configuration could be messed up? I'm not sure where it lives on Windows; it's typically /etc/sbt/repositories on Linux, but IntelliJ may have its own setting.
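For reference, a minimal sketch of the standard sbt repositories file format, using the predefined local and maven-central entries; this is the generic format, not the poster's actual file, and the file's location varies by OS and IDE setup:
[repositories]
  local
  maven-central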

Unresolved dependencies path SBT - Scala Intellij Project

I have newly installed and created a Spark, Scala, sbt development environment in IntelliJ, but when I try to compile with sbt I am getting an unresolved-dependencies error.
Below is my sbt file:
name := "xxxxxxxxxxxxxxxxxxxx"
version := "0.1"
scalaVersion := "2.11.8"
val sparkVersion = "2.3.1"
val jacksonCore = "2.6.7"
val publishMavenStyle = true
resolvers ++= Seq(
"Artifactory" at "https://binrepo.xxxxxx.com/artifactory/xyz/",
"Artifactory Common" at "https://binrepo.xxxxxx.com/artifactory/data-engineering-gscl-abc/",
//"ArtifactorySnapShots" at "https://binrepo.xxxxxx.com/artifactory/xyz/SNAPSHOTS/"
("Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven").withAllowInsecureProtocol(true)
)
libraryDependencies ++= Seq(
"org.scalamock" %% "scalamock" % "4.4.0" % Test,
//dependency of xyz-core.
"com.tgt.dsc.xyz.datapipeline" % "xyz-core_2.11" % "1.8.1",
//dependency of fields performance specific common function
"abc-common" % "abc-common_2.11" % "3.0.0",
//dependency of reading configuration
"com.typesafe" % "config" % "1.3.3",
//spark core libraries, in the production or for spark-submit in local add provided so that dependent jar is not part of assembly jar
// ex :"org.apache.spark" %% "spark-core" % sparkVersion % provided,
"org.apache.spark" %% "spark-core" % sparkVersion % Provided ,
"org.apache.spark" %% "spark-sql" % sparkVersion % Provided ,
"org.apache.spark" %% "spark-hive" % sparkVersion % Provided ,
"org.scala-lang" % "scala-library" % scalaVersion.value,
//logging library
"org.slf4j" % "slf4j-api" % "1.7.29",
//for doing testing
"org.scalatest" %% "scalatest" % "3.1.0" % Test,
"MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11",
"mrpowers" % "spark-daria" % "0.35.0-s_2.11"
)
enablePlugins(GitVersioning)
assemblyJarName in assembly := s"${name.value}_${scalaVersion.value}-${version.value}.jar"
assemblyMergeStrategy in assembly := {
//case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
case PathList("META-INF", xs#_*) => MergeStrategy.discard
case x => MergeStrategy.first
}
//get the token from secrets
val token = sys.env.getOrElse("SONAR_TOKEN", "")
//configurations for sonar integration
sonarProperties ++= Map(
"sonar.host.url" -> "http://sonarqube.xxxxxx.com:9000",
"sonar.scala.version" -> "2.11",
"sonar.projectName" -> "xyz-starter",
"sonar.projectKey" -> "xyz-starter",
"sonar.sources" -> "src/main/scala",
"sonar.tests" -> "src/test/scala",
"sonar.scala.coverage.reportPaths" -> "xxxxxx/scala-2.11/coverage-report/cobertura.xml,xxxxxx/scala-2.11/scapegoat-report/scapegoat.xml",
"sonar.login" -> token,
"sonar.buildbreaker.skip" -> "false"
)
//how much code coverage is needed
coverageMinimum := 80
//fail the build if code coverage is not met
coverageFailOnMinimum := true
The entire sbt file is showing in red, including the name, version, and scalaVersion.
When I compile, the following is the error I am getting:
/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java -Djline.terminal=jline.UnsupportedTerminal -Dsbt.log.noformat=true -Dfile.encoding=UTF-8 -Didea.managed=true -Dfile.encoding=UTF-8 -jar /Users/xxxxxxx/Library/Application Support/JetBrains/IdeaIC2020.2/plugins/Scala/launcher/sbt-launch.jar
[info] welcome to sbt 1.4.5 (AdoptOpenJDK Java 11.0.9)
[info] loading global plugins from /Users/xxxxxxx/.sbt/1.0/plugins
[info] loading settings for project xxxxxxxxxxxxxxxxxxxxxxxxxx-build from assembly.sbt ...
[info] loading project definition from /Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/project
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] loading settings for project xxxxxxxxxxxxxxxxxxxxxxxxxx from build.sbt ...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[info] sbt server started at local:///Users/xxxxxxx/.sbt/1.0/server/5231834612cf96406db7/sock
[info] started sbt server
sbt:xxxxxxxxxxxxxxxxxxxxxxxxxx>
;set _root_.scala.collection.Seq(historyPath := None,shellPrompt := { _ => "" }
,SettingKey[_root_.scala.Option[_root_.sbt.File]]("sbtStructureOutputFile")
in _root_.sbt.Global := _root_.scala.Some(_root_.sbt.file("/private/var/folders/6p/qvsthwj11q38nxlpn72hv6fr0000gq/T/sbt-structure.xml"))
,SettingKey[_root_.java.lang.String]
("sbtStructureOptions")
in _root_.sbt.Global := "download, resolveClassifiers")
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /Users/xxxxxxx/Library/Application Support/JetBrains/IdeaIC2020.2/plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading MrPowers:spark-fast-tests:0.20.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localMrPowers/spark-fast-tests/0.20.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] Error downloading mrpowers:spark-daria:0.35.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localmrpowers/spark-daria/0.35.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading MrPowers:spark-fast-tests:0.20.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localMrPowers/spark-fast-tests/0.20.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] Error downloading mrpowers:spark-daria:0.35.0-s_2.11
[error] Not found
[error] Not found
[error] not found: /Users/xxxxxxx/.ivy2/localmrpowers/spark-daria/0.35.0-s_2.11/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/kelsa/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] Total time: 7 s, completed 19-May-2021, 6:44:44 PM
[info] shutting down sbt server
Any idea on how to resolve it?
The entire sbt file is showing in red, including the name, version, and scalaVersion
This is likely caused by some missing configuration in IntelliJ; you should have some kind of popup that asks you to "configure Scala SDK". If not, you can go to your module settings and add the Scala SDK.
When I compile, the following is the error I am getting:
If you look closely at the error, you should notice this message:
Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
The dependencies you are looking for ("MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11" and "mrpowers" % "spark-daria" % "0.35.0-s_2.11") are only available in the repository http://dl.bintray.com/spark-packages/maven/, which seems to require some authentication, as the HTTP error code 403 suggests.
Either you can configure authentication, or you can use more recent versions of these libraries that are published on the public Maven central repository:
"com.github.mrpowers" %% "spark-daria" % "0.39.0"
"com.github.mrpowers" %% "spark-fast-tests" % "0.23.0"
EDIT: how did I find that? I used https://mvnrepository.com/search?q=spark-daria to search for your dependencies and found the newer ones via the Central repository flag.
Also note there are a few things you may want to change in your build.sbt:
Use the "sbt scheme" to use Scala dependencies without manually setting the Scala version:
use "com.github.mrpowers" %% "spark-daria" % "0.39.0"
instead of "com.github.mrpowers" % "spark-daria" % "0.39.0_2.11" (notice the double %% and the absence of suffix _2.11)
Do not declare the Scala library dependency "org.scala-lang" % "scala-library" % scalaVersion.value, this is implicitly provided
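Putting those suggestions together, the corrected dependency lines would look roughly like this (the Test scope on spark-fast-tests is an assumption, based on how the original build scopes its other test libraries):
libraryDependencies ++= Seq(
  "com.github.mrpowers" %% "spark-daria" % "0.39.0",
  "com.github.mrpowers" %% "spark-fast-tests" % "0.23.0" % Test
)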
The issue was related to the missing dependencies, i.e.
"MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11",
"mrpowers" % "spark-daria" % "0.35.0-s_2.11"
After removing these from my build, I found they were also being pulled in by another dependency:
//dependency of fields performance specific common function
"abc-common" % "abc-common_2.11" % "3.0.0",
Removing it from there as well solved the issue.

sbt error: object spark is not a member of package org.apache

I installed sbt-1.3.4.msi and when trying to build a sample SparkPi.scala app, I'm getting the following error:
C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("Spark Pi")
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error] val spark = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed
The SparkPi.scala file is in C:\myapps\sbt\sparksample\project\src\main\scala (as shown in the error messages above).
What am I missing here?
The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.12.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
C:\myapps\sbt\sparksample\project\src\main\scala directory has SparkPi.scala file
That's the problem. You've got the Scala file(s) under the project directory, which is owned by sbt itself (not by your sbt-managed Scala project).
Move SparkPi.scala and the other Scala files to C:\myapps\sbt\sparksample\src\main\scala.
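For reference, a sketch of the standard sbt layout the answer describes; project/ holds sbt's own build definition, while application sources live under src/:
sparksample/
  sparksample.sbt          (the build definition)
  project/                 (sbt's metabuild -- no application sources here)
  src/
    main/
      scala/
        SparkPi.scala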

Error downloading net.cakesolutions:scala-kafka-client - Not Found

I'm trying to add Kafka to my sbt build, but when I click on "import changes" I get an error:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading net.cakesolutions:scala-kafka-client_2.13:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\macca\.ivy2\local\net.cakesolutions\scala-kafka-client_2.13\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/net/cakesolutions/scala-kafka-client_2.13/2.3.1/scala-kafka-client_2.13-2.3.1.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading net.cakesolutions:scala-kafka-client_2.13:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\macca\.ivy2\local\net.cakesolutions\scala-kafka-client_2.13\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/net/cakesolutions/scala-kafka-client_2.13/2.3.1/scala-kafka-client_2.13-2.3.1.pom
[error] Total time: 1 s, completed 19:56:34 26/04/2020
[info] shutting down sbt server
build.sbt:
name := "KafkaProducer"
version := "0.1"
scalaVersion := "2.13.0"
libraryDependencies ++= Seq(
"io.circe" %% "circe-parser" % "0.12.3",
"net.cakesolutions" %% "scala-kafka-client" % "2.3.1"
)
Per the github page for scala-kafka-client, you'll need to add a bintray resolver to your build.sbt:
resolvers += Resolver.bintrayRepo("cakesolutions", "maven")
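For context, a sketch of the build.sbt with that resolver added; everything else is unchanged from the question:
name := "KafkaProducer"
version := "0.1"
scalaVersion := "2.13.0"

resolvers += Resolver.bintrayRepo("cakesolutions", "maven")

libraryDependencies ++= Seq(
  "io.circe" %% "circe-parser" % "0.12.3",
  "net.cakesolutions" %% "scala-kafka-client" % "2.3.1"
)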
As of today, Scala is still not binary compatible between versions and has a tendency toward serious breaking changes between "minor" (2.10 -> 2.11 -> 2.12 -> 2.13) releases.
This leads to a situation where maintainers are relatively slow to adopt new versions.
e.g. Apache Spark only recently started supporting 2.12 in its latest stable version, and it has only just become the default.
So if I want to run this with 2.13 I have three options:
building and publishing the library locally with sbt publishLocal
using the standard Java client instead
nagging the maintainer of the Scala package to publish artifacts for 2.13
But I've decided to solve it by just downgrading Scala to 2.12.

Scala not working with heroku example

I'm following a tutorial to create a Scala web app with Heroku here: https://devcenter.heroku.com/articles/scala
I've copied their example exactly, but when I run
sbt clean compile stage
It fails to compile because of these errors:
[error] /home/ajcrites/dev/dyl/src/main/scala/Web.scala:1: object jboss is not a member of package org
[error] import org.jboss.netty.handler.codec.http.{HttpRequest, HttpResponse}
[error] ^
[error] /home/ajcrites/dev/dyl/src/main/scala/Web.scala:2: object twitter is not a member of package com
[error] import com.twitter.finagle.builder.ServerBuilder
[error] ^
[error] /home/ajcrites/dev/dyl/src/main/scala/Web.scala:3: object twitter is not a member of package com
[error] import com.twitter.finagle.http.{Http, Response}
[error] ^
[error] /home/ajcrites/dev/dyl/src/main/scala/Web.scala:4: object twitter is not a member of package com
[error] import com.twitter.finagle.Service
[error] ^
[error] /home/ajcrites/dev/dyl/src/main/scala/Web.scala:5: object twitter is not a member of package com
[error] import com.twitter.util.Future
[error] ^
[error] 5 errors found
Basically, I think it has to do with Finagle not being available or not being among the packages I have. However, I have no idea how to install Finagle, and there are no instructions in the tutorial above nor at https://github.com/twitter/finagle
What can I do to get this to compile?
It will depend on the versions of Scala and Finagle you want to use, but to add Finagle to the project, just add the following to build.sbt:
libraryDependencies += "com.twitter" % "finagle-core_2.9.1" % "1.11.0" exclude("org.apache.thrift", "libthrift")
libraryDependencies += "com.twitter" % "finagle-http_2.9.1" % "1.11.0"
libraryDependencies += "com.twitter" % "finagle-serversets_2.9.1" % "1.11.0" excludeAll(
ExclusionRule(organization = "com.sun.jdmk"),
ExclusionRule(organization = "com.sun.jmx"),
ExclusionRule(organization = "javax.jms")
)
This example is about 3 months old, so I'm sure you can get a newer version of Finagle.
I tried the code and it worked for me. Perhaps see if the source on GitHub works: https://github.com/heroku/devcenter-scala
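Since the answer notes newer Finagle versions exist: later releases replaced the ServerBuilder/Netty-based API shown in the question with finagle-http's own Request and Response types. A minimal sketch against that newer API, assuming a recent "com.twitter" %% "finagle-http" dependency; the port and response body are arbitrary choices, not from the tutorial:
import com.twitter.finagle.{Http, Service}
import com.twitter.finagle.http.{Request, Response}
import com.twitter.util.{Await, Future}

object Web extends App {
  // A trivial service that answers every request with "hello".
  val service = new Service[Request, Response] {
    def apply(req: Request): Future[Response] = {
      val rep = Response()
      rep.contentString = "hello"
      Future.value(rep)
    }
  }

  // Bind to port 8080 and block until the server shuts down.
  val server = Http.serve(":8080", service)
  Await.ready(server)
}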