"Unknown artifact. Not resolved or indexed" - scala

This is my sbt file:
name := "spark-twitter-stream-example"
version := "1.0.0"
scalaVersion := "2.13.4"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming" % "2.0.1",
  "org.apache.bahir" % "spark-streaming-twitter" % "2.0.1"
)
I get this error:
"Unknown artifact. Not resolved or indexed"
How can I update the dependencies in my build.sbt? I need a solution, please.

This is what worked for me (newer versions). The original build fails because % does not append the Scala binary suffix to the artifact name, and those Spark and Bahir versions were never published for Scala 2.13; switching to %% and Scala 2.12, with versions that actually exist on Maven Central, resolves it:
name := "spark-twitter-stream-example"
version := "1.0.0"
scalaVersion := "2.12.4"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.0.1" % "provided"
libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.4.0"

Related

sbt complains about JmhPlugin not found

I'm trying to use sbt JmhPlugin and I'm following the instructions found here: https://github.com/sbt/sbt-jmh
So I added the plugin to project/plugins.sbt and the enablePlugins(JmhPlugin) line to build.sbt, so my build files look like this:
project/plugins.sbt:
addSbtPlugin("pl.project13.scala" % "sbt-jmh" % "0.4.4")
project/build.properties:
sbt.version = 1.8.2
build.sbt:
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.10"
lazy val root = (project in file("."))
  .settings(
    name := "myproj"
  )
libraryDependencies += "org.scalactic" %% "scalactic" % "3.2.15"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.15" % "test"
libraryDependencies += "org.typelevel" %% "cats-effect" % "2.5.3"
val catsVersion = "2.9.0"
libraryDependencies += "org.typelevel" %% "cats-core" % catsVersion
libraryDependencies += "org.typelevel" %% "cats-free" % catsVersion
libraryDependencies += "org.typelevel" %% "cats-laws" % catsVersion
libraryDependencies += "org.typelevel" %% "cats-mtl-core" % "0.7.1"
libraryDependencies += "org.typelevel" %% "simulacrum" % "1.0.1"
libraryDependencies += "org.scalamacros" %% "resetallattrs" % "1.0.0"
libraryDependencies += "org.scalameta" %% "munit" % "0.7.22"
libraryDependencies += "org.typelevel" %% "discipline-munit" % "1.0.6"
scalacOptions ++= Seq(
  "-deprecation",
  "-encoding", "UTF-8",
  "-feature",
  "-language:_",
  "-Ymacro-annotations"
)
enablePlugins(JmhPlugin)
but when I run sbt build, it complains that it cannot find the JmhPlugin:
error: not found: value JmhPlugin
enablePlugins(JmhPlugin)
^
What am I doing wrong here? Also, how should I debug this issue?
Thanks!
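An error like not found: value JmhPlugin at load time usually means sbt never loaded the plugin at all, so the JmhPlugin value is simply not in scope. Two suggestions, offered as guesses rather than a confirmed fix: make sure you start sbt from the directory that contains both build.sbt and the project/ folder, and run the plugins command in the sbt shell to check whether pl.project13.scala.sbt.JmhPlugin is listed. It is also more conventional to attach the plugin to the project definition rather than make a bare top-level call, roughly like this:
// build.sbt sketch: enable sbt-jmh on the root project itself
lazy val root = (project in file("."))
  .enablePlugins(JmhPlugin)
  .settings(
    name := "myproj"
  )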

sbt package not adding dependencies

I am trying to build a jar using sbt package.
build.sbt:
name := "Simple Project"
version := "0.1"
scalaVersion := "2.11.8"
val sparkVersion = "2.3.2"
val connectorVersion = "2.3.0"
val cassandraVersion = "3.11"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "org.scalaj" %% "scalaj-http" % "2.4.2",
  "com.datastax.spark" %% "spark-cassandra-connector" % connectorVersion
)
sbt package runs successfully but does not add spark-cassandra-connector and scalaj-http to the final jar.
Do I need to add anything?
If you want the jar to contain all of your dependencies, you have to use the sbt-assembly plugin:
https://github.com/sbt/sbt-assembly
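A minimal setup sketch, assuming a recent plugin release (check the README for the current version number):
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")
Then run sbt assembly instead of sbt package; the fat jar it writes under target/scala-2.11/ will bundle spark-cassandra-connector and scalaj-http, while the dependencies marked "provided" stay excluded, which is what you want for spark-submit.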

build.sbt: how to add spark dependencies

Hello, I am trying to pull in spark-core, spark-streaming, twitter4j, and spark-streaming-twitter via the build.sbt file below:
name := "hello"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
libraryDependencies ++= Seq(
  "org.twitter4j" % "twitter4j-core" % "3.0.3",
  "org.twitter4j" % "twitter4j-stream" % "3.0.3"
)
libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.0-incubating"
I simply took these libraryDependencies from examples online, so I am not sure which versions, etc. to use.
Can someone please explain how I should fix this .sbt file? I spent a couple of hours trying to figure it out, but none of the suggestions worked. I installed Scala through Homebrew and I am on version 2.11.8.
All of my errors were about:
Modules were resolved with conflicting cross-version suffixes.
The problem is that you are mixing Scala 2.11 and 2.10 artifacts. You have:
scalaVersion := "2.11.8"
And then:
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
where the 2.10 artifact is required. You are also mixing Spark versions instead of using one consistent version:
// spark 1.6.1
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
// spark 1.4.1
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
// spark 0.9.0-incubating
libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.0-incubating"
Here is a build.sbt that fixes both problems:
name := "hello"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "1.6.1"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)
You also don't need to manually add twitter4j dependencies since they are added transitively by spark-streaming-twitter.
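For reference, %% simply appends the project's Scala binary version to the artifact name, so with scalaVersion := "2.11.8" these two lines resolve to exactly the same artifact:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1"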
It works for me:
name := "spark_local"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.twitter4j" % "twitter4j-core" % "3.0.5",
  "org.twitter4j" % "twitter4j-stream" % "3.0.5",
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.spark" %% "spark-sql" % "2.0.0",
  "org.apache.spark" %% "spark-mllib" % "2.0.0",
  "org.apache.spark" %% "spark-streaming" % "2.0.0"
)

Error adding libraryDependencies "io.prediction"

Good afternoon,
I would like to ask about an error I get when I try to add the "io.prediction" libraryDependencies. Here's my code:
name := "SBTMaret2016"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += toGroupID("org.scalatest") % "scalatest_2.10" % "2.0" % "test" (a)
libraryDependencies += toGroupID("io.prediction") %% "core" % "0.9.5" % "provided" (b)
libraryDependencies += toGroupID("org.apache.spark") %% "spark-core" % "1.3.0" % "provided" (c)
libraryDependencies += toGroupID("org.apache.spark") %% "spark-mllib" % "1.3.0" % "provided"(d)
The first time, I added (a), (c), and (d) and it worked!
When I re-added (b), it stopped working. Any ideas? Thanks!
If you look at http://mvnrepository.com/artifact/io.prediction, there are no artifacts published for Scala 2.11. You can switch to Scala 2.10.6, or get the source and compile it for 2.11 yourself. The Scalatest dependency is also incorrect; it should be toGroupID("org.scalatest") %% "scalatest". (Plus, you can remove all the toGroupID calls; the usual way to write this is libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided".)
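Putting that advice together, a sketch of a corrected build.sbt (the switch to Scala 2.10.6 follows the answer above; the versions are kept from the question):
name := "SBTMaret2016"
version := "1.0"
scalaVersion := "2.10.6"
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0" % "test"
libraryDependencies += "io.prediction" %% "core" % "0.9.5" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.0" % "provided"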

Trying to compile gensort.scala, getting: [error] impossible to get artifacts when data has not been loaded. IvyNode = net.java.dev.jets3t#jets3t;0.6.1

I am new to Scala and sbt and not sure how to proceed. Am I missing more dependencies?
Steps to reproduce:
save gensort.scala code in ~/spark-1.3.0/project/
begin build: my-server$ ~/spark-1.3.0/project/sbt
> run
gensort.scala:
gensort source
build definition file in ~/spark-1.3.0/project/build.sbt:
lazy val root = (project in file(".")).
  settings(
    name := "gensort",
    version := "1.0",
    scalaVersion := "2.11.6"
  )
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-examples_2.10" % "1.1.1",
  "org.apache.spark" % "spark-core_2.11" % "1.3.0",
  "org.apache.spark" % "spark-streaming-mqtt_2.11" % "1.3.0",
  "org.apache.spark" % "spark-streaming_2.11" % "1.3.0",
  "org.apache.spark" % "spark-network-common_2.10" % "1.2.0",
  "org.apache.spark" % "spark-network-shuffle_2.10" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
Greatly appreciate any insight on how to move forward. Thx! -Dennis
You should not mix 2.10 and 2.11; they are not binary compatible. Your libraryDependencies should look like this:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-examples" % "1.1.1",
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.spark" %% "spark-streaming-mqtt" % "1.3.0",
  "org.apache.spark" %% "spark-streaming" % "1.3.0",
  "org.apache.spark" %% "spark-network-common" % "1.2.0",
  "org.apache.spark" %% "spark-network-shuffle" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
The %% means that the Scala binary version is appended as a suffix to the artifact id. After this change I got an error because one dependency could not be resolved; it is hosted in this repository:
resolvers += "paho" at "https://repo.eclipse.org/content/repositories/paho-releases"
Nevertheless, it seems that spark-examples is not published for 2.11. Changing the Scala version to
scalaVersion := "2.10.5"
solved all dependency problems, and compilation succeeded.
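Assembled into one sketch, the corrected build definition would look like this (versions and resolver as in the answer above):
lazy val root = (project in file(".")).
  settings(
    name := "gensort",
    version := "1.0",
    scalaVersion := "2.10.5"
  )
resolvers += "paho" at "https://repo.eclipse.org/content/repositories/paho-releases"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-examples" % "1.1.1",
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.spark" %% "spark-streaming-mqtt" % "1.3.0",
  "org.apache.spark" %% "spark-streaming" % "1.3.0",
  "org.apache.spark" %% "spark-network-common" % "1.2.0",
  "org.apache.spark" %% "spark-network-shuffle" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)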