Good afternoon,
I would like to ask about an error I get when I try to add the "io.prediction" libraryDependencies. Here's my code:
name := "SBTMaret2016"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += toGroupID("org.scalatest") % "scalatest_2.10" % "2.0" % "test" (a)
libraryDependencies += toGroupID("io.prediction") %% "core" % "0.9.5" % "provided" (b)
libraryDependencies += toGroupID("org.apache.spark") %% "spark-core" % "1.3.0" % "provided" (c)
libraryDependencies += toGroupID("org.apache.spark") %% "spark-mllib" % "1.3.0" % "provided"(d)
The first time, I added (a, c, d) and it worked!
When I re-added (b), it didn't work. Any ideas? Thanks!
If you look at http://mvnrepository.com/artifact/io.prediction, there are no artifacts published for Scala 2.11. You can switch to 2.10.6 or try to get the source and compile it for 2.11. The ScalaTest dependency is also incorrect; it should be toGroupID("org.scalatest") %% "scalatest". (Plus, you can remove all the toGroupID calls; the usual way to write this is libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided".)
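For reference, a minimal build.sbt along those lines might look like this (a sketch, assuming you switch to Scala 2.10.6 as suggested; the version numbers are the ones from the question):
name := "SBTMaret2016"
version := "1.0"
// PredictionIO publishes no Scala 2.11 artifacts, so stay on 2.10.x
scalaVersion := "2.10.6"
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0" % "test"
libraryDependencies += "io.prediction" %% "core" % "0.9.5" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.0" % "provided"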
I am facing an issue when trying to import the Neo4j Spark connector into my IntelliJ project with sbt.
My build.sbt file looks like this:
name := "shortestneo4j"
version := "0.1"
scalaVersion := "2.11.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0" % "provided"
// https://mvnrepository.com/artifact/org.apache.spark/spark-graphx
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "2.4.0"
// https://mvnrepository.com/artifact/neo4j-contrib/neo4j-spark-connector
libraryDependencies += "neo4j-contrib" % "neo4j-spark-connector" % "2.4.5-M1"
I get an error on the last line. I am using Java 1.8 and Spark 2.4.0.
I have been trying all day and cannot figure out how to make it work.
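One possible cause, offered as an assumption since the error text isn't shown: the neo4j-spark-connector artifacts were historically published to the Spark Packages repository rather than Maven Central, so sbt may simply not find the artifact. A sketch of the fix under that assumption:
// Assumption: neo4j-spark-connector is hosted on the Spark Packages repository
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven/"
libraryDependencies += "neo4j-contrib" % "neo4j-spark-connector" % "2.4.5-M1"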
So I have a common library that will be my core lib for Spark.
My build.sbt file is not working:
name := "CommonLib"
version := "0.1"
scalaVersion := "2.12.5"
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")
// resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"
// resolvers += Resolver.sonatypeRepo("public")
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
"org.apache.spark" % "spark-sql_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
"org.apache.hadoop" % "hadoop-common" % "2.7.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
// "org.apache.spark" % "spark-sql_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
"org.apache.spark" % "spark-hive_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
"org.apache.spark" % "spark-yarn_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
"com.github.scopt" %% "scopt" % "3.7.0"
)
//addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
//libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
//libraryDependencies ++= {
// val sparkVer = "2.1.0"
// Seq(
// "org.apache.spark" %% "spark-core" % sparkVer % "provided" withSources()
// )
//}
The commented-out lines are all the tests I've done, and I don't know what to do anymore.
My goal is to get Spark 2.3 working and to have scopt available too.
For my sbt version, I have 1.1.1 installed.
Thank you.
I think I had two main issues.
Spark is not compatible with Scala 2.12 yet, so moving to 2.11.12 solved one issue.
The second issue is that for the IntelliJ sbt console to pick up changes to build.sbt, you either need to kill and restart the console or use the reload command, which I didn't know about, so I was not actually building with the latest build.sbt file.
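Putting both fixes together, a minimal build.sbt sketch (assuming Spark 2.3.0 to match the stated goal; treat the exact version numbers as assumptions):
name := "CommonLib"
version := "0.1"
// Spark did not yet support Scala 2.12, so stay on 2.11.x
scalaVersion := "2.11.12"
val sparkVersion = "2.3.0"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "com.github.scopt" %% "scopt" % "3.7.0"
)
After editing, remember to run reload in the sbt console (or restart it) so the changes take effect.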
There's a Giter8 template that should work nicely:
https://github.com/holdenk/sparkProjectTemplate.g8
I am working on a Spark application (Spark 2.0.0 & Scala 2.11.8), and the application works fine within the IntelliJ IDEA environment. I've packaged the application as a jar file and tried to run it from the jar, but this error is raised in the terminal:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1632)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:65)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:60)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
at Main$.main(Main.scala:26)
at Main.main(Main.scala)
I've read discussions and similar questions, but all of them talk about mismatched Scala versions; however, my sbt file is this:
name := "BaiscFM"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % "2.0.0"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.0.0"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.11" % "2.4.17"
libraryDependencies += "net.liftweb" % "lift-json_2.11" % "2.6"
libraryDependencies += "com.typesafe.play" % "play-json_2.11" % "2.4.0-M2"
libraryDependencies += "org.json" % "json" % "20090211"
libraryDependencies += "org.scalaj" % "scalaj-http_2.11" % "2.3.0"
libraryDependencies += "org.drools" % "drools-core" % "6.3.0.Final"
libraryDependencies += "org.drools" % "drools-compiler" % "6.3.0.Final"
How do I fix this problem?
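For what it's worth, scala.Predef$.$conforms only exists in the Scala 2.11 standard library, so this error typically means the jar is being executed against a Scala 2.10 runtime (for example, a Spark distribution built for Scala 2.10), even though the build file itself is consistent. Independently of that, the dependency list can be written more defensively with %% so the Scala suffix always follows scalaVersion; a sketch (same artifacts as above, with the duplicate spark-core entry removed):
name := "BaiscFM"
version := "1.0"
scalaVersion := "2.11.8"
// %% appends the _2.11 suffix automatically, keeping it in sync with scalaVersion
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.spark" %% "spark-streaming" % "2.0.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0",
  "org.apache.spark" %% "spark-graphx" % "2.0.0",
  "com.typesafe.akka" %% "akka-actor" % "2.4.17",
  "net.liftweb" %% "lift-json" % "2.6",
  "com.typesafe.play" %% "play-json" % "2.4.0-M2",
  "org.json" % "json" % "20090211",
  "org.scalaj" %% "scalaj-http" % "2.3.0",
  "org.drools" % "drools-core" % "6.3.0.Final",
  "org.drools" % "drools-compiler" % "6.3.0.Final"
)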
I get an error, just like the title says. I've already researched it and found some similar questions, but their solutions don't work for me:
NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor conflicts on Elasticsearch jar
Java elasticsearch client always null
https://github.com/elastic/elasticsearch/pull/7593
java.lang.NoSuchMethodError during Elastic search start
https://discuss.elastic.co/t/transportclient-in-2-1-x/38818/6
I'm using Scala as the programming language to create an API, and Elasticsearch as the database.
Here is my build.sbt:
name := "LearningByDoing"
version := "1.0"
scalaVersion := "2.10.5"
resolvers += "spray repo" at "http://repo.spray.io"
resolvers += "spray nightlies repo" at "http://nightlies.spray.io"
libraryDependencies += "io.spray" % "spray-json_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-client_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-testkit_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-routing_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-http_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-httpx_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-util_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.12"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "2.3.1"
libraryDependencies += "com.sksamuel.elastic4s" % "elastic4s-streams_2.10" % "2.3.1"
libraryDependencies += "org.elasticsearch" % "elasticsearch-mapper-attachments" % "2.3.1"
libraryDependencies += "com.typesafe" % "config" % "1.2.1"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.1"
Here is my plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0-M4")
addSbtPlugin("com.typesafe.sbt" % "sbt-multi-jvm" % "0.3.9")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.8.0")
In the terminal, I ran sbt clean compile test update package and everything worked normally, but when I hit the API, this error always comes up.
It seems like you have a conflicting Guava version, just like in the first link you mentioned. With an sbt dependency-graph plugin you can print the dependency tree and figure out which dependencies are clashing.
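For example, one widely used option is the sbt-dependency-graph plugin (named here as an assumption, since the original link isn't preserved; the version below is a guess appropriate for sbt 0.13). Added to project/plugins.sbt:
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")
Then sbt dependencyTree prints the resolved tree, which should show which artifacts are pulling in conflicting Guava versions.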
The issue is that the TCP client for Elasticsearch since 5.0 uses Netty 4.1, which is incompatible with Spray, which uses Netty 4. There is no workaround other than waiting for Spray to upgrade or switching to an Elasticsearch HTTP client.
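If you take the HTTP route, elastic4s ships an HTTP client module; as a sketch, assuming an elastic4s release matching your Elasticsearch cluster (the exact version below is an assumption):
libraryDependencies += "com.sksamuel.elastic4s" %% "elastic4s-http" % "5.6.0"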
Hello, I am trying to pull in spark-core, spark-streaming, twitter4j, and spark-streaming-twitter in the build.sbt file below:
name := "hello"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
libraryDependencies ++= Seq(
"org.twitter4j" % "twitter4j-core" % "3.0.3",
"org.twitter4j" % "twitter4j-stream" % "3.0.3"
)
libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.0-incubating"
I simply took these libraryDependencies from examples online, so I am not sure which versions, etc. to use.
Can someone please explain how I should fix this .sbt file? I spent a couple of hours trying to figure it out, but none of the suggestions worked. I installed Scala through Homebrew and I am on version 2.11.8.
All of my errors were about:
Modules were resolved with conflicting cross-version suffixes.
The problem is that you are mixing Scala 2.11 and 2.10 artifacts. You have:
scalaVersion := "2.11.8"
And then:
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
where the 2.10 artifact is required. You are also mixing Spark versions instead of using a consistent one:
// spark 1.6.1
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
// spark 1.4.1
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
// spark 0.9.0-incubating
libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.0-incubating"
Here is a build.sbt that fixes both problems:
name := "hello"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "1.6.1"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-streaming" % sparkVersion,
"org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)
You also don't need to manually add twitter4j dependencies since they are added transitively by spark-streaming-twitter.
It works for me:
name := "spark_local"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.twitter4j" % "twitter4j-core" % "3.0.5",
"org.twitter4j" % "twitter4j-stream" % "3.0.5",
"org.apache.spark" %% "spark-core" % "2.0.0",
"org.apache.spark" %% "spark-sql" % "2.0.0",
"org.apache.spark" %% "spark-mllib" % "2.0.0",
"org.apache.spark" %% "spark-streaming" % "2.0.0"
)