Similarly to:
Why is UNRESOLVED DEPENDENCIES error with com.typesafe.slick#slick_2.11;2.0.2: not found?
I got the following error message:
events/*:update) sbt.ResolveException: unresolved dependency: com.typesafe.slick#slick-extensions_2.11;3.1.0: not found
My build.sbt has:
lazy val events = (project in file("modules/events")).settings(commonSettings).
settings(Seq(libraryDependencies ++= Seq(
cache,
ws,
evolutions,
specs2,
"com.softwaremill.macwire" %% "macros" % "2.2.5" % "provided",
"com.softwaremill.macwire" %% "util" % "2.2.0",
"ch.qos.logback" % "logback-classic" % "1.1.8",
"de.svenkubiak" % "jBCrypt" % "0.4.1",
"org.scalatestplus.play" %% "scalatestplus-play" % "1.5.0" % "test",
"org.mockito" % "mockito-core" % "2.0.45-beta" % "test",
"mysql" % "mysql-connector-java" % "5.1.34",
"org.postgresql" % "postgresql" % "9.4.1207.jre7",
"com.vividsolutions" % "jts" % "1.13",
"com.typesafe.play" % "play-slick_2.11" % "2.0.2",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0",
"com.github.tminglei" %% "slick-pg" % "0.12.1",
"com.github.tminglei" %% "slick-pg_date2" % "0.12.1",
"com.github.tminglei" %% "slick-pg_play-json" % "0.12.1",
"com.typesafe.slick" %% "slick-extensions" % "3.1.0",
"org.scalikejdbc" %% "scalikejdbc" % "2.4.2",
"org.scalikejdbc" %% "scalikejdbc-config" % "2.4.2",
"joda-time" % "joda-time" % "2.9.4",
"com.typesafe.play" %% "play-json" % "2.5.9",
"io.circe" %% "circe-core" % circeVersion,
"io.circe" %% "circe-generic" % circeVersion,
"io.circe" %% "circe-parser" % circeVersion,
"io.circe" %% "circe-jawn" % circeVersion,
"com.github.julien-truffaut" %% "monocle-core" % monocleVersion,
"com.github.julien-truffaut" %% "monocle-macro" % monocleVersion,
"com.github.julien-truffaut" %% "monocle-law" % monocleVersion % "test",
"com.microsoft.sqlserver" % "mssql-jdbc" % "7.4.1.jre8"
)))
I am using Scala 2.11.9. I also tried adding
resolvers += "typesafe" at "http://repo.typesafe.com/typesafe/releases/"
but no luck. Any suggestions, please?
Actually, slick-extensions is not located in http://repo.typesafe.com/typesafe/releases/. If you look there, you will see that com/typesafe/slick/slick-extensions_2.11/ is empty.
But I found it here: https://typesafe.bintray.com/commercial-maven-releases/com/typesafe/slick/slick-extensions_2.11/3.1.0/
And here is some information about slick-extensions: https://index.scala-lang.org/slick/slick/slick-extensions/3.1.0.
They recommend using:
libraryDependencies += "com.typesafe.slick" %% "slick-extensions" % "3.1.0"
resolvers += Resolver.bintrayRepo("typesafe", "commercial-maven-releases")
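Putting the two lines together, a minimal build.sbt fragment might look like the sketch below (resolver and dependency only; the rest of the build is unchanged):

```scala
// Sketch: the commercial Bintray repo plus the slick-extensions dependency
resolvers += Resolver.bintrayRepo("typesafe", "commercial-maven-releases")

libraryDependencies += "com.typesafe.slick" %% "slick-extensions" % "3.1.0"
```

At the time of this answer, running sbt update after adding the resolver would pull slick-extensions_2.11;3.1.0 from that repository.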
Related
I am trying to run the code in IntelliJ but am getting the error below: Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;). Please help me find the issue. My Scala version is 2.11.12 and Spark is 2.4.4.
Metorikku$: Starting Metorikku - Parsing configuration
ConfigurationParser$: Starting Metorikku - Parsing configuration
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:810)
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:644)
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
at org.apache.spark.SparkConf.set(SparkConf.scala:95)
at org.apache.spark.SparkConf.$anonfun$loadFromSystemProperties$3(SparkConf.scala:77)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:927)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at com.yotpo.metorikku.utils.FileUtils$.getHadoopPath(FileUtils.scala:63)
at com.yotpo.metorikku.utils.FileUtils$.readFileWithHadoop(FileUtils.scala:72)
at com.yotpo.metorikku.utils.FileUtils$.readConfigurationFile(FileUtils.scala:56)
at com.yotpo.metorikku.configuration.job.ConfigurationParser$.parse(ConfigurationParser.scala:34)
at com.yotpo.metorikku.Metorikku$.delayedEndpoint$com$yotpo$metorikku$Metorikku$1(Metorikku.scala:12)
at com.yotpo.metorikku.Metorikku$delayedInit$body.apply(Metorikku.scala:9)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at com.yotpo.metorikku.Metorikku$.main(Metorikku.scala:9)
at com.yotpo.metorikku.Metorikku.main(Metorikku.scala)
build.sbt file
scalaVersion := "2.11.12"
val sparkVersion = Option(System.getProperty("sparkVersion")).getOrElse("2.4.5")
val jacksonVersion = "2.9.9"
lazy val excludeJpountz = ExclusionRule(organization = "net.jpountz.lz4", name = "lz4")
lazy val excludeNetty = ExclusionRule(organization = "io.netty", name = "netty")
lazy val excludeNettyAll = ExclusionRule(organization = "io.netty", name = "netty-all")
lazy val excludeAvro = ExclusionRule(organization = "org.apache.avro", name = "avro")
lazy val excludeSpark = ExclusionRule(organization = "org.apache.spark")
lazy val excludeFasterXML = ExclusionRule(organization = "com.fasterxml.jackson.module", name= "jackson-module-scala_2.12")
lazy val excludeMetricsCore = ExclusionRule(organization = "io.dropwizard.metrics", name= "metrics-core")
lazy val excludeLog4j = ExclusionRule(organization = "org.apache.logging.log4j")
lazy val excludeParquet = ExclusionRule(organization = "org.apache.parquet")
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-mllib" % sparkVersion % "provided",
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion % "provided" excludeAll(excludeJpountz),
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"org.apache.spark" %% "spark-avro" % sparkVersion % "provided",
"com.datastax.spark" %% "spark-cassandra-connector" % "2.4.2",
"com.holdenkarau" %% "spark-testing-base" % "2.4.3_0.12.0" % "test",
"com.github.scopt" %% "scopt" % "3.6.0",
"RedisLabs" % "spark-redis" % "0.3.2",
"org.json4s" %% "json4s-native" % "3.5.2",
"io.netty" % "netty-all" % "4.1.32.Final",
"io.netty" % "netty" % "3.10.6.Final",
"com.google.guava" % "guava" % "16.0.1",
"com.typesafe.play" %% "play-json" % "2.6.2",
"com.databricks" %% "spark-redshift" % "3.0.0-preview1" excludeAll excludeAvro,
"com.amazon.redshift" % "redshift-jdbc42" % "1.2.1.1001",
"com.segment.analytics.java" % "analytics" % "2.0.0",
"org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.6",
"org.scala-lang" % "scala-compiler" % "2.11.12",
"com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion,
"com.fasterxml.jackson.dataformat" % "jackson-dataformat-cbor" % jacksonVersion,
"com.fasterxml.jackson.core" % "jackson-core" % jacksonVersion,
"com.fasterxml.jackson.core" % "jackson-annotations" % jacksonVersion,
"com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion,
"com.fasterxml.jackson.dataformat" % "jackson-dataformat-yaml" % jacksonVersion,
"com.groupon.dse" % "spark-metrics" % "2.0.0" excludeAll excludeMetricsCore,
"org.apache.commons" % "commons-text" % "1.6",
"org.influxdb" % "influxdb-java" % "2.14",
"org.apache.kafka" %% "kafka" % "2.2.0" % "provided",
"za.co.absa" % "abris_2.11" % "3.1.1" % "provided" excludeAll(excludeAvro, excludeSpark),
"org.apache.hudi" %% "hudi-spark-bundle" % "0.5.2-incubating" "provided" excludeAll excludeFasterXML,
"org.apache.parquet" % "parquet-avro" % "1.10.1" % "provided",
"org.apache.avro" % "avro" % "1.8.2" % "provided",
"org.apache.hive" % "hive-jdbc" % "2.3.3" % "provided" excludeAll(excludeNetty, excludeNettyAll, excludeLog4j, excludeParquet),
"org.apache.hadoop" % "hadoop-aws" % "2.7.3" % "provided"
)
I am trying to run a sample TestNG test in a Scala sbt project. I am using the dependencies below.
"org.apache.spark" %% "spark-core" % sparkVer % Provided,
"org.apache.hadoop" % "hadoop-common" % sparkVer % Provided,
"org.apache.spark" % "spark-sql_2.11" % sparkVer % Provided,
"org.apache.spark" % "spark-hive_2.11" % sparkVer,
"org.scalactic" %% "scalactic" % scalatestVer,
"org.scalatest" %% "scalatest" % scalatestVer % Test,
"info.cukes" % "cucumber-scala_2.11" % cucumberVer,
"info.cukes" % "cucumber-junit" % cucumberVer,
"junit" % "junit" % "4.12",
"org.scalatest" % "scalatest_2.11" % "2.0" % "test",
"org.scalactic" %% "scalactic" % "3.0.1",
"org.scalatest" %% "scalatest" % "3.0.1" % "test",
[Image: the ExampleSuite class; as you can see, I am not able to use it correctly]
Here is the link I followed for this case: http://www.scalatest.org/getting_started_with_testng_in_scala
I am also on sbt.version = 0.13.16.
Any help is really appreciated.
Add the dependency "org.testng" % "testng" % "6.14.3" % Test, and install the "Create TestNG XML" plugin from the marketplace; that resolves the issue.
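As a hedged sketch, the test-scoped dependency block might then look like this (versions taken from the question and answer above; adjust to your setup):

```scala
// Hypothetical minimal test dependencies for running TestNG through ScalaTest
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.0.1" % Test,
  "org.testng" % "testng" % "6.14.3" % Test
)
```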
How can I list all file names of parquet files in the S3 directory in Amazon?
I found this approach:
val s3 = AmazonS3ClientBuilder.standard.build()
var objs = s3.listObjects("bucketname", "directory")
val summaries = objs.getObjectSummaries()
while (objs.isTruncated()) {
  objs = s3.listNextBatchOfObjects(objs)
  summaries.addAll(objs.getObjectSummaries())
}
val listOfFiles = summaries.toArray
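The listing above returns every object under the prefix, not just parquet files. Since the question asks specifically for parquet file names, a small filtering step over the collected keys would help; the sample keys below are hypothetical stand-ins for what the S3 object summaries would return:

```scala
object ListParquetKeys {
  // Keep only keys that end in ".parquet" (drops markers like _SUCCESS).
  def parquetKeys(keys: Seq[String]): Seq[String] =
    keys.filter(_.endsWith(".parquet"))

  def main(args: Array[String]): Unit = {
    // Hypothetical keys; in practice these come from the object summaries.
    val keys = Seq(
      "directory/part-00000.parquet",
      "directory/_SUCCESS",
      "directory/part-00001.parquet"
    )
    parquetKeys(keys).foreach(println)
  }
}
```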
But it throws the error:
java.lang.NoSuchMethodError: org.apache.http.conn.ssl.SSLConnectionSocketFactory
I added the dependency for httpclient 4.5.2 as indicated in many answers, but I still get the same error.
Also I did:
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion exclude("commons-httpclient", "commons-httpclient"),
"org.apache.spark" %% "spark-mllib" % sparkVersion exclude("commons-httpclient", "commons-httpclient"),
"org.sedis" %% "sedis" % "1.2.2",
"org.scalactic" %% "scalactic" % "3.0.0",
"org.scalatest" %% "scalatest" % "3.0.0" % "test",
"com.github.nscala-time" %% "nscala-time" % "2.14.0",
"com.amazonaws" % "aws-java-sdk-s3" % "1.11.53",
"org.apache.httpcomponents" % "httpclient" % "4.5.2",
"net.java.dev.jets3t" % "jets3t" % "0.9.3",
"org.apache.hadoop" % "hadoop-aws" % "2.6.0",
"com.github.scopt" %% "scopt" % "3.3.0"
)
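A NoSuchMethodError on an httpclient class usually means an older httpclient was evicted onto the classpath by a transitive dependency, so just declaring 4.5.2 may not be enough. One hedged option, assuming sbt 0.13-style syntax, is to force the version across all transitive dependencies:

```scala
// Force a single httpclient version regardless of what transitive deps request
dependencyOverrides += "org.apache.httpcomponents" % "httpclient" % "4.5.2"
```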
The following is a section of the build.sbt file that I recently added to my project to build an API for the application.
libraryDependencies ++= Seq(
"com.chuusai" %% "shapeless" % "2.3.1"
)
libraryDependencies ++= {
val sprayVersion = "1.3.1"
val akkaVersion = "2.3.4"
Seq(
"io.spray" % "spray-can" % sprayVersion,
"io.spray" % "spray-routing" % sprayVersion,
"io.spray" % "spray-testkit" % sprayVersion,
"io.spray" % "spray-client" % sprayVersion,
"io.spray" %% "spray-json" % "1.3.1",
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test",
"ch.qos.logback" % "logback-classic" % "1.0.12",
"org.scalatest" %% "scalatest" % "3.0.4" % "test"
)
}
However, when importing the libraries, the following errors are generated.
[error] (*:update) Conflicting cross-version suffixes in: com.chuusai:shapeless, com.typesafe.akka:akka-actor, com.typesafe.akka:akka-testkit
[error] (*:ssExtractDependencies) Conflicting cross-version suffixes in: com.chuusai:shapeless, com.typesafe.akka:akka-actor, com.typesafe.akka:akka-testkit
A suggestion for suitable, compatible libraries would be highly appreciated. I'm using Spark 2.2.0 and Scala 2.11.11.
The combination should be:
libraryDependencies ++= Seq(
"com.chuusai" %% "shapeless" % "2.3.1"
)
libraryDependencies ++= {
val sprayVersion = "1.3.4"
val akkaVersion = "2.5.4"
Seq(
"io.spray" %% "spray-can" % sprayVersion,
"io.spray" %% "spray-routing" % sprayVersion,
"io.spray" %% "spray-testkit" % sprayVersion,
"io.spray" %% "spray-client" % sprayVersion,
"io.spray" %% "spray-json" % "1.3.3",
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test",
"ch.qos.logback" % "logback-classic" % "1.0.12",
"org.scalatest" %% "scalatest" % "3.2.0-SNAP7" % "test"
)
}
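For context on the error itself: the "conflicting cross-version suffixes" message appears when artifacts built for different Scala binary versions (e.g. _2.10 and _2.11) end up on the same classpath. With %%, sbt appends the binary suffix for the configured scalaVersion automatically, so with Scala 2.11.11 the two lines below resolve to the same artifact:

```scala
// With scalaVersion := "2.11.11", these two declarations are equivalent:
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.4"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.11" % "2.5.4"
```

Using %% consistently (as the corrected combination above does for the spray modules) keeps all suffixes aligned with the project's Scala version.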
I am new to the Play framework. I am trying to generate Q classes in a Play framework project. When I compile the program using Activator by running the compile command, it does not generate the Q classes.
My build.sbt content is
import codetroopers.QueryDSLPlugin
import play.PlayJava
name := """trainers"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava, QueryDSLPlugin)
scalaVersion := "2.11.1"
libraryDependencies ++= Seq(
cache,
javaWs,
filters,
javaJdbc,
javaJpa.exclude("org.hibernate.javax.persistence", "hibernate-jpa-2.0-api"),
"org.hibernate" % "hibernate-entitymanager" % "4.3.6.Final",
"org.hibernate" % "hibernate-core" % "4.3.6.Final",
"org.hibernate" % "hibernate-validator" % "5.2.0.Beta1",
"javax.el" % "javax.el-api" % "2.2.4",
"commons-io" % "commons-io" % "2.3",
"com.google.inject" % "guice" % "3.0",
"com.google.inject.extensions" % "guice-multibindings" % "3.0",
"org.postgresql" % "postgresql" % "9.3-1101-jdbc41",
"com.mysema.querydsl" % "querydsl-apt" % "3.6.3",
"com.mysema.querydsl" % "querydsl-jpa" % "3.6.3",
"com.typesafe.play.plugins" %% "play-plugins-mailer" % "2.3.1",
"org.apache.commons" % "commons-collections4" % "4.0",
"org.mindrot" % "jbcrypt" % "0.3m",
"com.typesafe.play" %% "play-mailer" % "2.4.0",
"redis.clients" % "jedis" % "2.7.0",
"com.restfb" % "restfb" % "1.14.0",
"org.json" % "org.json" % "chargebee-1.0",
"com.fasterxml.jackson.core" % "jackson-core" % "2.5.3",
"net.greghaines" % "jesque" % "2.0.2",
"org.apache.velocity" % "velocity" % "1.7",
"oro" % "oro" % "2.0.8",
"org.liquibase" % "liquibase-core" % "3.2.2",
"com.twilio.sdk" % "twilio-java-sdk" % "4.4.1",
"com.googlecode.libphonenumber" % "libphonenumber" % "7.0.7",
"com.google.code.gson" % "gson" % "2.2.2",
"com.paypal.sdk" % "rest-api-sdk" % "1.2.5",
"com.paypal.sdk" % "paypal-core" % "1.6.4",
"com.paypal.sdk" % "merchantsdk" % "2.13.117"
)
javaOptions in Test += "-Dconfig.file=test/conf/application.test.conf"
queryDSLVersion := "3.6.3"
My plugin.sbt:
resolvers += "Typesafe repository" at "https://repo.typesafe.com/typesafe/releases/"
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.8")
// web plugins
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.0.0")
addSbtPlugin("com.code-troopers.play" % "play-querydsl" % "0.1.2")
Do I need to run any specific command to generate the classes? My Scala version is 2.11.7. My Activator version is 1.3.6.