Scala integration tests: "No such setting/task"

Integration test configuration is new to me.
I cannot get my ScalaTest integration tests to run (from sbt or IntelliJ).
My unit tests in src/test/scala run fine.
My integration tests are in src/it/scala.
If I run with sbt it:test, the error is "No such setting/task".
If I run in IntelliJ (i.e., with the 'run' button), I get:
Unable to load a Suite class. This could be due to an error in your runpath. Missing class: xxx.tools.es_ingester.EsIntegrationSpec
java.lang.ClassNotFoundException: xxx.tools.es_ingester.ConfluenceEsIntegrationSpec
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
The class, however, is clearly in /src/it/scala/xxx/tools/es_ingester.
Update: build.sbt
name := "xxx.tools.data_extractor"
version := "0.1"
organization := "xxx.tools"
scalaVersion := "2.11.12"
sbtVersion := "1.2.7"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.5"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.5"
libraryDependencies += "ch.qos.logback" % "logback-core" % "1.2.3"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.25"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.11.2"
libraryDependencies += "commons-io" % "commons-io" % "2.6"
libraryDependencies += "org.bouncycastle" % "bcprov-jdk15on" % "1.61"
libraryDependencies += "org.mockito" % "mockito-core" % "2.24.0" % Test
libraryDependencies += "com.typesafe" % "config" % "1.3.3"
libraryDependencies += "com.typesafe.play" %% "play" % "2.7.0"
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-client" % "6.6.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "6.6.0"
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-high-level-client" % "6.6.0"
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.3"

You have not added the configuration for integration tests, for example scoping the scalatest dependency to the integration test configuration and enabling the default IntegrationTest settings:
"org.scalatest" %% "scalatest" % "3.0.5" % "it,test"
For the full set of integration test settings, refer to the sbt documentation; a minimal sketch is shown below.
I hope it will help.
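A minimal sketch, assuming an sbt 1.x single-module build like the one above, of enabling the predefined IntegrationTest configuration in build.sbt:

// Enable the predefined IntegrationTest configuration so that sources under
// src/it/scala are compiled and the it:test task exists.
lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(
    Defaults.itSettings,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "it,test"
  )

With this in place, sbt it:test (written IntegrationTest/test in newer sbt versions) should pick up suites under src/it/scala.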

Related

Unable to write files after Scala and Spark upgrade

My project was previously using Scala 2.11.12, which I have upgraded to 2.12.10, and Spark has been upgraded from 2.4.0 to 3.1.2. See the build.sbt below for the rest of the project dependencies and versions:
scalaVersion := "2.12.10"
val sparkVersion = "3.1.2"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
libraryDependencies += "org.xerial.snappy" % "snappy-java" % "1.1.4"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.8"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % "test, it"
libraryDependencies += "com.holdenkarau" %% "spark-testing-base" % "3.1.2_1.1.0" % "test, it"
libraryDependencies += "com.github.pureconfig" %% "pureconfig" % "0.12.1"
libraryDependencies += "com.typesafe" % "config" % "1.3.2"
libraryDependencies += "org.pegdown" % "pegdown" % "1.1.0" % "test, it"
libraryDependencies += "com.github.scopt" %% "scopt" % "3.7.1"
libraryDependencies += "com.github.pathikrit" %% "better-files" % "3.8.0"
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
libraryDependencies += "com.amazon.deequ" % "deequ" % "2.0.0-spark-3.1" excludeAll (
ExclusionRule(organization = "org.apache.spark")
)
libraryDependencies += "net.liftweb" %% "lift-json" % "3.4.0"
libraryDependencies += "com.crealytics" %% "spark-excel" % "0.13.1"
The app builds fine after the upgrade, but it is unable to write files to the filesystem, which was working fine before the upgrade. I haven't made any code changes to the write logic.
The relevant portion of the code that writes the files is shown below.
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.io.IOUtils

// Copy a classpath resource out to the target filesystem.
val inputStream = getClass.getResourceAsStream(resourcePath)
val conf = spark.sparkContext.hadoopConfiguration
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
val output = fs.create(new Path(outputPath))
IOUtils.copyBytes(inputStream, output.getWrappedStream, conf, true)
I am wondering if IOUtils is not compatible with the new Scala/Spark versions?
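One way to narrow this down (a hypothetical diagnostic, not part of the original post) is to print which jars actually provide FileSystem and IOUtils at runtime, since the Spark 3.1.2 upgrade may also have changed the Hadoop artifacts on the classpath:

import org.apache.hadoop.fs.FileSystem
import org.apache.hadoop.io.IOUtils

// Print the jar each Hadoop class was loaded from, to rule out a version
// mismatch introduced by the upgrade.
def jarOf(c: Class[_]): String =
  Option(c.getProtectionDomain.getCodeSource)
    .map(_.getLocation.toString)
    .getOrElse("(unknown location)")

println(s"IOUtils loaded from:    ${jarOf(classOf[IOUtils])}")
println(s"FileSystem loaded from: ${jarOf(classOf[FileSystem])}")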

Unable to import dependency hadoop-core to build.sbt in IntelliJ

My build.sbt looks like this:
name := "Kafak"
version := "1.0"
scalaVersion := "2.12.2"
libraryDependencies += "com.google.code.gson" % "gson" % "2.8.1"
libraryDependencies += "org.apache.kafka" % "kafka-clients" %
"0.10.2.1"
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.6.1"
// https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-
client
libraryDependencies += "org.apache.hadoop" % "hadoop-client" %
"3.0.0-alpha3"
// https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-
core
libraryDependencies += "org.apache.hadoop" % "hadoop-core" %
"2.6.0-mr1-cdh5.9.0"
I am getting the following error while importing it:
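The error text itself is not quoted above, but if the failure is an unresolved dependency on the CDH-versioned hadoop-core artifact (2.6.0-mr1-cdh5.9.0), one thing worth checking (an assumption, not something the question confirms) is whether the Cloudera repository is on the resolver list, since CDH artifacts are not published to Maven Central:

// Assumption: the unresolved artifact is the CDH build of hadoop-core, which is
// hosted in Cloudera's repository rather than Maven Central.
resolvers += "cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/"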

Error when running jar: Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

I am working on a Spark application (Spark 2.0.0 and Scala 2.11.8), and the application works fine within the IntelliJ IDEA environment. I've packaged the application as a jar file and tried to run it from the jar, but this error is raised in the terminal:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1632)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:65)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:60)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
at Main$.main(Main.scala:26)
at Main.main(Main.scala)
I've read discussions and similar questions, but all of them talk about mismatched Scala versions; however, my sbt file is this:
name := "BaiscFM"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % "2.0.0"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.0.0"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.11" % "2.4.17"
libraryDependencies += "net.liftweb" % "lift-json_2.11" % "2.6"
libraryDependencies += "com.typesafe.play" % "play-json_2.11" % "2.4.0-M2"
libraryDependencies += "org.json" % "json" % "20090211"
libraryDependencies += "org.scalaj" % "scalaj-http_2.11" % "2.3.0"
libraryDependencies += "org.drools" % "drools-core" % "6.3.0.Final"
libraryDependencies += "org.drools" % "drools-compiler" % "6.3.0.Final"
How to fix this problem?
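This particular NoSuchMethodError is typically a symptom of a Scala binary version mismatch between the compiled jar and the Spark runtime that launches it. As a hypothetical check (not from the original post), printing the Scala version that the running jar actually sees can confirm or rule that out:

// Hypothetical diagnostic: print the Scala library version on the runtime
// classpath. If it is not 2.11.x, the Spark installation used to launch the
// jar was built against a different Scala binary version than the jar itself.
println("Scala library version: " + scala.util.Properties.versionNumberString)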

Error : java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;

I get an error, just like the title says. I have already done some research and found some similar issues, but none of them work for me.
NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor conflits on Elastic Search jar
Java elasticsearch client always null
https://github.com/elastic/elasticsearch/pull/7593
java.lang.NoSuchMethodError during Elastic search start
https://discuss.elastic.co/t/transportclient-in-2-1-x/38818/6
I'm using Scala as the programming language to create an API, and Elasticsearch as the database.
Here is my build.sbt:
name := "LearningByDoing"
version := "1.0"
scalaVersion := "2.10.5"
resolvers += "spray repo" at "http://repo.spray.io"
resolvers += "spray nightlies repo" at "http://nightlies.spray.io"
libraryDependencies += "io.spray" % "spray-json_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-client_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-testkit_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-routing_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-http_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-httpx_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-util_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.12"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "2.3.1"
libraryDependencies += "com.sksamuel.elastic4s" % "elastic4s-streams_2.10" % "2.3.1"
libraryDependencies += "org.elasticsearch" % "elasticsearch-mapper-attachments" % "2.3.1"
libraryDependencies += "com.typesafe" % "config" % "1.2.1"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.1"
Here is my plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0-M4")
addSbtPlugin("com.typesafe.sbt" % "sbt-multi-jvm" % "0.3.9")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.8.0")
At the terminal I ran sbt clean compile test update package and everything works normally, but when I hit the API, that error always comes up.
It seems like you have a conflicting Guava version, just like the first link you mentioned. With this sbt plugin you can see the dependency tree and figure out the conflicting dependencies; a sketch follows below.
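A sketch, assuming the plugin referred to is sbt-dependency-graph (the answer does not name it explicitly), of inspecting the tree for conflicting Guava versions:

// project/plugins.sbt: assumed plugin choice; pick a version compatible with your sbt release
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

// Then, from the sbt shell, print the tree and look for multiple Guava versions:
//   dependencyTree
// Once the offending version is identified, a single Guava version can be forced, e.g.:
//   dependencyOverrides += "com.google.guava" % "guava" % "18.0"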
The issue is that the TCP client for Elasticsearch since 5.0 uses Netty 4.1, which is incompatible with Spray, which uses Netty 4. There is no workaround other than waiting for Spray to upgrade or switching to an Elasticsearch HTTP client.

scala sbt console to include logback

I am trying to use Logback with SLF4J in my Scala application. I have included the following in build.sbt:
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.5"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3"
libraryDependencies += "ch.qos.logback" % "logback-core" % "1.1.3"
I tried to invoke sbt console -Dlogback.configurationFile=logback.xml -DLOG_DIR=.
But I am seeing multiple bindings (log4j and logback) on the sbt classpath:
SLF4J: Found binding in [jar:file:~/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:~/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
I am not sure how the log4j jar was pulled into .ivy2.
build.sbt
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.2.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.1"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.2.1"
libraryDependencies += "com.google.protobuf" % "protobuf-java" % "2.5.0"
libraryDependencies += "spark.jobserver" % "job-server-api" % "0.5.0" % "provided"
libraryDependencies += "net.liftweb" %% "lift-json" % "2.5.3"
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "2.0.0"
libraryDependencies += "net.sf.opencsv" % "opencsv" % "2.3"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.5"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3"
libraryDependencies += "ch.qos.logback" % "logback-core" % "1.1.3"
Questions:
1) Are any of the dependency jars in my build.sbt downloading log4j?
2) How do I make sure that SLF4J binds only with Logback in the sbt console?
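A minimal sketch of one way to do that, assuming the slf4j-log4j12 binding is pulled in transitively by the Spark artifacts (worth confirming with a dependency report first):

// Exclude the log4j SLF4J binding from the dependencies assumed to be pulling
// it in, so that logback-classic is the only binding left on the classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" exclude("org.slf4j", "slf4j-log4j12")
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.2.1" exclude("org.slf4j", "slf4j-log4j12")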