scala sbt console to include logback

I am trying to use logback with slf4j in my Scala application. I have included the following in build.sbt:
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.5"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3"
libraryDependencies += "ch.qos.logback" % "logback-core" % "1.1.3"
I tried to invoke sbt console -Dlogback.configurationFile=logback.xml -DLOG_DIR=.
But SLF4J reports multiple bindings (log4j and logback) on the sbt classpath:
SLF4J: Found binding in [jar:file:~/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:~/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
I am not sure how the log4j binding jar got pulled into the .ivy2 cache.
build.sbt
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.2.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.1"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.2.1"
libraryDependencies += "com.google.protobuf" % "protobuf-java" % "2.5.0"
libraryDependencies += "spark.jobserver" % "job-server-api" % "0.5.0" % "provided"
libraryDependencies += "net.liftweb" %% "lift-json" % "2.5.3"
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "2.0.0"
libraryDependencies += "net.sf.opencsv" % "opencsv" % "2.3"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.5"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3"
libraryDependencies += "ch.qos.logback" % "logback-core" % "1.1.3"
Questions:
1) Which of the dependency jars in my build.sbt is pulling in log4j?
2) How do I make sure that slf4j binds to logback only in sbt console?
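Neither slf4j-api nor the logback artifacts depend on slf4j-log4j12, so the binding almost certainly arrives transitively; with this build the likely source is Spark, whose 1.x artifacts depend on the slf4j-log4j12 binding. A minimal sketch of excluding it, assuming Spark is indeed the culprit (sbt 0.13-era syntax to match the build above):
// Keep Spark but drop its log4j binding so logback's StaticLoggerBinder
// is the only SLF4J binding left on the classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" exclude("org.slf4j", "slf4j-log4j12")
// Alternatively (sbt 0.13.8+), blanket-exclude the binding from every dependency:
excludeDependencies += SbtExclusionRule("org.slf4j", "slf4j-log4j12")
Since spark-streaming and the Cassandra connector also pull in spark-core, the blanket form is the more reliable of the two.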

Related

Unable to write files after scala and spark upgrade

My project was previously using Scala 2.11.12, which I have upgraded to 2.12.10, and the Spark version has been upgraded from 2.4.0 to 3.1.2. See the build.sbt file below for the rest of the project dependencies and versions:
scalaVersion := "2.12.10"
val sparkVersion = "3.1.2"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
libraryDependencies += "org.xerial.snappy" % "snappy-java" % "1.1.4"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.8"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % "test, it"
libraryDependencies += "com.holdenkarau" %% "spark-testing-base" % "3.1.2_1.1.0" % "test, it"
libraryDependencies += "com.github.pureconfig" %% "pureconfig" % "0.12.1"
libraryDependencies += "com.typesafe" % "config" % "1.3.2"
libraryDependencies += "org.pegdown" % "pegdown" % "1.1.0" % "test, it"
libraryDependencies += "com.github.scopt" %% "scopt" % "3.7.1"
libraryDependencies += "com.github.pathikrit" %% "better-files" % "3.8.0"
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
libraryDependencies += "com.amazon.deequ" % "deequ" % "2.0.0-spark-3.1" excludeAll (
ExclusionRule(organization = "org.apache.spark")
)
libraryDependencies += "net.liftweb" %% "lift-json" % "3.4.0"
libraryDependencies += "com.crealytics" %% "spark-excel" % "0.13.1"
The app builds fine after the upgrade, but it is unable to write files to the filesystem, which worked fine before the upgrade. I haven't made any code changes to the write logic.
The relevant portion of code that writes to the files is shown below.
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.io.IOUtils

// Copy a bundled classpath resource out to the target Hadoop filesystem.
val inputStream = getClass.getResourceAsStream(resourcePath)
val conf = spark.sparkContext.hadoopConfiguration
val fs = FileSystem.get(conf)
val output = fs.create(new Path(outputPath))
// The final `true` closes both streams once the copy completes.
IOUtils.copyBytes(inputStream, output.getWrappedStream, conf, true)
I am wondering whether IOUtils is incompatible with the new Scala/Spark versions?
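No answer is recorded for this one, but as a first diagnostic step (my suggestion, not something from the thread) it is worth confirming that the resource still resolves after the upgrade: getResourceAsStream returns null rather than throwing when the path is wrong, and a null input stream can make the copy look like a failed write.
// Hypothetical diagnostic: fail fast if the classpath resource is missing,
// before blaming the Hadoop write itself.
val in = Option(getClass.getResourceAsStream(resourcePath))
  .getOrElse(sys.error(s"resource not found on classpath: $resourcePath"))
val conf = spark.sparkContext.hadoopConfiguration
val fs = FileSystem.get(conf)
val out = fs.create(new Path(outputPath))
// copyBytes accepts any OutputStream, so the wrapped stream is not required;
// the final `true` closes both streams when the copy finishes.
IOUtils.copyBytes(in, out, conf, true)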

scala integration tests "No such setting/task"

Integration test configuration is new to me.
I cannot get my ScalaTest integration tests to run (in sbt or IntelliJ).
My unit tests in src/test/scala run fine.
My integration tests are in src/it/scala.
If I run sbt it:test, the error is "No such setting/task".
If I run in IntelliJ (i.e., with the 'run' button), I get:
Unable to load a Suite class. This could be due to an error in your runpath. Missing class: xxx.tools.es_ingester.EsIntegrationSpec
java.lang.ClassNotFoundException: xxx.tools.es_ingester.ConfluenceEsIntegrationSpec
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
The class, however, is clearly in /src/it/scala/xxx/tools/es_ingester.
Update: build.sbt
name := "xxx.tools.data_extractor"
version := "0.1"
organization := "xxx.tools"
scalaVersion := "2.11.12"
sbtVersion := "1.2.7"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.5"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.5"
libraryDependencies += "ch.qos.logback" % "logback-core" % "1.2.3"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.25"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.11.2"
libraryDependencies += "commons-io" % "commons-io" % "2.6"
libraryDependencies += "org.bouncycastle" % "bcprov-jdk15on" % "1.61"
libraryDependencies += "org.mockito" % "mockito-core" % "2.24.0" % Test
libraryDependencies += "com.typesafe" % "config" % "1.3.3"
libraryDependencies += "com.typesafe.play" %% "play" % "2.7.0"
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-client" % "6.6.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "6.6.0"
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-high-level-client" % "6.6.0"
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.3"
You have not added the configuration for integration tests.
For example, scalatest needs to be scoped to the it configuration as well as test:
"org.scalatest" %% "scalatest" % "3.0.5" % "it,test"
For the full set of integration-test settings, refer to the sbt documentation on integration testing.
I hope it will help.
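To make that concrete, here is a minimal sketch of the missing wiring (sbt 1.x syntax, matching the sbtVersion above); the predefined IntegrationTest configuration must be added to the project and its default settings loaded, otherwise it:test simply does not exist as a task:
// build.sbt: enable the predefined IntegrationTest config so that
// sources under src/it/scala compile and `it:test` becomes a real task.
lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(
    Defaults.itSettings,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "it,test"
  )
After reload (and an sbt re-import in IntelliJ), sbt it:test should discover the suites under src/it/scala.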

Error : java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;

I get the error in the title. I have already done some research and found some similar questions, but their solutions are not working for me:
NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor conflits on Elastic Search jar
Java elasticsearch client always null
https://github.com/elastic/elasticsearch/pull/7593
java.lang.NoSuchMethodError during Elastic search start
https://discuss.elastic.co/t/transportclient-in-2-1-x/38818/6
I'm using Scala as the programming language to create an API, and Elasticsearch as the database.
Here is my build.sbt:
name := "LearningByDoing"
version := "1.0"
scalaVersion := "2.10.5"
resolvers += "spray repo" at "http://repo.spray.io"
resolvers += "spray nightlies repo" at "http://nightlies.spray.io"
libraryDependencies += "io.spray" % "spray-json_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-client_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-testkit_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-routing_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-http_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-httpx_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-util_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.12"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "2.3.1"
libraryDependencies += "com.sksamuel.elastic4s" % "elastic4s-streams_2.10" % "2.3.1"
libraryDependencies += "org.elasticsearch" % "elasticsearch-mapper-attachments" % "2.3.1"
libraryDependencies += "com.typesafe" % "config" % "1.2.1"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.1"
Here is my plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0-M4")
addSbtPlugin("com.typesafe.sbt" % "sbt-multi-jvm" % "0.3.9")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.8.0")
At the terminal I ran sbt clean compile test update package and everything worked normally, but when I hit the API the error above always comes up.
It seems like you have a conflicting Guava version, just like in the first link you mentioned. A dependency-graph sbt plugin will let you print the dependency tree and figure out which dependencies are clashing; a sketch follows.
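The plugin link did not survive here; sbt-dependency-graph is one commonly used option (my suggestion, not necessarily the plugin the answerer meant):
// project/plugins.sbt -- sbt-dependency-graph (0.8.2 targets sbt 0.13)
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")
// then, from the sbt shell:
//   dependencyTree                              -- print the full dependency tree
//   whatDependsOn com.google.guava guava 18.0   -- trace who pulls in that Guava (18.0 is illustrative)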
The issue is that the Elasticsearch TCP client since 5.0 uses Netty 4.1, which is incompatible with Spray, which uses Netty 4. There is no workaround other than waiting for Spray to upgrade or switching to an Elasticsearch HTTP client.

Why does IntelliJ IDEA debugger jump to wrong Scala version library?

I am using IntelliJ IDEA 13.1.2 with the Scala plugin version 0.36.431 on Windows 7 with sbt 0.13.1.
The following project definition build.sbt has no references to any Scala version other than 2.9.3.
import sbt._
import Keys._
import AssemblyKeys._
import NativePackagerKeys._
name := "simplews"
version := "0.1.0-SNAPSHOT"
val sparkVersion = "0.8.1-incubating"
scalaVersion := "2.9.3"
val akkaVersion = "2.0.5"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.spark" % "spark-examples_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.spark" % "spark-tools_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.scalatest" % "scalatest_2.9.3" % "1.9.2" % "test" withSources(),
"org.apache.spark" % "spark-repl_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.kafka" % "kafka" % "0.7.2-spark",
"com.thenewmotion.akka" % "akka-rabbitmq_2.9.2" % "0.0.2" % "compile->default" withSources(),
"com.typesafe.akka" % "akka-actor" % akkaVersion % "compile->default" withSources(),
"com.typesafe.akka" % "akka-testkit" % akkaVersion % "compile->default" withSources(),
"com.rabbitmq" % "amqp-client" % "3.0.1" % "compile->default" withSources(),
"org.specs2" % "specs2_2.9.3" % "1.12.4.1" % "compile->default" withSources(),
"com.nebhale.jsonpath" % "jsonpath" % "1.2" % "compile->default" withSources(),
"org.mockito" % "mockito-all" % "1.8.5",
"junit" % "junit" % "4.11"
)
packagerSettings
packageArchetype.java_application
resolvers ++= Seq(
"Apache repo" at "https://repository.apache.org/content/repositories/releases",
"Cloudera repo" at "https://repository.cloudera.com/artifactory/repo/org/apache/kafka/kafka/0.7.2-spark/",
"akka rabbitmq" at "http://nexus.thenewmotion.com/content/repositories/releases-public",
"Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
Resolver.mavenLocal
)
However, as seen in the screenshot, the debugger has jumped to Scala 2.10.2. Note: the debugger correctly goes to 2.9.3 in some other debugging sessions.
Here is project/plugins.sbt:
resolvers += "sbt-plugins" at "http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-RC2")
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
EDIT: In order to reproduce, it is necessary to do a local mvn install of one or two libraries that are not available in any public repo.
mvn org.apache.maven.plugins:maven-install-plugin:2.5.1:install-file -Dfile=c:\shared\kafka-0.7.2-spark.jar -DgroupId=org.apache.kafka -DartifactId=kafka -Dversion=0.7.2-spark -Dpackaging=jar
In any case I had not considered that someone (om-nom-nom!) would attempt an exact reproduction, so I had also omitted otherwise extraneous items like mergeStrategy and assemblyKeys.
A fully independent reproducible setup may be a while in coming; I have been under rather heavy demands here.

IDEA and scalariform configuration - unresolved symbols even if scalariform works from command line

This question is partially related to "sbt-scalariform plugin - can't resolve settings". I managed to run scalariform from the command line as an sbt task.
Now the problem is with IDEA. When I open my build.sbt, which looks like this:
import scalariform.formatter.preferences._
name := """scheduling-backend"""
version := "1.0"
scalaVersion := "2.10.2"
resolvers += "spray repo" at "http://repo.spray.io"
resolvers += "spray nightlies" at "http://nightlies.spray.io"
resolvers += "SpringSource Milestone Repository" at "http://repo.springsource.org/milestone"
resolvers += "Neo4j Cypher DSL Repository" at "http://m2.neo4j.org/content/repositories/releases"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.3.0",
"com.typesafe.akka" %% "akka-slf4j" % "2.3.0",
"com.typesafe.akka" %% "akka-testkit" % "2.3.0" % "test",
"com.typesafe.akka" %% "akka-persistence-experimental" % "2.3.0",
"io.spray" % "spray-can" % "1.3.0",
"io.spray" % "spray-routing" % "1.3.0",
"io.spray" % "spray-testkit" % "1.3.0" % "test",
"io.spray" %% "spray-json" % "1.2.5",
"ch.qos.logback" % "logback-classic" % "1.0.13",
"org.specs2" %% "specs2" % "1.14" % "test",
"org.springframework.scala" % "spring-scala" % "1.0.0.M2",
"org.springframework.data" % "spring-data-neo4j" % "3.0.0.RELEASE",
"org.springframework.data" % "spring-data-neo4j-rest" % "3.0.0.RELEASE",
"javax.validation" % "validation-api" % "1.1.0.Final",
"com.github.nscala-time" %% "nscala-time" % "0.8.0",
"org.neo4j" % "neo4j-kernel" % "2.0.1" % "test" classifier "tests",
"com.sun.jersey" % "jersey-core" % "1.9",
"org.mockito" % "mockito-all" % "1.9.5"
)
scalacOptions ++= Seq(
"-unchecked",
"-deprecation",
"-Xlint",
"-Ywarn-dead-code",
"-language:_",
"-target:jvm-1.7",
"-encoding", "UTF-8"
)
org.scalastyle.sbt.ScalastylePlugin.Settings
scalariformSettings
ScalariformKeys.preferences := ScalariformKeys.preferences.value
.setPreference(AlignParameters, true)
.setPreference(CompactControlReadability, true)
IDEA reports problems with my file.
I am getting Cannot resolve symbol scalariformSettings and Cannot resolve symbol ScalariformKeys, even though everything works from the terminal.
Adding import com.typesafe.sbt.SbtScalariform._ to build.sbt seems to fix the error on IDEA 13.1.1 with Scala plugin 0.33.403, though I have to admit it ignored the import at first and then randomly started to see it.
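For reference, a sketch of the top of build.sbt with both imports in place; the SbtScalariform import is what lets IDEA resolve scalariformSettings and ScalariformKeys:
import com.typesafe.sbt.SbtScalariform._
import scalariform.formatter.preferences._

scalariformSettings

ScalariformKeys.preferences := ScalariformKeys.preferences.value
  .setPreference(AlignParameters, true)
  .setPreference(CompactControlReadability, true)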