Is it possible to change the order of dependencies on the classpath in sbt for Scala compilation?
I have two unmanaged dependencies in the ./lib/ folder and the rest are managed dependencies in the sbt file:
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api" % "1.7.10",
  "org.slf4j" % "slf4j-simple" % "1.7.10",
  "org.slf4j" % "slf4j-log4j12" % "1.7.10",
  "com.typesafe" % "config" % "1.0.2",
  "edu.washington.cs.knowitall.taggers" %% "taggers-core" % "0.4",
  "com.rockymadden.stringmetric" % "stringmetric-core" % "0.25.3",
  "org.apache.solr" % "solr-solrj" % "4.3.1",
  "com.twitter" %% "util-collection" % "6.3.6",
  "org.scalaj" %% "scalaj-http" % "0.3.10",
  "commons-logging" % "commons-logging" % "1.2"
)
In Eclipse I can run my program, because I can change the order of entries on the Java build path (I put the unmanaged dependencies at the end).
However, when I run it from the terminal:
sbt "run-main de.questionParser.Test"
I get the following error
[error] Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.collect.Lists.reverse(Ljava/util/List;)Ljava/util/List;
So the final question is:
Is it possible to change the classpath order in sbt for Scala compilation so that managed dependencies come before unmanaged dependencies?
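A sketch of the kind of reordering I have in mind, assuming sbt 0.13-style syntax and that the unmanaged JARs live under lib/ (untested):
fullClasspath in Runtime := {
  val cp = (fullClasspath in Runtime).value
  // push the unmanaged lib/ JARs behind the managed dependencies
  val (unmanaged, managed) = cp.partition(_.data.getPath.contains("/lib/"))
  managed ++ unmanaged
}
Note that reordering only masks the underlying clash: the NoSuchMethodError suggests one of the unmanaged JARs bundles an old Guava (or google-collections) without Lists.reverse, so replacing or upgrading that JAR would be the more robust fix.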
Related
Scala/JVM noob here who wants to understand more about logging, specifically when using Apache Spark.
I have written a library in Scala that depends upon a bunch of Spark libraries, here are my dependencies:
import sbt._

object Dependencies {
  object Version {
    val spark = "2.2.0"
    val scalaTest = "3.0.0"
  }

  val deps = Seq(
    "org.apache.spark" %% "spark-core" % Version.spark,
    "org.scalatest" %% "scalatest" % Version.scalaTest,
    "org.apache.spark" %% "spark-hive" % Version.spark,
    "org.apache.spark" %% "spark-sql" % Version.spark,
    "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
    "ch.qos.logback" % "logback-core" % "1.2.3",
    "ch.qos.logback" % "logback-classic" % "1.2.3",
    "com.typesafe.scala-logging" %% "scala-logging" % "3.8.0",
    "com.typesafe" % "config" % "1.3.2"
  )

  val exc = Seq(
    ExclusionRule("org.slf4j", "slf4j-log4j12")
  )
}
(admittedly I copied a lot of this from elsewhere).
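Presumably this object is wired into build.sbt along these lines, with the exclusion rules applied to every dependency (an assumption; the question doesn't show this part):
libraryDependencies ++= Dependencies.deps.map(_.excludeAll(Dependencies.exc: _*))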
I am able to package my code as a JAR using sbt package which I can then call from Spark by placing the JAR into ${SPARK_HOME}/jars. This is working great.
I now want to implement logging from my code so I do this:
import com.typesafe.scalalogging.Logger
/*
* stuff stuff stuff
*/
val logger : Logger = Logger("name")
logger.info("stuff")
However, when I try to call my library (which I'm doing from Python, not that I think that's relevant here) I get an error:
py4j.protocol.Py4JJavaError: An error occurred while calling z:com.company.package.class.function.
E : java.lang.NoClassDefFoundError: com/typesafe/scalalogging/Logger$
Clearly this is because the com.typesafe.scala-logging library is not in my JAR. I know I could solve this by packaging with sbt assembly, but I don't want to do that because it will pull in all the other dependencies and make my JAR enormous.
Is there a way to selectively include libraries (com.typesafe.scala-logging in this case) in my JAR? Alternatively, should I be attempting to log using another method, perhaps using a logger that is included with Spark?
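One possibility worth noting is sbt-assembly's assemblyExcludedJars key, which filters individual JARs out of the fat JAR. A sketch (untested, and the file-name matching is an assumption):
// build.sbt — assumes the sbt-assembly plugin; drops every dependency
// JAR except scala-logging, while keeping the project's own classes
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp.filter { entry =>
    val name = entry.data.getName
    name.endsWith(".jar") && !name.startsWith("scala-logging")
  }
}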
Thanks to pasha701 in the comments, I attempted packaging my dependencies using sbt assembly rather than sbt package, with the Spark dependencies marked as Provided:
import sbt._

object Dependencies {
  object Version {
    val spark = "2.2.0"
    val scalaTest = "3.0.0"
  }

  val deps = Seq(
    "org.apache.spark" %% "spark-core" % Version.spark % Provided,
    "org.scalatest" %% "scalatest" % Version.scalaTest,
    "org.apache.spark" %% "spark-hive" % Version.spark % Provided,
    "org.apache.spark" %% "spark-sql" % Version.spark % Provided,
    "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
    "ch.qos.logback" % "logback-core" % "1.2.3",
    "ch.qos.logback" % "logback-classic" % "1.2.3",
    "com.typesafe.scala-logging" %% "scala-logging" % "3.8.0",
    "com.typesafe" % "config" % "1.3.2"
  )

  val exc = Seq(
    ExclusionRule("org.slf4j", "slf4j-log4j12")
  )
}
Unfortunately, even with the Spark dependencies specified as Provided, my JAR went from 324K to 12M, hence I opted to use println() instead. Here is my commit message:
log using println
I went with the println option because it keeps the size of the JAR small.
I trialled use of com.typesafe.scalalogging.Logger but my tests failed with error:
java.lang.NoClassDefFoundError: com/typesafe/scalalogging/Logger
because that isn't provided with Spark. I attempted to use sbt assembly
instead of sbt package but this caused the size of the JAR to go from
324K to 12M, even with spark dependencies set to Provided. A 12M JAR
isn't worth the trade-off just to use scalaLogging, hence using println
instead.
I note that pasha701 suggested using log4j instead as that is provided with Spark so I shall try that next. Any advice on using log4j from Scala when writing a Spark library would be much appreciated.
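For what it's worth, a minimal sketch of the log4j route, assuming the log4j 1.x API that ships with Spark 2.x (the object name is a placeholder):
import org.apache.log4j.Logger

object MyJob {
  // Spark provides log4j at runtime, so nothing extra needs packaging
  @transient lazy val log: Logger = Logger.getLogger(getClass.getName)

  def run(): Unit = {
    log.info("stuff")
  }
}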
As you said, sbt assembly will include all the dependencies in your JAR.
If you only want certain ones included, there are two options:
Download logback-core and logback-classic and add them via the --jars option of spark2-submit
Specify the above deps via the --packages option of spark2-submit
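For example (a sketch; the JAR paths, main class, and application JAR are placeholders):
# option 1: ship the logback JARs alongside the application
spark2-submit --jars logback-core-1.2.3.jar,logback-classic-1.2.3.jar --class com.company.Main app.jar
# option 2: let Spark resolve them from Maven coordinates
spark2-submit --packages ch.qos.logback:logback-core:1.2.3,ch.qos.logback:logback-classic:1.2.3 --class com.company.Main app.jar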
I'm trying to build a Scalatra application that runs code with Spark. I can actually build the fat JAR with sbt-assembly and the endpoints work, but when running tests with org.scalatra.test.scalatest._ I get the following error:
*** RUN ABORTED ***
java.lang.NoSuchFieldError: INSTANCE
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.<clinit>(SSLConnectionSocketFactory.java:146)
at org.apache.http.impl.client.HttpClientBuilder.build(HttpClientBuilder.java:964)
at org.scalatra.test.HttpComponentsClient$class.createClient(HttpComponentsClient.scala:100)
at my.package.MyServletTests.createClient(MyServletTests.scala:5)
at org.scalatra.test.HttpComponentsClient$class.submit(HttpComponentsClient.scala:63)
at my.package.MyServletTests.submit(MyServletTests.scala:5)
at org.scalatra.test.Client$class.post(Client.scala:62)
at my.package.MyServletTests.post(MyServletTests.scala:5)
at org.scalatra.test.Client$class.post(Client.scala:60)
at my.package.MyServletTests.post(MyServletTests.scala:5)
...
From other sources, this seems to be an httpclient version conflict, since Scalatra and Spark depend on different versions. These sources suggest using the Maven Shade Plugin to shade one of the versions. I am, however, using sbt instead of Maven. Even though sbt-assembly offers shading, it only applies when building the fat JAR, and I need a solution that works during development tests.
Sources are:
Conflict between httpclient version and Apache Spark
Error while trying to use an API. java.lang.NoSuchFieldError: INSTANCE
Is there any way to resolve this kind of conflict using sbt? I'm running Eclipse Scala-IDE and these are my dependencies in build.sbt:
val scalatraVersion = "2.6.5"
// scalatra
libraryDependencies ++= Seq(
  "org.scalatra" %% "scalatra" % scalatraVersion,
  "org.scalatra" %% "scalatra-scalatest" % scalatraVersion % "test",
  "org.scalatra" %% "scalatra-specs2" % scalatraVersion,
  "org.scalatra" %% "scalatra-swagger" % scalatraVersion,
  "ch.qos.logback" % "logback-classic" % "1.2.3" % "runtime",
  "org.eclipse.jetty" % "jetty-webapp" % "9.4.9.v20180320" % "container;compile",
  "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided"
)
// From other projects:
// spark
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-mllib" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0"
)
// scalatest
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.5" % "test"
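One thing worth trying (a sketch, not a verified fix for this particular clash) is pinning a single httpclient version for the whole build with dependencyOverrides, so that Scalatra's test client and Spark resolve the same artifact. The version here is an assumption; pick one both sides tolerate:
// build.sbt
dependencyOverrides += "org.apache.httpcomponents" % "httpclient" % "4.5.6"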
I am trying to set up IntelliJ IDEA with sbt for a Scala project. The external dependencies are specified in my build.sbt and also listed inside the IDE, as can be seen in the screenshot. However, I still get compiler errors saying the respective symbols cannot be resolved. Can anyone point me in the right direction?
Contents of my build.sbt:
lazy val midas = (project in file("."))
  .settings(
    name := "test",
    mainClass in assembly := Some("core.Service"),
    assemblyJarName in assembly := "test.jar",
    test in assembly := {},
    libraryDependencies ++= Seq(
      "org.slf4j" % "slf4j-api" % "1.7.25",
      "com.typesafe.akka" %% "akka-actor" % "2.5.13",
      "com.typesafe.akka" %% "akka-slf4j" % "2.5.13",
      "com.typesafe.akka" %% "akka-remote" % "2.5.13",
      "org.scala-lang.modules" %% "scala-xml" % "1.1.0",
      "com.typesafe.play" %% "play-json" % "2.6.9",
      "com.typesafe.slick" %% "slick" % "3.2.3",
      "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0",
      "ch.qos.logback" % "logback-classic" % "1.2.3",
      "ch.qos.logback" % "logback-core" % "1.2.3",
      "com.mchange" % "c3p0" % "0.9.5.2",
      "joda-time" % "joda-time" % "2.10",
      "org.joda" % "joda-convert" % "2.0.1",
      "net.sourceforge.jtds" % "jtds" % "1.3.1"
    )
  )
Still, the dependencies for both akka and joda-time cannot be resolved inside the IDE. Compiling with sbt from the command line works fine, however.
joda-convert does not include org.joda.time.DateTime - you need "joda-time" % "joda-time" % "2.9.4" or so.
Once you have added this (and fixed your missing Akka import, maybe "com.typesafe.akka" %% "akka-actor" % "2.4.17"), you need to refresh the sbt project.
I press Shift, Shift to bring up "Search Everywhere" and type "sbt". Choose sbt under Tool Windows, then click "Refresh all sbt projects" (blue reload icon) in the sbt window. I close this window because it is not usually useful.
Hope that helps.
I'm using Typesafe Activator's latest version (1.2.8 with Scala 2.11.x).
When I add a dependency "org.apache.httpcomponents" %% "httpclient" % "4.4-alpha1" to my project in build.sbt, something like this:
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.3.4",
  "com.typesafe.akka" %% "akka-testkit" % "2.3.4",
  "org.scalatest" %% "scalatest" % "2.1.6" % "test",
  "junit" % "junit" % "4.11" % "test",
  "com.novocode" % "junit-interface" % "0.10" % "test",
  "org.apache.httpcomponents" %% "httpclient" % "4.4-alpha1" // my added dependency
)
... and try to update the project (in Activator's CLI), I get an error:
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.httpcomponents#httpclient_2.11;4.4-alpha1: not found
I know Scala versions aren't binary compatible, but I'm trying to fetch a pure Java library, org.apache.httpcomponents#httpclient! Why does Activator append "_2.11" to the artifactId and produce the wrong URLs? How do I resolve this?
Change the %% to a single % on the httpclient dependency. The double-character version is for fetching cross-built libraries, and yours isn't one.
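Concretely, the corrected line would be:
"org.apache.httpcomponents" % "httpclient" % "4.4-alpha1" // plain %: httpclient is a Java library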
I have a problem building a module in a project with sbt 0.13.5 and Scala 2.9.3. I have the dependency defined as
"com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.2.3" % "provided"
and even when scalaVersion is explicitly set, it throws this error:
[error] Modules were resolved with conflicting cross-version suffixes in {file:/Users/me/Code/project/}module:
[error] com.fasterxml.jackson.module:jackson-module-scala _2.9.2, _2.9.3
This is a provided dependency that works in another module of the same project with the same resolvers and the same explicit Scala version. The only difference is in the additional dependencies; the failing module has
"bouncycastle" % "bcprov-jdk16" % "140" % "provided",
"com.jolbox" % "bonecp" % "0.7.1.RELEASE" % "provided",
"mysql" % "mysql-connector-java" % "5.1.18" % "provided",
"org.liquibase" % "liquibase-core" % "3.0.5" % "provided",
"org.jdbi" % "jdbi" % "2.51" % "provided",
"javax.mail" % "mail" % "1.4" % "provided",
"com.twitter" % "finagle-redis" % "6.6.2" % "provided",
"com.twitter" % "finatra" % "1.3.9" % "provided"
Any idea how to fix this? It appeared when I updated to sbt 0.13; with 0.12 it works fine.
OK, I see what the problem is: finatra 1.3.9 is compiled against Scala 2.9.2 and has a dependency on fasterxml. I tried to fix it with cross-version mapping, but it's a nightmare of dependencies, so upgrading all dependencies to Scala 2.10 is working so far.
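Had upgrading not been an option, another route might have been to exclude the transitive _2.9.2 artifact that finatra pulls in (a sketch, untested):
"com.twitter" % "finatra" % "1.3.9" % "provided" exclude("com.fasterxml.jackson.module", "jackson-module-scala_2.9.2")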