SBT - Class not found, continuing with a stub - scala

I am currently migrating my Play 2 Scala API project and I encounter 10 warnings during compilation, all indicating:
[warn] Class play.core.enhancers.PropertiesEnhancer$GeneratedAccessor not found - continuing with a stub.
All of them are identical, and I don't get any other indication. I've searched a bit for similar cases; it's often caused by the JDK version, but I'm already on 1.8.
Here's what I have in plugins.sbt:
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.3")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.8.0")
addSbtPlugin("com.sksamuel.scapegoat" %% "sbt-scapegoat" % "1.0.4")
and in build.sbt:
libraryDependencies ++= Seq(
  cache,
  ws,
  "org.reactivemongo" %% "play2-reactivemongo" % "0.10.5.0.akka23",
  "org.reactivemongo" %% "reactivemongo" % "0.10.5.0.akka23",
  "org.mockito" % "mockito-core" % "1.10.5" % "test",
  "org.scalatestplus" %% "play" % "1.2.0" % "test",
  "com.amazonaws" % "aws-java-sdk" % "1.8.3",
  "org.cvogt" %% "play-json-extensions" % "0.8.0",
  javaCore,
  "com.clever-age" % "play2-elasticsearch" % "1.1.0" excludeAll(
    ExclusionRule(organization = "org.scala-lang"),
    ExclusionRule(organization = "com.typesafe.play"),
    ExclusionRule(organization = "org.apache.commons", artifact = "commons-lang3")
  )
)
Don't hesitate to ask if you need anything else :)
It's not something that blocks me, but I'd prefer to avoid these 10 warnings every time I recompile my application.
Thank you! :)

It seems something in your code is trying to use the Play enhancer and is failing to find it. Are you using Ebean or something that may require the enhancer?
You can try adding the plugin to your plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-play-enhancer" % "1.1.0")
This should make the warning go away. You can then disable it if you like:
// In build.sbt
playEnhancerEnabled := false

Related

How to specify a different resolver for certain dependencies

I am in a situation where I need to specify a custom resolver for my SBT project, but only to download one or two dependencies; I want all the other dependencies to be fetched from the Maven repository.
Here is my build.sbt file:
...Project definition...
resolvers := Seq(
  "Maven" at "https://repo1.maven.org/"
)
// Akka dependencies
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % akkaActorsVersion,
  "com.typesafe.akka" %% "akka-testkit" % akkaActorsVersion % Test,
  "com.typesafe.akka" %% "akka-stream" % akkaStreamsVersion,
  "com.typesafe.akka" %% "akka-stream-testkit" % akkaStreamsVersion % Test,
  "com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
  "com.typesafe.akka" %% "akka-http-testkit" % akkaHttpVersion % Test,
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.3.0",
  "com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpVersion,
  "io.spray" %% "spray-json" % "1.3.5",
  "de.heikoseeberger" %% "akka-http-circe" % "1.23.0",
  "io.circe" %% "circe-generic" % "0.10.0",
  "com.pauldijou" %% "jwt-core" % "0.13.0",
  "com.pauldijou" %% "jwt-circe" % "0.13.0",
  "org.slf4j" % "slf4j-simple" % "1.6.4",
  "com.microsoft.azure" % "azure-storage" % "8.4.0",
  "com.datastax.cassandra" % "cassandra-driver-extras" % "3.1.4",
  "io.jvm.uuid" %% "scala-uuid" % "0.3.0",
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  "org.cassandraunit" % "cassandra-unit" % "3.1.1.0" % "test",
  "io.monix" %% "monix" % "3.0.0-8084549",
  "org.bouncycastle" % "bcpkix-jdk15on" % "1.48"
)
resolvers := Seq("Artifactory" at "http://10.3.1.6:8081/artifactory/libs-release-local/")
credentials += Credentials("Artifactory Realm", "10.3.1.6", ARTIFACTORY_USER, ARTIFACTORY_PASSWORD)
libraryDependencies ++= Seq(
  "com.org" % "common-layer_2.11" % "0.3"
)
However, the build fails with errors saying that SBT is trying to fetch the libraries from Artifactory instead of from Maven.
For example, for the Cassandra driver dependency:
unresolved dependency: com.datastax.cassandra#cassandra-driver-extras;3.1.4: Artifactory: unable to get resource for com/datastax/cassandra#cassandra-driver-extras;3.1.4: res=http://10.3.1.6:8081/artifactory/libs-release-local/com/datastax/cassandra/cassandra-driver-extras/3.1.4/cassandra-driver-extras-3.1.4.pom
I have searched the internet and the documentation and I don't see a clear way to handle this, which surprises me because this seems like a common problem.
Any ideas about how I could enforce the priorities/ordering of resolvers in SBT?
Please note that when you do
resolvers := Seq("resolver" at "https://path")
you are overriding the existing user-defined additional resolvers. Therefore, if you do:
resolvers := Seq("resolver1" at "https://path1")
resolvers := Seq("resolver2" at "https://path2")
you end up with only resolver2.
In order to have both resolvers, you need to do something like:
resolvers ++= Seq(
  "resolver1" at "https://path1",
  "resolver2" at "https://path2"
)
SBT searches for dependencies in the order of the given resolvers. This means that in the example above, it will search resolver1 first, and only if the artifact is not found there will it move on to resolver2.
Another thing you need to know is that SBT has predefined resolvers.
You can read more about sbt resolvers at: https://www.scala-sbt.org/1.x/docs/Resolvers.html
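Putting that together for the build in the question, a minimal untested sketch would keep a single resolver chain and a single credentials setting (note the lowercase credentials key; ARTIFACTORY_USER and ARTIFACTORY_PASSWORD are the placeholders from the question):
// One combined resolver chain, ordered by priority: Maven Central is
// searched first, and the private Artifactory is consulted only for
// artifacts that Central does not have.
resolvers ++= Seq(
  "Maven" at "https://repo1.maven.org/maven2/",
  "Artifactory" at "http://10.3.1.6:8081/artifactory/libs-release-local/"
)
credentials += Credentials("Artifactory Realm", "10.3.1.6", ARTIFACTORY_USER, ARTIFACTORY_PASSWORD)
// All dependencies go in one setting; each is resolved against the chain above.
libraryDependencies ++= Seq(
  "com.datastax.cassandra" % "cassandra-driver-extras" % "3.1.4",
  "com.org" % "common-layer_2.11" % "0.3"
)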

sbt different libraryDependencies in test than in normal mode

Because of conflicting / transitive (elasticsearch / lucene / jackrabbit) dependencies, I want to have different libraryDependencies in test than when normally running the app. I solved it with the setup below, but this requires running activator with -Dtest, and that prevents my app from running normally when I'm done testing. The other way around, i.e. running just activator, will run my app but not my tests. So it's not very convenient, and I think this can be done much better (btw, I'm very new to sbt/scala).
name := """example"""
version := "0.1"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.1"
// fork in Test := true
javaOptions in Test += "-Dconfig.file=conf/application.test.conf"
javaOptions in Test += "-Dlogger.file=conf/test-logger.xml"
// run activator -Dtest
if (sys.props.contains("test")) {
  Seq[Project.Setting[_]](
    libraryDependencies ++= Seq(
      javaJdbc,
      javaEbean,
      cache,
      javaWs,
      "org.webjars" %% "webjars-play" % "2.3.0-2",
      "org.webjars" % "bootstrap" % "3.3.6",
      "org.webjars" % "font-awesome" % "4.5.0",
      "be.objectify" %% "deadbolt-java" % "2.3.3",
      "org.apache.lucene" % "lucene-core" % "3.6.0",
      "org.elasticsearch" % "elasticsearch" % "1.7.4" exclude("org.apache.lucene", "lucene-core"),
      "javax.jcr" % "jcr" % "2.0",
      "org.apache.jackrabbit" % "jackrabbit-core" % "2.11.2",
      "org.apache.jackrabbit" % "jackrabbit-jcr2dav" % "2.11.2",
      "org.apache.tika" % "tika-parsers" % "1.11",
      "org.apache.tika" % "tika-core" % "1.11",
      "commons-io" % "commons-io" % "2.4",
      "com.typesafe.akka" % "akka-testkit_2.11" % "2.3.14" % "test"
    )
  )
} else {
  Seq[Project.Setting[_]](
    libraryDependencies ++= Seq(
      javaJdbc,
      javaEbean,
      cache,
      javaWs,
      "org.webjars" %% "webjars-play" % "2.3.0-2",
      "org.webjars" % "bootstrap" % "3.3.6",
      "org.webjars" % "font-awesome" % "4.5.0",
      "be.objectify" %% "deadbolt-java" % "2.3.3",
      "org.elasticsearch" % "elasticsearch" % "1.7.4",
      "javax.jcr" % "jcr" % "2.0",
      "org.apache.jackrabbit" % "jackrabbit-core" % "2.11.2",
      "org.apache.jackrabbit" % "jackrabbit-jcr2dav" % "2.11.2",
      "org.apache.tika" % "tika-parsers" % "1.11",
      "org.apache.tika" % "tika-core" % "1.11",
      "commons-io" % "commons-io" % "2.4",
      "com.typesafe.akka" % "akka-testkit_2.11" % "2.3.14" % "test"
    )
  )
}
//.. our private nexus repo left out here
resolvers += "JBoss Repository" at "https://repository.jboss.org/nexus/content/repositories"
resolvers += "JBoss Third-Party Repository" at "https://repository.jboss.org/nexus/content/repositories/thirdparty-releases"
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += Resolver.url("Objectify Play Repository", url("http://deadbolt.ws/releases/"))(Resolver.ivyStylePatterns)
I don't have a setup where I can really test whether this works, but from how I understand sbt dependencies, it should:
Dependencies can have a kind of scope called a configuration. Typically, this is used to define test-only dependencies:
"com.typesafe.akka" % "akka-testkit_2.11" % "2.3.14" % "test"
But you should also be able to define compile-time-only and run-time-only dependencies using "compile" and "runtime" instead.
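For illustration, a hedged sketch of that syntax (the slf4j artifact here is an arbitrary example, not taken from the build above):
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-simple" % "1.7.5" % "runtime",            // run-time classpath only
  "com.typesafe.akka" % "akka-testkit_2.11" % "2.3.14" % "test"  // test classpath only
)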
sbt prints a warning if you use different versions of the same dependency. The problem with that approach is that it compiles the application against one version of a dependency and then runs it with tests against another, so the code runs against a different version than the one it was compiled with. There are of course libraries where this will work, especially if you run with a newer version than the one you compiled against.
If you really need to compile your application twice with different dependencies and use one build for running and one for testing, I fear there won't be a solution without extending sbt or something like that.
You could try to make two modules, one with the main code and one for testing, and then try to cross-build two different versions of the first module. Sbt can easily cross-build over multiple Scala versions, but I don't think it can do that out of the box for multiple versions of a library.
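For what it's worth, a rough, untested sketch of that two-module idea (module names are hypothetical, and only the elasticsearch/lucene pair is shown):
// Main code, built with the normal dependency set.
lazy val core = (project in file("core"))
  .settings(
    libraryDependencies += "org.elasticsearch" % "elasticsearch" % "1.7.4"
  )

// Test module that depends on core but swaps in the test-friendly set.
lazy val coreTests = (project in file("core-tests"))
  .dependsOn(core)
  .settings(
    libraryDependencies ++= Seq(
      "org.apache.lucene" % "lucene-core" % "3.6.0",
      "org.elasticsearch" % "elasticsearch" % "1.7.4" exclude("org.apache.lucene", "lucene-core")
    )
  )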
Thanks @dth, you put me on the right track. The settings below worked for me:
libraryDependencies ++= Seq(
  javaJdbc,
  javaEbean,
  cache,
  javaWs,
  "org.webjars" %% "webjars-play" % "2.3.0-2",
  "org.webjars" % "bootstrap" % "3.3.6",
  "org.webjars" % "font-awesome" % "4.5.0",
  "be.objectify" %% "deadbolt-java" % "2.3.3",
  "org.apache.lucene" % "lucene-core" % "3.6.0" % "compile,test",
  "org.elasticsearch" % "elasticsearch" % "1.7.4" % "compile,runtime",
  "org.elasticsearch" % "elasticsearch" % "1.7.4" % "test" exclude("org.apache.lucene", "lucene-core"),
  "javax.jcr" % "jcr" % "2.0",
  "org.apache.jackrabbit" % "jackrabbit-core" % "2.11.2",
  "org.apache.jackrabbit" % "jackrabbit-jcr2dav" % "2.11.2",
  "org.apache.tika" % "tika-parsers" % "1.11",
  "org.apache.tika" % "tika-core" % "1.11",
  "commons-io" % "commons-io" % "2.4",
  "com.typesafe.akka" % "akka-testkit_2.11" % "2.3.14" % "test"
)

Spark application throws javax.servlet.FilterRegistration

I'm using Scala to create and run a Spark application locally.
My build.sbt:
name := "SparkDemo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" exclude("org.apache.hadoop", "hadoop-client")
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.2.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(
ExclusionRule(organization = "org.eclipse.jetty"))
libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
mainClass in Compile := Some("demo.TruckEvents")
During runtime I get the exception:
Exception in thread "main" java.lang.ExceptionInInitializerError
during calling of... Caused by: java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
The exception is triggered here:
val sc = new SparkContext("local", "HBaseTest")
I am using the IntelliJ Scala/SBT plugin.
I've seen that other people have had this problem too and a solution was suggested. But that was a Maven build... Is my sbt setup wrong here? Or any other suggestion for how I can solve this problem?
See my answer to a similar question here. The class conflict comes about because HBase depends on org.mortbay.jetty, and Spark depends on org.eclipse.jetty. I was able to resolve the issue by excluding org.mortbay.jetty dependencies from HBase.
If you're pulling in hadoop-common, then you may also need to exclude javax.servlet from hadoop-common. I have a working HBase/Spark setup with my sbt dependencies set up as follows:
val clouderaVersion = "cdh5.2.0"
val hadoopVersion = s"2.5.0-$clouderaVersion"
val hbaseVersion = s"0.98.6-$clouderaVersion"
val sparkVersion = s"1.1.0-$clouderaVersion"
val hadoopCommon = "org.apache.hadoop" % "hadoop-common" % hadoopVersion % "provided" excludeAll ExclusionRule(organization = "javax.servlet")
val hbaseCommon = "org.apache.hbase" % "hbase-common" % hbaseVersion % "provided"
val hbaseClient = "org.apache.hbase" % "hbase-client" % hbaseVersion % "provided"
val hbaseProtocol = "org.apache.hbase" % "hbase-protocol" % hbaseVersion % "provided"
val hbaseHadoop2Compat = "org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion % "provided"
val hbaseServer = "org.apache.hbase" % "hbase-server" % hbaseVersion % "provided" excludeAll ExclusionRule(organization = "org.mortbay.jetty")
val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
val sparkStreamingKafka = "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion exclude("org.apache.spark", "spark-streaming_2.10")
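These vals still have to be wired into the build; presumably something along these lines:
// Register all of the above as dependencies of the project.
libraryDependencies ++= Seq(
  hadoopCommon, hbaseCommon, hbaseClient, hbaseProtocol, hbaseHadoop2Compat,
  hbaseServer, sparkCore, sparkStreaming, sparkStreamingKafka
)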
If you are using IntelliJ IDEA, try this:
Right click the project root folder, choose Open Module Settings
In the new window, choose Modules in the left navigation column
In the rightmost column, select the Dependencies tab and find Maven: javax.servlet:servlet-api:2.5
Finally, just move this item to the bottom by pressing ALT+Down.
It should solve this problem.
This method came from http://wpcertification.blogspot.ru/2016/01/spark-error-class-javaxservletfilterreg.html
If it is happening in IntelliJ IDEA, go to the project settings, find the jar in the modules, and remove it. Then run your code with sbt through the shell; it will fetch the jar files itself. Then go back to IntelliJ and re-run the code through IntelliJ. It somehow works for me and fixes the error. I am not sure what the problem was, since it doesn't show up anymore.
Oh, I also removed the jar file and added "javax.servlet:javax.servlet-api:3.1.0" through Maven by hand, and now the error is gone.
When you use SBT, the FilterRegistration class is present in the servlet 3.0 API, but if you use Jetty or Java 8, the 2.5 version of this JAR gets added automatically as a dependency.
Fix: the servlet-api-2.5 JAR was the culprit here; I resolved the issue by adding the servlet-api-3.0 jar to the dependencies.
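In sbt terms, that fix presumably amounts to something like this (the exact 3.0.x coordinates are my assumption, not the answerer's):
// Pull in the servlet 3.x API explicitly so it takes precedence over the 2.5 JAR.
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"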
For me, the following works:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion.value % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion.value % "provided",
  ....................................................................
).map(_.excludeAll(ExclusionRule(organization = "javax.servlet")))
If you are running inside IntelliJ, please check in the project settings whether you have two active modules (one for the project and another for sbt).
It is probably a problem with importing an existing project.
Try running a simple program without the hadoop and hbase dependencies:
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(ExclusionRule(organization = "org.eclipse.jetty"))
libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
There is probably a mismatch in the dependencies. Also make sure you have the same version of the jars when you compile and when you run.
Also, is it possible to reproduce this in the Spark shell? I will be able to help better.
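One way to check for such version mismatches, assuming sbt 0.13.6 or later, is sbt's built-in evicted task, which lists dependencies that lost out to other versions during resolution:
# run from the project root
sbt evicted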

Why does IntelliJ IDEA debugger jump to wrong Scala version library?

I am using IntelliJ IDEA 13.1.2 with the Scala plugin version 0.36.431 on Windows 7 with sbt 0.13.1.
The following project definition build.sbt has no references to any Scala version other than 2.9.3.
import sbt._
import Keys._
import AssemblyKeys._
import NativePackagerKeys._
name := "simplews"
version := "0.1.0-SNAPSHOT"
val sparkVersion = "0.8.1-incubating"
scalaVersion := "2.9.3"
val akkaVersion = "2.0.5"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.9.3" % sparkVersion % "compile->default" withSources(),
  "org.apache.spark" % "spark-examples_2.9.3" % sparkVersion % "compile->default" withSources(),
  "org.apache.spark" % "spark-tools_2.9.3" % sparkVersion % "compile->default" withSources(),
  "org.scalatest" % "scalatest_2.9.3" % "1.9.2" % "test" withSources(),
  "org.apache.spark" % "spark-repl_2.9.3" % sparkVersion % "compile->default" withSources(),
  "org.apache.kafka" % "kafka" % "0.7.2-spark",
  "com.thenewmotion.akka" % "akka-rabbitmq_2.9.2" % "0.0.2" % "compile->default" withSources(),
  "com.typesafe.akka" % "akka-actor" % akkaVersion % "compile->default" withSources(),
  "com.typesafe.akka" % "akka-testkit" % akkaVersion % "compile->default" withSources(),
  "com.rabbitmq" % "amqp-client" % "3.0.1" % "compile->default" withSources(),
  "org.specs2" % "specs2_2.9.3" % "1.12.4.1" % "compile->default" withSources(),
  "com.nebhale.jsonpath" % "jsonpath" % "1.2" % "compile->default" withSources(),
  "org.mockito" % "mockito-all" % "1.8.5",
  "junit" % "junit" % "4.11"
)
packagerSettings
packageArchetype.java_application
resolvers ++= Seq(
  "Apache repo" at "https://repository.apache.org/content/repositories/releases",
  "Cloudera repo" at "https://repository.cloudera.com/artifactory/repo/org/apache/kafka/kafka/0.7.2-spark/",
  "akka rabbitmq" at "http://nexus.thenewmotion.com/content/repositories/releases-public",
  "Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
  Resolver.mavenLocal
)
However, as seen in the screenshot, the debugger has jumped to Scala 2.10.2. Note: the debugger correctly goes to 2.9.3 for some other debugging.
Here is project/plugins.sbt:
resolvers += "sbt-plugins" at "http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-RC2")
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
EDIT: In order to reproduce, it is necessary to do a local mvn install of one or two libraries that are not available in any public repo.
mvn org.apache.maven.plugins:maven-install-plugin:2.5.1:install-file -Dfile=c:\shared\kafka-0.7.2-spark.jar -DgroupId=org.apache.kafka -DartifactId=kafka -Dversion=0.7.2-spark -Dpackaging=jar
I had in any case not considered that someone (om-nom-nom!) would attempt an exact reproduction, so I had also omitted otherwise extraneous items like mergeStrategy and assemblyKeys.
A fully independent reproducible setup may be a while in coming; I have been under rather heavy demands here.

SBT: libraryDependencies missing from classpath of jetty-run

I have set up a webapp project with sbt 0.10.1. One of the library dependencies is Jersey. My build.sbt file looks as follows:
seq(webSettings :_*)
scalaVersion := "2.8.1"
libraryDependencies ++= Seq(
  "javax.ws.rs" % "jsr311-api" % "1.1" % "provided, jetty",
  "com.sun.jersey" % "jersey-server" % "1.9" % "provided, jetty" from "http://download.java.net/maven/2/",
  "org.eclipse.jetty" % "jetty-webapp" % "7.3.0.v20110203" % "jetty",
  "ch.qos.logback" % "logback-classic" % "0.9.26",
  "org.eclipse.jetty" % "jetty-servlet" % "7.3.0.v20110203"
)
On the sbt console I run reload, update, compile, prepare-webapp, jetty-run, in that order.
Everything seems to be OK except jetty-run, where I get a ClassNotFoundException:
java.lang.ClassNotFoundException: com.sun.jersey.spi.container.servlet.ServletContainer
This is because the Jersey library is not copied into target/webapp/WEB-INF/lib/ during jetty-run. So I guess there must be some flaw in my setup of build.sbt.
Does anyone have an idea what could be wrong here?
Thank you very much in advance!
Michael
You have the Jersey dependency
"com.sun.jersey" % "jersey-server" % "1.9" % "provided, jetty"
from "http://download.java.net/maven/2/"
but when you check the directory under the path you've provided,
http://download.java.net/maven/2/com/sun/jersey/jersey-server/
you find that there is no 1.9 folder, only 1.9-SNAPSHOT. I didn't try it, but that should probably help.
It turned out that there were multiple issues with the build.sbt file.
I have modified the file as follows to get it working:
libraryDependencies ++= Seq(
  "javax.ws.rs" % "jsr311-api" % "1.1" % "provided, jetty",
  "com.sun.jersey" % "jersey-server" % "1.8" from "http://download.java.net/maven/2/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar",
  "com.sun.jersey" % "jersey-core" % "1.8" from "http://download.java.net/maven/2/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar",
  "org.eclipse.jetty" % "jetty-webapp" % "7.3.0.v20110203" % "jetty",
  "ch.qos.logback" % "logback-classic" % "0.9.26",
  "org.eclipse.jetty" % "jetty-servlet" % "7.3.0.v20110203",
  "asm" % "asm" % "3.1"
)
One important thing to note is that I had to remove the "provided, jetty" part from the Jersey dependencies. Otherwise they would not be copied to target/webapp/WEB-INF/lib during prepare-webapp, since provided-scope dependencies are assumed to be supplied by the container and are therefore not packaged.