How to use playframework 2.3 with specs2 2.4 instead of specs2 2.3.x

Recently, specs2 was updated to version 2.4, which now uses scalaz 7.1 instead of 7.0.x. Once I update the specs2 dependency in my Play 2.3 project to version 2.4, all tests fail with the following exception:
[error] Uncaught exception when running ...Spec: java.lang.IncompatibleClassChangeError: Found class scalaz.syntax.FunctorOps, but interface was expected
sbt.ForkMain$ForkError: Found class scalaz.syntax.FunctorOps, but interface was expected
at org.specs2.specification.SpecificationStructure$.createSpecificationEither(BaseSpecification.scala:119)
at org.specs2.runner.SbtRunner.org$specs2$runner$SbtRunner$$specificationRun(SbtRunner.scala:73)
at org.specs2.runner.SbtRunner$$anonfun$newTask$1$$anon$5.execute(SbtRunner.scala:59)
at sbt.ForkMain$Run$2.call(ForkMain.java:294)
at sbt.ForkMain$Run$2.call(ForkMain.java:284)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Nobody seems to have had this error before; at least I was unable to find it in the issue trackers of the specs2 and Play projects.

I got it working in Play 2.3.8 with these settings:
"org.scalaz" %% "scalaz-core" % "7.1.1",
"com.typesafe.play" %% "play-test" % "2.3.8" % "test" excludeAll(
ExclusionRule(organization = "org.specs2")
),
"org.specs2" %% "specs2-core" % "3.5" % "test",
"org.specs2" %% "specs2-junit" % "3.5" % "test",
"org.specs2" %% "specs2-mock" % "3.5" % "test"

"com.typesafe.play" %% "play-test" % "2.3.3" depends on specs2 2.3.12, and specs2 2.3.12 depends on scalaz 7.0.6
https://github.com/playframework/playframework/blob/2.3.3/framework/project/Dependencies.scala#L9-L15
https://github.com/playframework/playframework/blob/2.3.3/framework/project/Dependencies.scala#L182
https://github.com/playframework/playframework/blob/2.3.3/framework/project/Build.scala#L276
You cannot (and should not) use these together, because scalaz 7.0.6 and 7.1.0 are binary incompatible.
If you want to use Play 2 and scalaz 7.1 together, I think there are a few options:
1. Exclude the "play-test" dependency (see the sketch below): libraryDependencies ~= { _.filterNot(m => m.organization == "com.typesafe.play" && m.name == "play-test") }
2. Wait for Play 2.4: https://github.com/playframework/playframework/pull/3330
3. Rebuild the "play-test" module against scalaz 7.1: https://github.com/playframework/playframework/tree/2.3.3/framework/src/play-test/src/main/scala/play/api/test

Related

Upgrade from Scala 2.11.8 to 2.12.10: build fails in sbt due to conflicting cross-version suffixes

I am trying to upgrade my Scala version from 2.11.8 to 2.12.10. I made the following changes in my sbt file:
"org.apache.spark" %% "spark-core" % "2.4.7" % "provided",
"org.apache.spark" %% "spark-sql" % "2.4.7" % "provided",
"com.holdenkarau" %% "spark-testing-base" % "3.1.2_1.1.0" % "test"
When I build the project, I get the following error:
[error] Modules were resolved with conflicting cross-version suffixes in ProjectRef(uri("file:/Users/user/IdeaProjects/project/"), "root"):
[error] io.reactivex:rxscala _2.12, _2.11
I tried the following, but with no luck:
1. ("io.reactivex" % "rxscala_2.12" % "0.27.0").force().exclude("io.reactivex", "rxscala_2.11")
2. Removed Scala version 2.11.8 from File -> Project Structure -> Global Libraries.
Any help will be very useful.
It would be best if you could post your entire build.sbt file so it could be reproduced, but as a starting point, the dependency spark-testing-base is pointing to the wrong Spark version. From the documentation:
So you include com.holdenkarau.spark-testing-base [spark_version]_1.0.0 and extend
Based on the information you have provided you should be using:
"com.holdenkarau" %% "spark-testing-base" % "2.4.7_1.1.0" % "test"

Cannot find ScalaCheckDrivenPropertyChecks

I upgraded my test dependencies for my play project and now I get this problem:
object scalacheck is not a member of package org.scalatestplus
[error] import org.scalatestplus.scalacheck.ScalaCheckDrivenPropertyChecks
These are my test dependencies:
"org.scalatest" %% "scalatest" % "3.2.3" % Test,
"org.scalamock" %% "scalamock" % "4.2.0" % "test",
"com.github.alexarchambault" %% "scalacheck-shapeless_1.14" % "1.2.1" % "test",
"org.scalatestplus.play" %% "scalatestplus-play" % "3.1.2" % "test, it",
Do I have some incompatible versioning going on here?
You don't have the right dependency:
"org.scalatestplus" %% "scalacheck-1-15" % "3.2.3.0"
Since you are using ScalaTest 3.2.3, you should use a 3.2.3.x version of the scalacheck-1-15 artifact, for compatibility.
Here is its maven location, and this is its github repo.
Scala 2.13 Code run at Scastie.
Scala 2.12 Code run at Scastie.
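As a hedged sketch (the spec and property here are made up purely for illustration), the dependency and import fit together like this:

// build.sbt
libraryDependencies += "org.scalatestplus" %% "scalacheck-1-15" % "3.2.3.0" % Test

// ReverseSpec.scala -- hypothetical example
import org.scalatest.matchers.should.Matchers
import org.scalatest.wordspec.AnyWordSpec
import org.scalatestplus.scalacheck.ScalaCheckDrivenPropertyChecks

class ReverseSpec extends AnyWordSpec with Matchers with ScalaCheckDrivenPropertyChecks {
  "List.reverse" should {
    "be its own inverse" in {
      forAll { (xs: List[Int]) =>
        xs.reverse.reverse shouldBe xs
      }
    }
  }
}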

SBT with Spark-Core 3.0.1 Exception NoClassDefFound for org/apache/log4j/Logger

I am using Log4j with Scala 2.12.12 and Spark Core 3.0.1, but when I change the library dependencies so that spark-core is not packaged in the fat JAR, I get the following error when I try to run it:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Logger
at com.some.package.name.Utils$.setup(Utils.scala:207)
at com.some.package.name.Main$.main(Main.scala:9)
at com.some.package.name.Main.main(Main.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 3 more
Compilation is successful, and if I remove the "provided" clause from the dependency line everything works fine. My build.sbt is as follows:
scalaVersion := "2.12.12"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "3.0.1" % "provided",
"org.apache.logging.log4j" % "log4j-api" % "2.13.3",
"org.apache.logging.log4j" % "log4j-core" % "2.13.3",
"org.scalatest" %% "scalatest" % "3.2.0" % "test",
"com.holdenkarau" %% "spark-testing-base" % "3.0.1_1.0.0" % Test)
If I remove the code that writes to the logger, the SparkContext setup is reported as the line where the error originates.
I am having a similar problem when trying to run my application on EMR 6.2.
I have found a comment on the Qubole GitHub project claiming AWS's spark-core JAR is missing org/apache/spark/internal/Logging$class, but I have no way to verify whether this is true.
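No resolution is recorded above; one thing worth checking (my assumption, not something stated in the thread) is that org.apache.log4j.Logger normally reaches the classpath only transitively through spark-core, which in Spark 3.0.x still ships log4j 1.x, so marking spark-core as "provided" removes it when the JAR is run outside spark-submit. A sketch of declaring it explicitly:

// Sketch only, under the assumption above; the log4j 1.x coordinates are standard,
// but whether this matches your runtime setup needs verifying.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.0.1" % "provided",
  "log4j" % "log4j" % "1.2.17"  // provides org.apache.log4j.Logger at runtime
)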

How should I log from my custom Spark JAR

Scala/JVM noob here that wants to understand more about logging, specifically when using Apache Spark.
I have written a library in Scala that depends upon a bunch of Spark libraries, here are my dependencies:
import sbt._

object Dependencies {
  object Version {
    val spark = "2.2.0"
    val scalaTest = "3.0.0"
  }

  val deps = Seq(
    "org.apache.spark" %% "spark-core" % Version.spark,
    "org.scalatest" %% "scalatest" % Version.scalaTest,
    "org.apache.spark" %% "spark-hive" % Version.spark,
    "org.apache.spark" %% "spark-sql" % Version.spark,
    "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
    "ch.qos.logback" % "logback-core" % "1.2.3",
    "ch.qos.logback" % "logback-classic" % "1.2.3",
    "com.typesafe.scala-logging" %% "scala-logging" % "3.8.0",
    "com.typesafe" % "config" % "1.3.2"
  )

  val exc = Seq(
    ExclusionRule("org.slf4j", "slf4j-log4j12")
  )
}
(admittedly I copied a lot of this from elsewhere).
I am able to package my code as a JAR using sbt package which I can then call from Spark by placing the JAR into ${SPARK_HOME}/jars. This is working great.
I now want to implement logging from my code so I do this:
import com.typesafe.scalalogging.Logger
/*
* stuff stuff stuff
*/
val logger : Logger = Logger("name")
logger.info("stuff")
However, when I try to call my library (which I'm doing from Python, not that I think that's relevant here) I get an error:
py4j.protocol.Py4JJavaError: An error occurred while calling z:com.company.package.class.function.
E : java.lang.NoClassDefFoundError: com/typesafe/scalalogging/Logger$
Clearly this is because the com.typesafe.scala-logging library is not in my JAR. I know I could solve this by packaging with sbt assembly, but I don't want to do that because it will include all the other dependencies and make my JAR enormous.
Is there a way to selectively include libraries (com.typesafe.scala-logging in this case) in my JAR? Alternatively, should I be attempting to log using another method, perhaps using a logger that is included with Spark?
Thanks to pasha701 in the comments, I attempted packaging my dependencies using sbt assembly rather than sbt package:
import sbt._

object Dependencies {
  object Version {
    val spark = "2.2.0"
    val scalaTest = "3.0.0"
  }

  val deps = Seq(
    "org.apache.spark" %% "spark-core" % Version.spark % Provided,
    "org.scalatest" %% "scalatest" % Version.scalaTest,
    "org.apache.spark" %% "spark-hive" % Version.spark % Provided,
    "org.apache.spark" %% "spark-sql" % Version.spark % Provided,
    "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
    "ch.qos.logback" % "logback-core" % "1.2.3",
    "ch.qos.logback" % "logback-classic" % "1.2.3",
    "com.typesafe.scala-logging" %% "scala-logging" % "3.8.0",
    "com.typesafe" % "config" % "1.3.2"
  )

  val exc = Seq(
    ExclusionRule("org.slf4j", "slf4j-log4j12")
  )
}
Unfortunately, even when specifying the Spark dependencies as Provided, my JAR went from 324K to 12M, so I opted to use println() instead. Here is my commit message:
log using println
I went with the println option because it keeps the size of the JAR small.
I trialled use of com.typesafe.scalalogging.Logger but my tests failed with error:
java.lang.NoClassDefFoundError: com/typesafe/scalalogging/Logger
because that isn't provided with Spark. I attempted to use sbt assembly
instead of sbt package but this caused the size of the JAR to go from
324K to 12M, even with spark dependencies set to Provided. A 12M JAR
isn't worth the trade-off just to use scalaLogging, hence using println
instead.
I note that pasha701 suggested using log4j instead, as that is provided with Spark, so I shall try that next. Any advice on using log4j from Scala when writing a Spark library would be much appreciated.
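A hedged sketch of that log4j route (my own illustration, not from the thread): Spark 2.x ships log4j 1.x on its classpath, so a library running inside Spark can log without packaging any logging dependency, and the provided spark-core dependency is enough to compile against it.

// Sketch only -- relies on log4j 1.x being on Spark's runtime classpath.
import org.apache.log4j.Logger

object MyLibrary {  // hypothetical object name
  private val log: Logger = Logger.getLogger(getClass.getName)

  def doStuff(): Unit = {
    log.info("stuff")  // handled by whatever appenders Spark's log4j.properties configures
  }
}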
As you said, 'sbt assembly' will include all the dependencies in your jar.
If you only want to include certain ones, there are two options:
1. Download logback-core and logback-classic and add them to the spark2-submit command with --jars (see the sketch below)
2. Specify the above deps in the spark2-submit --packages option
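A hedged sketch of what the two options might look like on the command line (the paths, class name and application JAR are made up for illustration; plain spark-submit is assumed in place of spark2-submit):

# Option 1: ship the logback jars explicitly (hypothetical local paths)
spark-submit \
  --jars /path/to/logback-core-1.2.3.jar,/path/to/logback-classic-1.2.3.jar \
  --class com.example.Main my-app.jar

# Option 2: let Spark resolve them from Maven coordinates
spark-submit \
  --packages ch.qos.logback:logback-core:1.2.3,ch.qos.logback:logback-classic:1.2.3 \
  --class com.example.Main my-app.jar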

Can't import kamon-play-26 using SBT

I updated my Play to 2.6.0. I have a Kamon dependency, but sbt can't resolve it.
Did anyone encounter this problem too?
Below is my libraryDependencies in the build.sbt:
libraryDependencies ++= Seq(
  ws,
  "com.google.inject" % "guice" % "3.0",
  "com.typesafe.play" %% "play-json" % "2.6.0",
  "io.kamon" %% "kamon-play-26" % "0.6.7"
)
But I get an error saying kamon-play-26 was not found...
Kamon for Play 2.6 is available for Scala 2.11 and 2.12 with:
"io.kamon" %% "kamon-play-2.6" % "0.6.8"
Note the period in 2.6.
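Applied to the build.sbt from the question, a hedged sketch (only the Kamon line changes; everything else is carried over as-is):

libraryDependencies ++= Seq(
  ws,
  "com.google.inject" % "guice" % "3.0",
  "com.typesafe.play" %% "play-json" % "2.6.0",
  "io.kamon" %% "kamon-play-2.6" % "0.6.8"  // note the period in 2.6
)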
Searching through the Kamon repositories in Maven reveals that there is no kamon-play-26 package.
The GitHub page https://github.com/kamon-io/kamon-play indicates that it does exist, however. Perhaps it's been pulled because the build is failing. Compile your own package from source, perhaps?