spark test on local machine - scala

I am running unit tests on Spark 1.3.1 with sbt test and, besides the unit tests being incredibly slow, I keep running into java.lang.ClassNotFoundException: org.apache.spark.storage.RDDBlockId issues. Usually this means a dependency issue, but I can't tell where it would come from. I tried installing everything on a new machine, including a fresh Hadoop and a fresh ivy2 cache, but I still run into the same issue.
Any help is greatly appreciated.
Exception:
Exception in thread "Driver Heartbeater" java.lang.ClassNotFoundException:
org.apache.spark.storage.RDDBlockId
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
My build.sbt:
libraryDependencies ++= Seq(
"org.scalaz" %% "scalaz-core" % "7.1.2" excludeAll ExclusionRule(organization = "org.slf4j"),
"com.typesafe.play" %% "play-json" % "2.3.4" excludeAll ExclusionRule(organization = "org.slf4j"),
"org.apache.spark" %% "spark-core" % "1.3.1" % "provided" withSources() excludeAll (ExclusionRule(organization = "org.slf4j"), ExclusionRule("org.spark-project.akka", "akka-actor_2.10")),
"org.apache.spark" %% "spark-graphx" % "1.3.1" % "provided" withSources() excludeAll (ExclusionRule(organization = "org.slf4j"), ExclusionRule("org.spark-project.akka", "akka-actor_2.10")),
"org.apache.cassandra" % "cassandra-all" % "2.1.6",
"org.apache.cassandra" % "cassandra-thrift" % "2.1.6",
"com.typesafe.akka" %% "akka-actor" % "2.3.11",
"com.datastax.cassandra" % "cassandra-driver-core" % "2.1.6" withSources() withJavadoc() excludeAll (ExclusionRule(organization = "org.slf4j"),ExclusionRule(organization = "org.apache.spark"),ExclusionRule(organization = "com.twitter",name = "parquet-hadoop-bundle")),
"com.github.nscala-time" %% "nscala-time" % "1.2.0" excludeAll ExclusionRule(organization = "org.slf4j") withSources(),
"com.datastax.spark" %% "spark-cassandra-connector-embedded" % "1.3.0-M2" excludeAll (ExclusionRule(organization = "org.slf4j"),ExclusionRule(organization = "org.apache.spark"),ExclusionRule(organization = "com.twitter",name = "parquet-hadoop-bundle")),
"com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0-M2" excludeAll (ExclusionRule(organization = "org.slf4j"),ExclusionRule(organization = "org.apache.spark"),ExclusionRule(organization = "com.twitter",name = "parquet-hadoop-bundle")),
"org.slf4j" % "slf4j-api" % "1.6.1",
"com.twitter" % "jsr166e" % "1.1.0",
"org.slf4j" % "slf4j-nop" % "1.6.1" % "test",
"org.scalatest" %% "scalatest" % "2.2.1" % "test" excludeAll ExclusionRule(organization = "org.slf4j")
)
and my Spark test settings (all of which I have already tried disabling):
(spark.kryo.registrator,com.my.spark.MyRegistrator)
(spark.eventLog.dir,)
(spark.driver.memory,16G)
(spark.kryoserializer.buffer.mb,512)
(spark.akka.frameSize,5)
(spark.shuffle.spill,false)
(spark.default.parallelism,8)
(spark.shuffle.consolidateFiles,false)
(spark.serializer,org.apache.spark.serializer.KryoSerializer)
(spark.shuffle.spill.compress,false)
(spark.driver.host,10.10.68.66)
(spark.akka.timeout,300)
(spark.driver.port,55328)
(spark.eventLog.enabled,false)
(spark.cassandra.connection.host,127.0.0.1)
(spark.cassandra.connection.ssl.enabled,false)
(spark.master,local[8])
(spark.cassandra.connection.ssl.trustStore.password,password)
(spark.fileserver.uri,http://10.10.68.66:55329)
(spark.cassandra.auth.username,username)
(spark.local.dir,/tmp/spark)
(spark.app.id,local-1436229075894)
(spark.storage.blockManagerHeartBeatMs,300000)
(spark.executor.id,<driver>)
(spark.storage.memoryFraction,0.5)
(spark.app.name,Count all entries 217885402)
(spark.shuffle.compress,false)
An assembled or packaged jar submitted to a standalone or Mesos cluster works fine! Any suggestions?

We ran into the same issue in Spark 1.6.0 (there is already a bug report for it).
We fixed it by switching to the Kryo serializer (which you should be using anyway).
So it appears to be a bug in the default JavaSerializer.
Simply do the following to get rid of it:
new SparkConf().setAppName("Simple Application").set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
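For reference, here is a minimal sketch of a local-mode test setup with the Kryo serializer enabled; the app name is a placeholder, and the commented-out registrator line reuses the com.my.spark.MyRegistrator class from the settings above:
import org.apache.spark.{SparkConf, SparkContext}

// Minimal local-mode SparkContext for unit tests, with Kryo as the serializer.
val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("unit-test")                 // placeholder name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // .set("spark.kryo.registrator", "com.my.spark.MyRegistrator")  // optional

val sc = new SparkContext(conf)
try {
  assert(sc.parallelize(1 to 100).count() == 100)   // trivial smoke test
} finally {
  sc.stop()                                          // always release the context
}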

The cause turned out to be a large broadcast variable. I'm unsure why (it fit in memory), but removing it from the test cases made them work.

Related

NoClassDefFoundError: kafka/api/OffsetRequest for Storm jar

I am trying to submit a Storm topology to the cluster, but I keep getting the same error:
Exception in thread "main" java.lang.NoClassDefFoundError: kafka/api/OffsetRequest
at org.apache.storm.kafka.KafkaConfig.<init>(KafkaConfig.java:48)
at org.apache.storm.kafka.trident.TridentKafkaConfig.<init>(TridentKafkaConfig.java:30)
at storm.StormStreaming$.main(StormStreaming.scala:41)
at storm.StormStreaming.main(StormStreaming.scala)
Caused by: java.lang.ClassNotFoundException: kafka.api.OffsetRequest
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 4 more
I submit the jar file using
./storm jar /patht/storm-app.jar storm.StormStreaming
How can this be fixed? I tried aligning the Kafka and Storm dependency versions, but it does not seem to help.
My build.sbt file:
scalaVersion := "2.12.8"
val sparkVersion = "2.4.4"
val flinkVersion = "1.9.1"
val stormVersion = "2.1.0"
val kafkaVersion = "2.4.0"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.9.6"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.6"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.6"
libraryDependencies ++= Seq(
"org.apache.kafka" %% "kafka" % kafkaVersion excludeAll(
ExclusionRule("org.slf4j", "slf4j-log4j12"),
ExclusionRule("log4j", "log4j"),
ExclusionRule("org.apache.zookeeper", "zookeeper")
),
"org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion,
"org.apache.spark" %% "spark-core" % sparkVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.spark" %% "spark-streaming" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
"com.typesafe" % "config" % "1.3.3",
"org.twitter4j" % "twitter4j-core" % "4.0.7",
"org.twitter4j" % "twitter4j-stream" % "4.0.7",
"org.apache.flink" % "flink-core" % flinkVersion % "provided",
"org.apache.flink" %% "flink-streaming-scala" % flinkVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.flink" %% "flink-scala" % flinkVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.flink" %% "flink-clients" % flinkVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.flink" %% "flink-connector-kafka" % flinkVersion,
"org.apache.flink" %% "flink-runtime-web" % flinkVersion,
"org.apache.flink" % "flink-avro-confluent-registry" % flinkVersion,
"org.apache.storm" % "storm-core" % stormVersion % "provided" excludeAll(
ExclusionRule("org.slf4j", "slf4j-log4j12"),
ExclusionRule("org.slf4j", "log4j-over-slf4j")
),
"org.apache.storm" % "storm-kafka-client" % stormVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.storm" % "storm-sql-core" % stormVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.storm" % "storm-sql-runtime" % stormVersion excludeAll("org.slf4j", "slf4j-log4j12"),
"org.apache.storm" % "storm-kafka" % "1.2.3" excludeAll("org.slf4j", "slf4j-log4j12")
)
You are using the wrong Kafka jar: depend on org.apache.kafka:kafka-clients instead of org.apache.kafka:kafka_2.xx, which is the Kafka server-side (broker) jar.
The dependency on kafka/api/OffsetRequest comes from storm-kafka, which should not be used anymore: it relies on an old Kafka client API that is no longer present in current Kafka releases. Use storm-kafka-client instead.
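As a rough sketch (the version vals mirror the ones in the question), the relevant part of the dependency list would become:
// Sketch: swap the broker jar for the client jar and drop the legacy storm-kafka module.
val stormVersion = "2.1.0"
val kafkaVersion = "2.4.0"

libraryDependencies ++= Seq(
  // Kafka client API only; "kafka_2.xx" is the broker/server jar and is not needed here.
  "org.apache.kafka" % "kafka-clients" % kafkaVersion,
  // Storm's maintained Kafka integration, built against the new client API.
  "org.apache.storm" % "storm-kafka-client" % stormVersion,
  "org.apache.storm" % "storm-core" % stormVersion % "provided"
)
// Remove "org.apache.storm" % "storm-kafka" % "1.2.3" entirely: it uses the old
// kafka.api.* classes (such as kafka.api.OffsetRequest) that no longer exist in Kafka 2.x.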

How to convert RDD[some case class] to csv file using scala?

I have an RDD[some case class] and I want to convert it to a CSV file. I am using Spark 1.6 and Scala 2.10.5.
stationDetails.toDF.coalesce(1).write.format("com.databricks.spark.csv").save("data/myData.csv")
gives error
Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.csv. Please find packages at http://spark-packages.org
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:219)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:139)
I have not been able to add the dependency for "com.databricks.spark.csv" to my build.sbt file.
The dependencies I have added so far are:
libraryDependencies ++= Seq(
"org.apache.commons" % "commons-csv" % "1.1",
"com.univocity" % "univocity-parsers" % "1.5.1",
"org.slf4j" % "slf4j-api" % "1.7.5" % "provided",
"org.scalatest" %% "scalatest" % "2.2.1" % "test",
"com.novocode" % "junit-interface" % "0.9" % "test"
)
I also tried this
stationDetails.toDF.coalesce(1).write.csv("data/myData.csv")
but it gives the error that csv cannot be resolved.
Please change your build.sbt to the following:
libraryDependencies ++= Seq(
"org.apache.commons" % "commons-csv" % "1.1",
"com.databricks" %% "spark-csv" % "1.4.0",
"com.univocity" % "univocity-parsers" % "1.5.1",
"org.slf4j" % "slf4j-api" % "1.7.5" % "provided",
"org.scalatest" %% "scalatest" % "2.2.1" % "test",
"com.novocode" % "junit-interface" % "0.9" % "test"
)
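With the spark-csv package on the classpath, the original write call should then work. A minimal sketch, assuming a SQLContext named sqlContext is already in scope (note that the shorthand write.csv(...) only exists from Spark 2.0 onwards, which is why it cannot be resolved on Spark 1.6):
// Sketch: write the RDD[case class] out as CSV via the Databricks data source on Spark 1.6.
import sqlContext.implicits._              // assumption: an existing SQLContext called sqlContext

stationDetails.toDF()
  .coalesce(1)                             // single output part file
  .write
  .format("com.databricks.spark.csv")
  .option("header", "true")                // optional header row
  .save("data/myData.csv")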

Akka Version mismatch in Class path Error

I am using sbt to set up Akka Persistence, but it is failing with this error:
Detected java.lang.NoSuchMethodError error, which MAY be caused by incompatible Akka versions on the classpath. Please note that a given Akka version MUST be the same across all modules of Akka that you are using, e.g. if you use akka-actor [2.5.6 (resolved from current classpath)] all other core Akka modules MUST be of the same version. External projects like Alpakka, Persistence plugins or Akka HTTP etc. have their own version numbers - please make sure you're using a compatible set of libraries.
Uncaught error from thread [TopsActivitiesSystem-akka.actor.default-dispatcher-16]: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for for ActorSystem[TopsActivitiesSystem]
Detected java.lang.NoSuchMethodError error, which MAY be caused by incompatible Akka versions on the classpath. Please note that a given Akka version MUST be the same across all modules of Akka that you are using, e.g. if you use akka-actor [2.5.6 (resolved from current classpath)] all other core Akka modules MUST be of the same version. External projects like Alpakka, Persistence plugins or Akka HTTP etc. have their own version numbers - please make sure you're using a compatible set of libraries.
Uncaught error from thread [TopsActivitiesSystem-akka.actor.default-dispatcher-3]: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for for ActorSystem[TopsActivitiesSystem]
java.lang.NoSuchMethodError: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V
My Configuration SBT Build file is :
scalaVersion := "2.12.4"
lazy val root = (project in file("."))
.configs(IntegrationTest)
def akkaVersion = "2.5.6"
def akkaHttpVersion = "10.0.10"
def logbackVersion = "1.2.3"
def ItAndTest = "it, test"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-stream" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion,
"com.typesafe.akka" % "akka-cluster-metrics_2.12" % akkaVersion,
"com.redmart.fc" %% "fc-capacity-model" % "1.7.0",
"com.redmart.akka" %% "akka-downing" % "1.1.0",
"com.jsuereth" %% "scala-arm" % "2.0",
"com.typesafe" % "config" % "1.3.1",
"com.typesafe.scala-logging" %% "scala-logging" % "3.7.2",
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"ch.qos.logback" % "logback-classic" % logbackVersion,
"ch.qos.logback" % "logback-access" % logbackVersion,
"net.logstash.logback" % "logstash-logback-encoder" % "4.11",
"joda-time" % "joda-time" % "2.9.3",
"com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
"com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpVersion,
"com.typesafe.akka" %% "akka-persistence-cassandra" % "0.58",
// "com.typesafe.akka" %% "akka-cluster" % akkaVersion,
// "com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion,
"com.lightbend.akka" %% "akka-management-cluster-http" % "0.4",
// "org.iq80.leveldb" % "leveldb" % "0.9",
// "org.fusesource.leveldbjni" % "leveldbjni-all" % "1.8",
"io.spray" %% "spray-json" % "1.3.3",
"com.enragedginger" %% "akka-quartz-scheduler" % "1.6.1-akka-2.5.x",
// "com.softwaremill.macwire" %% "macros" % "2.2.2" % "provided",
// "com.softwaremill.macwire" %% "util" % "2.2.2",
"com.esotericsoftware" % "kryo" % "4.0.1",
"com.github.romix.akka" %% "akka-kryo-serialization" % "0.5.2",
"com.typesafe.akka" %% "akka-http-testkit" % akkaHttpVersion % ItAndTest,
"org.scalatest" %% "scalatest" % "3.0.4" % ItAndTest,
"org.mockito" % "mockito-core" % "2.10.0" % ItAndTest,
"com.github.dnvriend" %% "akka-persistence-inmemory" % "2.5.1.1" % ItAndTest
)
dependencyOverrides += "com.typesafe.akka" %% "akka-actor" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-stream" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-cluster" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-persistence" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-slf4j" % akkaVersion
dependencyOverrides += "com.typesafe.akka" % "akka-cluster-metrics_2.12" % akkaVersion
Can anyone explain why it's failing? I am sure I am overriding everything to the latest version, which is 2.5.6, using dependencyOverrides. I am using sbt 0.13.
Use the current version (0.5) of akka-management-cluster-http:
"com.lightbend.akka" %% "akka-management-cluster-http" % "0.5"
Version 0.4 depends on an older version of Akka.
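For context, a sketch of the relevant fragment of the build after that change, with everything else left as in the question:
// Sketch: keep all core Akka modules on one version and use the newer
// akka-management-cluster-http, which works with Akka 2.5.x.
def akkaVersion = "2.5.6"

libraryDependencies ++= Seq(
  "com.typesafe.akka"  %% "akka-actor"                   % akkaVersion,
  "com.typesafe.akka"  %% "akka-persistence-cassandra"   % "0.58",
  "com.lightbend.akka" %% "akka-management-cluster-http" % "0.5"
  // ... remaining dependencies unchanged from the question
)
If the error persists, sbt's built-in evicted task shows which module is still pulling in an older Akka artifact.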

play.api.UnexpectedException: Unexpected exception[NoClassDefFoundError: slick/backend/DatabaseConfig]

I'm trying to set up a project with Play Framework and Postgres using slick-codegen. My project compiles without errors, and the generated Tables class is correct, but when I run the project it gives me play.api.UnexpectedException: Unexpected exception[NoClassDefFoundError: slick/backend/DatabaseConfig].
This is the full trace:
play.api.UnexpectedException: Unexpected exception[NoClassDefFoundError: slick/backend/DatabaseConfig]
at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1$$anonfun$apply$1$$anonfun$1.apply(DevServerStart.scala:184)
at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1$$anonfun$apply$1$$anonfun$1.apply(DevServerStart.scala:131)
at scala.Option.map(Option.scala:146)
at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1$$anonfun$apply$1.apply(DevServerStart.scala:131)
at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1$$anonfun$apply$1.apply(DevServerStart.scala:129)
at scala.util.Success.flatMap(Try.scala:231)
at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1.apply(DevServerStart.scala:129)
at play.core.server.DevServerStart$$anonfun$mainDev$1$$anon$1$$anonfun$get$1.apply(DevServerStart.scala:121)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
Caused by: java.lang.NoClassDefFoundError: slick/backend/DatabaseConfig
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.getDeclaredMethods(Class.java:1975)
at com.google.inject.spi.InjectionPoint.getInjectionPoints(InjectionPoint.java:688)
at com.google.inject.spi.InjectionPoint.forInstanceMethodsAndFields(InjectionPoint.java:380)
at com.google.inject.internal.ConstructorBindingImpl.getInternalDependencies(ConstructorBindingImpl.java:165)
at com.google.inject.internal.InjectorImpl.getInternalDependencies(InjectorImpl.java:616)
at com.google.inject.internal.InjectorImpl.cleanup(InjectorImpl.java:572)
at com.google.inject.internal.InjectorImpl.initializeJitBinding(InjectorImpl.java:558)
at com.google.inject.internal.InjectorImpl.createJustInTimeBinding(InjectorImpl.java:887)
Caused by: java.lang.ClassNotFoundException: slick.backend.DatabaseConfig
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.getDeclaredMethods(Class.java:1975)
at com.google.inject.spi.InjectionPoint.getInjectionPoints(InjectionPoint.java:688)
at com.google.inject.spi.InjectionPoint.forInstanceMethodsAndFields(InjectionPoint.java:380)
at com.google.inject.internal.ConstructorBindingImpl.getInternalDependencies(ConstructorBindingImpl.java:165)
at com.google.inject.internal.InjectorImpl.getInternalDependencies(InjectorImpl.java:616)
I have my project split into two subprojects: the Play project and the codegen project that generates the Tables class.
My build.sbt looks like this:
import sbt.Keys._
val slickVersion = "3.2.0"
scalaVersion := "2.11.8"
// The Play project itself
lazy val root = (project in file("."))
.settings(
libraryDependencies ++= List(
"com.netaporter" %% "scala-uri" % "0.4.14",
"net.codingwell" %% "scala-guice" % "4.1.0",
"com.typesafe.play" %% "play-slick" % "2.0.2",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.2"
)
)
.settings(sharedSettings)
.enablePlugins(Common, PlayScala)
.settings(
name := """Play app"""
)
.dependsOn(codegen)
lazy val codegen = project
.settings(sharedSettings)
.settings(
libraryDependencies ++= List(
"com.typesafe.slick" %% "slick-codegen" % slickVersion
)
)
lazy val sharedSettings = Seq(
scalaVersion := "2.11.8",
scalacOptions := Seq("-feature", "-unchecked", "-deprecation"),
libraryDependencies ++= List(
"com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2",
"org.slf4j" % "slf4j-api" % "1.7.23",
"org.slf4j" % "log4j-over-slf4j" % "1.7.23", // for any java classes looking for this
"ch.qos.logback" % "logback-classic" % "1.2.1",
"com.typesafe.slick" %% "slick" % slickVersion,
"org.postgresql" % "postgresql" % "9.4.1212",
"com.github.tminglei" %% "slick-pg" % "0.15.0-RC",
"com.github.tminglei" %% "slick-pg_play-json" % "0.15.0-RC",
"com.github.tminglei" %% "slick-pg_joda-time" % "0.15.0-RC",
"com.github.tminglei" %% "slick-pg_jts" % "0.15.0-RC",
"joda-time" % "joda-time" % "2.9.7",
"org.joda" % "joda-convert" % "1.8"
)
)
I also tried removing all the code that uses the database and the Tables class, but I still get this error.
I use a custom Postgres driver, and it lives in the codegen project, but since I add dependsOn(codegen) to the root project, that should not be a problem. It also compiles without problems with sbt compile.
EDIT:
After some testing, I changed the Slick version to 3.1.1 and the com.github.tminglei versions to 0.14.3, and it now works for me.
I had the same classpath problem and in the end used the following versions in a Play 2.5.x / Scala 2.11 project:
libraryDependencies ++= List(
"com.typesafe.play" %% "play-slick" % "2.1.0",
"com.typesafe.play" %% "play-slick-evolutions" % "2.1.0",
"com.typesafe.slick" %% "slick" % "3.2.0",
"com.github.tototoshi" %% "slick-joda-mapper" % "2.3.0"
)
slick.backend.DatabaseConfig comes from Slick 3.1.x, which is what play-slick 2.0.2 needs; in Slick 3.2 that class moved to the slick.basic package, so code compiled against the old location fails at runtime. Since you want Slick 3.2.0, you need play-slick 2.1.0; see https://github.com/playframework/play-slick and https://github.com/tototoshi/slick-joda-mapper.
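As a rough sketch, aligned versions for the root project could look like this (the slick-pg dependencies are carried over from the question; check their own notes for a Slick 3.2-compatible release):
// Sketch: Slick 3.2.x requires play-slick 2.1.x; keep the versions in lockstep.
val slickVersion = "3.2.0"

libraryDependencies ++= List(
  "com.typesafe.play"  %% "play-slick"            % "2.1.0",
  "com.typesafe.play"  %% "play-slick-evolutions" % "2.1.0",
  "com.typesafe.slick" %% "slick"                 % slickVersion,
  "com.typesafe.slick" %% "slick-codegen"         % slickVersion
)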

java.lang.NoSuchMethodError: play.api.Logger$.init(Ljava/io/File;Lscala/Enumeration$Value;)V

I am new to Scala and sbt. After a lot of investigation I have come up with the build.sbt below. I am not able to find a solution for this; any help to get me unblocked would be greatly appreciated.
scalaVersion := "2.11.7"
scalaVersion in ThisBuild := "2.11.7"
val sparkVersion = "1.3.0"
val akkaVersion = "2.3.6"
libraryDependencies ++= Seq(
jdbc,
cache,
"com.typesafe.play" % "play_2.11" % "2.5.0-M2",
"org.slf4j" % "jcl-over-slf4j" % "1.7.18",
"org.slf4j" % "slf4j-simple" % "1.6.2",
"com.typesafe.akka" % "akka-actor_2.11" % "2.4.2",
"com.typesafe.akka" % "akka-slf4j_2.11" % "2.4.2",
"org.webjars" % "webjars-play_2.11" % "2.4.0-2",
"com.typesafe.play" % "play-ws_2.11" % "2.5.0-M2",
"org.webjars" % "bootstrap" % "3.2.0",
"org.webjars" % "html5shiv" % "3.7.0",
"org.webjars" % "respond" % "1.4.2",
"com.twitter" %% "algebird-core" % "0.9.0", // Twitter Algebird
"net.databinder.dispatch" % "dispatch-core_2.11" % "0.11.3",
"org.reactivemongo" % "play2-reactivemongo_2.11" % "0.11.10.play23",
"org.specs2" %% "specs2-core" % "3.6.5" % "test",
"org.specs2" %% "specs2-junit" % "3.6.5" % "test",
"org.specs2" %% "specs2-mock" % "3.6.5" % "test",
"org.mongodb" %% "casbah" % "2.8.1", // MongoDB Casbah
"com.sksamuel.elastic4s" %% "elastic4s" % "1.4.14"
)
Compilation is fine, but I am getting a runtime error and I am not sure where it comes from.
Error:
java.lang.NoSuchMethodError: play.api.Logger$.init(Ljava/io/File;Lscala/Enumeration$Value;)V
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:88)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:61)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:60)
at play.core.server.DevServerStart$.mainDevHttpMode(DevServerStart.scala:50)
at play.core.server.DevServerStart.mainDevHttpMode(DevServerStart.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at play.runsupport.Reloader$.startDevMode(Reloader.scala:223)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.devModeServer$lzycompute$1(PlayRun.scala:74)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.play$sbt$run$PlayRun$$anonfun$$anonfun$$anonfun$$devModeServer$1(PlayRun.scala:74)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:100)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:53)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) java.lang.reflect.InvocationTargetException
[error] Total time: 1 s, completed 7 Mar, 2016 8:05:59 PM
You cannot mix and match different Play versions.
ReactiveMongo 0.11.10.play23 actually needs Play 2.3, while you are trying to use the second milestone of the Play 2.5 development line.
The latest official Play version is currently 2.4.6, so don't use a development milestone (M2) of Play 2.5, which is not yet finalized.
There is a matching ReactiveMongo version for Play 2.4; use that.
Furthermore, when you want to develop a Play application, you should just use the (Typesafe) Lightbend "Activator", which generates a proper project template using the Play sbt plugin (with proper template and routes compilation support, et cetera).
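As a rough sketch, a version-aligned fragment of the dependency list could look like the following. The ReactiveMongo coordinate is an assumption (pick the artifact explicitly published for Play 2.4 from Maven Central), and keep in mind that the Play version itself is driven by the sbt plugin in project/plugins.sbt, which must match as well:
// Sketch: keep every Play-ecosystem artifact on the same Play major version (2.4.x here).
libraryDependencies ++= Seq(
  jdbc,
  cache,
  "com.typesafe.play" %% "play-ws"             % "2.4.6",
  "org.webjars"       %  "webjars-play_2.11"   % "2.4.0-2",
  // Assumption: the ReactiveMongo build published for Play 2.4 (".play24" suffix).
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.10.play24"
)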