Exception in Play Framework 2.6 - scala

I was using Play Framework 2.5.
My Scala version is 2.11.8.
When I migrated to Play 2.6, I started getting the following error:
RuntimeException: java.lang.NoClassDefFoundError: play/api/libs/ws/WSRequest$class

WS was extracted into its own library, so you need to add this to your build.sbt:
libraryDependencies += ws
You can find more details in the official documentation: https://www.playframework.com/documentation/2.6.x/ScalaWS
And do not forget the migration guide:
https://www.playframework.com/documentation/2.6.x/WSMigration26
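For reference, a minimal build.sbt sketch for a Play 2.6 Scala project with the standalone WS client might look like the following (the project name is a placeholder; Play 2.6 also expects an explicit DI module such as guice):
lazy val root = (project in file(".")).enablePlugins(PlayScala)
name := "my-play-app" // hypothetical project name
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  guice, // dependency injection module, required since Play 2.6
  ws     // the WS client, now shipped as a separate library
)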

Related

getting NoClassDefFoundError using elastic4s

I'm using Play 2.6 and Scala 2.12 in my app, with Elasticsearch accessed through elastic4s.
In my build.sbt:
"com.sksamuel.elastic4s" %% "elastic4s-core" % "5.6.0",
"com.sksamuel.elastic4s" %% "elastic4s-http" % "5.6.0",
and I keep getting this exception in my log:
Exception in thread "I/O dispatcher 16" java.lang.NoClassDefFoundError: Could not initialize class com.sksamuel.elastic4s.http.JacksonSupport$
at com.sksamuel.elastic4s.http.ResponseHandler$.fromEntity(ResponseHandler.scala:47)
at com.sksamuel.elastic4s.http.DefaultResponseHandler.$anonfun$onResponse$1(ResponseHandler.scala:55)
at scala.util.Try$.apply(Try.scala:209)
at com.sksamuel.elastic4s.http.DefaultResponseHandler.onResponse(ResponseHandler.scala:55)
at com.sksamuel.elastic4s.http.HttpExecutable$RichRestClient$$anon$1.onSuccess(HttpExecutable.scala:27)
at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onSuccess(RestClient.java:597)
at org.elasticsearch.client.RestClient$1.completed(RestClient.java:352)
at org.elasticsearch.client.RestClient$1.completed(RestClient.java:343)
at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:119)
at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:177)
at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:436)
at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:326)
at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265)
at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81)
at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39)
at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:114)
at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276)
at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588)
at java.lang.Thread.run(Thread.java:748)
I'm trying to figure out what could cause this.
In another log I also saw:
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.8.8 requires Jackson Databind version >= 2.8.0 and < 2.9.0
so it looks like an issue with the Jackson version, but I'm not sure where to update it or where exactly it is failing...
Has anyone had this issue before?
Thanks!
Double-check that you are including elastic4s-jackson:
libraryDependencies += "com.sksamuel.elastic4s" %% "elastic4s-jackson" % "5.6.0"
I had the exact same issue with Play 2.8 and elastic4s 6.7.3 using the Jackson module too.
What helped me was running sbt's evicted task, looking at the dependencies each library pulled in, and figuring out why that particular class could not be loaded. Ultimately, upgrading to 6.7.8 resolved the issue.
Also, if you look at the build.sbt file of the elastic4s project, it lists all the versions it uses: https://github.com/sksamuel/elastic4s/blob/master/build.sbt
You can use jackson-module-scala, exclude the com.fasterxml.jackson artifacts pulled in by elastic4s-http, and verify that com.fasterxml.jackson is available on the classpath when the application is running.
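As a rough sketch of that approach in build.sbt (the 2.8.8 versions below are assumptions taken from the error message and should be matched against what evicted reports):
libraryDependencies ++= Seq(
  // keep elastic4s but drop its transitive Jackson Scala module
  "com.sksamuel.elastic4s" %% "elastic4s-http" % "5.6.0" excludeAll ExclusionRule(organization = "com.fasterxml.jackson.module"),
  // bring in jackson-module-scala at a version compatible with the jackson-databind on the classpath
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.8.8"
)
// alternatively, force a single jackson-databind version across the whole build
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.8"
Whether the exclusion or the override is the right fix depends on which side of the ">= 2.8.0 and < 2.9.0" constraint your build currently violates.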

Exception caught in Netty in play framework

I have created a new Play Framework app in Scala. If I run the application, it runs as expected and I am able to open the URL localhost:9000. But after I created a lib folder and added some jars (Spark jars) to it, when I try to run the application using the command play run, it shows the message:
[error] p.nettyException - Exception caught in Netty
java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
at akka.actor.ActorCell$.<init>(ActorCell.scala:305) ~[akka-actor_2.10.jar:2.2.0]
at akka.actor.ActorCell$.<clinit>(ActorCell.scala) ~[akka-actor_2.10.jar:2.2.0]
at akka.actor.RootActorPath.$div(ActorPath.scala:152) ~[akka-actor_2.10.jar:2.2.0]
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465) ~[akka-actor_2.10.jar:2.2.0]
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:453) ~[akka-actor_2.10.jar:2.2.0]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_91]
[error] p.nettyException - Exception caught in Netty
java.lang.NoClassDefFoundError: Could not initialize class play.api.libs.concurrent.Execution$
When I try to open the URL localhost:9000, it shows "localhost page is not working".
Spark version: 2.0
Scala version: 2.10 (I have also tested with Scala 2.11 and 2.12; the same error reproduces)
Play Framework version: 2.2.6
First, if you created a new Play! project, why use version 2.2? The newest one is 2.5 and that may already fix your issue.
Second, to ensure you don't have any other version conflicts I suggest you use managed dependencies instead of putting them manually into the lib folder.
For Apache Spark, you would add this line to your build.sbt file:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
as described in Spark's Quick Start Guide. You need to use Scala 2.10 or 2.11 (suggested) in your project.
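As a minimal sketch of that setup (after removing the Spark jars from the lib folder; the Scala patch version here is only an example):
scalaVersion := "2.11.8" // Spark 2.x is published for Scala 2.10 and 2.11, so the project must use one of those
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
With a managed dependency, sbt resolves a spark-core build that matches the project's Scala binary version, which avoids the kind of mismatched scala-library/akka jars behind the NoSuchMethodError above.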
If that doesn't resolve your issue, please post the contents of your build.sbt and the controller code. Thanks.

UnsupportedClassVersionError on play with JDK 1.7

I am getting the same error as in this post. I'm trying to resolve the problem as mentioned in the proposed solution, but I don't understand how.
If you are using version 2.4.x (or newer), you must use Java 8. From the Highlights of version 2.4:
Play 2.4 now requires JDK 8. Due to this, Play can, out of the box, provide support for Java 8 data types. For example, Play’s JSON APIs now support Java 8 temporal types including Instant, LocalDateTime and LocalDate.
To confirm that you are using Play 2.4, see file project/plugins.sbt.
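For example, the relevant line in project/plugins.sbt will look something like this (the exact patch version will differ from project to project):
// project/plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.3")
If the version starts with 2.4 or higher, JDK 8 is required.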
Edit:
If you can't (or don't want to) use Java 8, you have to use Play 2.3 instead. To do so, you must edit project/plugins.sbt to change the used version of Play:
// Notice we are now using version 2.3.10
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.10")
If this is a brand new project, you can recreate it using the 2.3 template instead:
activator new play-scala-2.3 name-of-your-project
Or, for Java:
activator new play-java-2.3 name-of-your-project

Error: value seq is not a member of object slick.dbio.DBIO

I am writing a web application in Play Framework. I decided to use Slick (FRM) to query my database (PostgreSQL). I am new to Slick, so I started following the official Slick documentation for release 3.0.0:
http://slick.typesafe.com/doc/3.0.0/gettingstarted.html
As per the documentation, I added the dependencies to my build.sbt file:
libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "3.0.0",
  "org.slf4j" % "slf4j-nop" % "1.6.4"
)
Everything else is working fine, but when I write the line below in my Scala IDE, it shows the error:
value seq is not a member of object slick.dbio.DBIO
val setup = DBIO.seq(
  // the line above is showing the error
  (suppliers.schema ++ coffees.schema).create,
  ...
)
In fact, Scala IDE (which is Eclipse-based) isn't detecting any member of object DBIO, although when I browse the Slick API of the same version (http://slick.typesafe.com/doc/3.0.0/api/#slick.dbio.DBIO$), I can see seq listed as a member of object DBIO.
What am I doing wrong?
As anticipated by retronym and Chris Scot, this problem was fixed with the release of Slick 3.1.
You need to import the API for the database you are using:
import slick.driver.PostgresDriver.api._
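As a minimal sketch (assuming the suppliers and coffees table queries are defined as in the Slick getting started guide, and assuming a hypothetical "mydb" entry in application.conf), with that import in scope DBIO.seq resolves as expected:
import slick.driver.PostgresDriver.api._
val db = Database.forConfig("mydb") // hypothetical config path
val setup = DBIO.seq(
  (suppliers.schema ++ coffees.schema).create
)
db.run(setup) // returns a Future[Unit]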
I'm not sure if you've got your answer (I'm answering for those who may stumble upon this in the future), but I resolved this by upgrading to 3.0.0-M1 and using Action.seq() instead of DBIO.seq().
This may also work in 3.0.0, but I'm not sure, as I upgraded from Slick 2.x to 3.0.0-M1!

configuring Play framework with Scala to use Neo4j graph db

I'm using Scala in my app and it runs without problems, but if I add Neo4j to the dependencies, it throws a NoSuchMethodError. I don't even have references to any Neo4j classes in my code...
I have no problem using Play Framework with Scala, or Play Framework (Java) with Neo4j; it only crashes when I use both together...
I tried both Neo4j 1.4.2 and 1.5.M02, to no avail.
thanks for any help~
Chris
dependencies.yml
# Application dependencies
require:
    - play
    - play -> scala 0.9.1
    - org.neo4j -> neo4j 1.4.2
exception details:
play.exceptions.UnexpectedException: Unexpected Error
at play.Invoker$Invocation.onException(Invoker.java:232)
at play.Invoker$Invocation.run(Invoker.java:273)
at Invocation.HTTP Request(Play!)
Caused by: java.lang.NoSuchMethodError: scala.collection.generic.GenericTraversableTemplate.flatten(Lscala/Function1;)Lscala/collection/Traversable;
at play.scalasupport.compiler.PlayScalaCompiler$.scanFiles(ScalaCompiler.scala:18)
at play.scalasupport.compiler.PlayScalaCompiler$$anonfun$scanFiles$1.apply(ScalaCompiler.scala:17)
at play.scalasupport.compiler.PlayScalaCompiler$$anonfun$scanFiles$1.apply(ScalaCompiler.scala:15)
at play.scalasupport.compiler.PlayScalaCompiler$.scanFiles(ScalaCompiler.scala:15)
at play.scalasupport.compiler.PlayScalaCompiler$$anonfun$scanFiles$1.apply(ScalaCompiler.scala:17)
at play.scalasupport.compiler.PlayScalaCompiler$$anonfun$scanFiles$1.apply(ScalaCompiler.scala:15)
at play.scalasupport.compiler.PlayScalaCompiler$.scanFiles(ScalaCompiler.scala:15)
at play.scalasupport.ScalaPlugin$$anonfun$templates$1.apply(ScalaPlugin.scala:178)
at play.scalasupport.ScalaPlugin$$anonfun$templates$1.apply(ScalaPlugin.scala:177)
at play.scalasupport.ScalaPlugin.templates(ScalaPlugin.scala:177)
at play.scalasupport.ScalaPlugin.update(ScalaPlugin.scala:195)
at play.scalasupport.ScalaPlugin.detectClassesChange(ScalaPlugin.scala:107)
at play.plugins.PluginCollection.detectClassesChange(PluginCollection.java:358)
at play.Play.detectChanges(Play.java:594)
at play.Invoker$Invocation.init(Invoker.java:186)
... 1 more
It turns out that scala-library-2.9.0-1.jar is included as a transitive dependency of neo4j...
Every time I run play dependencies it gets downloaded into the lib folder; after I delete it from the folder, everything works without problems (so far I only have code for starting and shutting down the DB).
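Rather than deleting the jar by hand after each resolution, it should be possible to exclude it in dependencies.yml. I believe Play 1.x supports transitive exclusions there, but treat the exact syntax below as an assumption to check against the Play dependency management docs:
# Application dependencies
require:
    - play
    - play -> scala 0.9.1
    - org.neo4j -> neo4j 1.4.2:
        exclude:
            - org.scala-lang -> *
This keeps Neo4j itself while preventing its bundled scala-library 2.9.0-1 from shadowing the Scala version that the Play Scala module 0.9.1 expects.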
The Cypher Query language depends on Scala. We will update that dependency to 2.9.1 for the 1.5 release.