Trouble with Play Framework JSON serialization from a case class (Scala)

Play 2.2.1, Scala 2.10
// PersonModel.scala
case class PersonModel(name: String, age: Long)

object PersonModel2 {
  implicit object PersonModelFormat extends Format[PersonModel] {
    def reads(json: JsValue): PersonModel = PersonModel(
      (json \ "name").as[String],
      (json \ "age").as[Long])
    def writes(u: PersonModel): JsValue = JsObject(List(
      "name" -> JsString(u.name),
      "age" -> JsNumber(u.age)))
  }
}
sbt says
[error] PersonModel.scala:15: overriding method reads in trait Reads of type (json: play.api.libs.json.JsValue)play.api.libs.json.JsResult[models.PersonModel];
[error] method reads has incompatible type
[error] def reads(json: JsValue): PersonModel = PersonModel(
[error] ^

In this case, since you're not doing fancy things with the output json (like changing the key names in the resulting json object), I'd go for:
case class PersonModel(name: String, age: Long)
import play.api.libs.json._
implicit val personModelFormat: Format[PersonModel] = Json.format[PersonModel]
This way, you can, for example
scala> val j = PersonModel("julien", 35)
j: PersonModel = PersonModel(julien,35)
scala> println(Json.toJson(j))
{"name":"julien","age":35}
More info can be found in the Play JSON documentation.
HTH,
Julien

Things have changed in recent versions, I'd say for the better. In 2.2.x, you'd do it this way, using the new functional syntax and combinators:
import play.api.libs.json._
import play.api.libs.functional.syntax._
implicit val PersonModelFormat: Format[PersonModel] = (
  (__ \ "name").format[String] and
  (__ \ "age").format[Long]
)(PersonModel.apply, unlift(PersonModel.unapply))
Much shorter!
The documentation for 2.2.x (http://www.playframework.com/documentation/2.2.1/ScalaJsonCombinators) explains the rationale for the change.
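One advantage of the combinator form worth adding: the JSON keys no longer have to match the field names. A sketch with made-up key names (same imports as above):
implicit val personModelFormat: Format[PersonModel] = (
  (__ \ "fullName").format[String] and
  (__ \ "ageYears").format[Long]
)(PersonModel.apply, unlift(PersonModel.unapply))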

For one-off usage there is an inline solution:
Json.writes[PersonModel].writes(personModelInstance) // returns JsObject
From the documentation:
macro-compiler replaces Json.writes[User] by injecting into compile chain the exact code you would write yourself
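The reads counterpart is symmetric (a small sketch reusing PersonModel from above):
Json.reads[PersonModel].reads(Json.parse("""{"name":"julien","age":35}""")) // JsSuccess(PersonModel(julien,35),)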

Related

Could not find implicit value for parameter writer error, yet I defined the handler using the macro

I have the following:
Account.scala
package modules.accounts

import java.time.Instant
import reactivemongo.api.bson._

case class Account(id: String, name: String)

object Account {
  type ID = String

  implicit val accountHandler: BSONDocumentHandler[Account] = Macros.handler[Account]
  // implicit def accountWriter: BSONDocumentWriter[Account] = Macros.writer[Account]
  // implicit def accountReader: BSONDocumentReader[Account] = Macros.reader[Account]
}
AccountRepo.scala
package modules.accounts

import java.time.Instant
import reactivemongo.api.collections.bson.BSONCollection
import scala.concurrent.ExecutionContext

final class AccountRepo(
    val coll: BSONCollection
)(implicit ec: ExecutionContext) {
  import Account.{ accountHandler, ID }

  def insertTest() = {
    val doc = Account(s"account123", "accountName") //, Instant.now)
    coll.insert.one(doc)
  }
}
The error I am getting is:
could not find implicit value for parameter writer: AccountRepo.this.coll.pack.Writer[modules.accounts.Account]
[error] coll.insert.one(doc)
From what I understand, the implicit handler generated by the macro should be enough to provide the Writer. What am I doing wrong?
Reference: http://reactivemongo.org/releases/1.0/documentation/bson/typeclasses.html
The code is mixing incompatible versions.
The macro-generated handler uses the new BSON API, as can be seen from the import reactivemongo.api.bson._, whereas the collection comes from the old driver: it uses reactivemongo.api.collections.bson.BSONCollection instead of reactivemongo.api.bson.collection.BSONCollection.
It's recommended to have a look at the documentation and not to mix incompatible versions of related libraries.
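A minimal sketch of the repository against the new API (the same code as in the question, with only the collection import changed):
import scala.concurrent.ExecutionContext
import reactivemongo.api.bson.collection.BSONCollection

final class AccountRepo(
    val coll: BSONCollection // new-API collection type
)(implicit ec: ExecutionContext) {
  import Account.accountHandler // the BSONDocumentHandler doubles as the required Writer

  def insertTest() = {
    val doc = Account("account123", "accountName")
    coll.insert.one(doc) // now resolves coll.pack.Writer[Account] from the handler
  }
}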

How to infer StructType schema for Spark Scala at run time given a Fully Qualified Name of a case class

For a few days I've been wondering whether it is possible to infer a schema for Spark in Scala for a given case class that is unknown at compile time.
The only input is a string containing the FQN of the class (which could be used, for example, to create an instance of the case class at runtime via reflection).
I was wondering whether it is possible to do something like:
package com.my.namespace
case class MyCaseClass (name: String, num: Int)
//Somewhere else in codebase
// coming from external configuration file, so unknown at compile time
val fqn = "com.my.namespace.MyCaseClass"
val schema = Encoders.product[getXYZ(fqn)].schema
Of course, any other technique that does not use Encoders is fine (e.g. building a StructType by analysing an instance of the case class; is that even possible?).
What is the best approach? Is it feasible?
You can use the reflective toolbox:
package com.my.namespace

import org.apache.spark.sql.types.StructType
import scala.reflect.runtime
import scala.tools.reflect.ToolBox

case class MyCaseClass(name: String, num: Int)

object Main extends App {
  val fqn = "com.my.namespace.MyCaseClass"
  val runtimeMirror = runtime.currentMirror
  val toolbox = runtimeMirror.mkToolBox()
  val res = toolbox.eval(toolbox.parse(s"""
    import org.apache.spark.sql.Encoders
    Encoders.product[$fqn].schema
  """)).asInstanceOf[StructType]
  println(res) // StructType(StructField(name,StringType,true),StructField(num,IntegerType,false))
}
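The inferred schema can then be used like any hand-written StructType, for example to read untyped data (the SparkSession named spark and the file data.json are assumptions for this sketch):
// Apply the runtime-inferred schema when reading a DataFrame.
val df = spark.read.schema(res).json("data.json")
df.printSchema()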

Scala Play readNullable can't read Map types

I am trying to use play.api.libs.json to convert JSON to my object. But then this happened...
case class Foo(foo: Option[Map[String, String]])
case class Bar(bar: String, foo: Foo)

def barJsonToModel(foobarJson: JsValue): Bar = {
  implicit val fooReads: Reads[Foo] = (
    (JsPath \ "foo").readNullable[Map[String, String]]
  )(Foo.apply _)
}
Expression of type Reads[Option[Map[String,String]]] doesn't conform to expected type Reads[Foo]
You are using the functional syntax of play-json, but the required import is missing:
import play.api.libs.functional.syntax._
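As a side note (my addition, not part of the original answer): for a single-field case class you can also skip the functional syntax entirely and map the inner Reads:
import play.api.libs.json._

// Single path, so no combinators needed; just map into the case class.
implicit val fooReads: Reads[Foo] =
  (JsPath \ "foo").readNullable[Map[String, String]].map(Foo.apply)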

using datetime/timestamp in scala slick

Is there an easy way to use datetime/timestamp in Scala? What's best practice? I currently use "date" to persist data, but I'd also like to persist the current time.
I'm struggling to set the date. This is my code:
val now = new java.sql.Timestamp(new java.util.Date().getTime)
I also tried to do this:
val now = new java.sql.Date(new java.util.Date().getTime)
When changing the datatype in my evolutions to "timestamp", I got an error:
case class MyObjectModel(
  id: Option[Int],
  title: String,
  createdat: Timestamp,
  updatedat: Timestamp,
  ...)

object MyObjectModel {
  implicit val myObjectFormat = Json.format[MyObjectModel]
}
Console:
app\models\MyObjectModel.scala:31: No implicit format for
java.sql.Timestamp available.
[error] implicit val myObjectFormat = Json.format[MyObjectModel]
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
Update:
object ProcessStepTemplatesModel {
  implicit lazy val timestampFormat: Format[Timestamp] = new Format[Timestamp] {
    override def reads(json: JsValue): JsResult[Timestamp] =
      json.validate[Long].map(l => Timestamp.from(Instant.ofEpochMilli(l)))
    override def writes(o: Timestamp): JsValue = JsNumber(o.getTime)
  }

  implicit val processStepFormat = Json.format[ProcessStepTemplatesModel]
}
Try using this in your code:
import java.sql.Timestamp
import java.text.SimpleDateFormat
import play.api.libs.json._

implicit object timestampFormat extends Format[Timestamp] {
  // ISO 8601 with millisecond precision, e.g. "2018-01-06T18:31:29.436Z"
  val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")
  def reads(json: JsValue) = {
    val str = json.as[String]
    JsSuccess(new Timestamp(format.parse(str).getTime))
  }
  def writes(ts: Timestamp) = JsString(format.format(ts))
}
It is (de)serialized in a JS-compatible format like "2018-01-06T18:31:29.436Z".
Please note: the implicit object must be declared in the code before it is used.
I guess your question is handled in "What's the standard way to work with dates and times in Scala? Should I use Java types or there are native Scala alternatives?".
Go with Java 8 "java.time".
In the subject you mention Slick (the Scala database library), but the error you got comes from a JSON library: it says that you don't have a converter from java.sql.Timestamp to JSON. Without knowing which JSON library you are using, it's hard to give a working example.
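Following the java.time advice, one possible sketch (it assumes play-json 2.6+, where a built-in Format[Instant] is available; that version requirement is my assumption):
import java.sql.Timestamp
import java.time.Instant
import play.api.libs.json._
import play.api.libs.functional.syntax._

// Reuse the built-in Format[Instant] and convert at the edges.
implicit val timestampFormat: Format[Timestamp] =
  implicitly[Format[Instant]].inmap(Timestamp.from, _.toInstant)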

Op-Rabbit with Spray-Json in Akka Http

I am trying to use the library Op-Rabbit to consume a RabbitMQ queue in an Akka-Http project.
I want to use Spray-Json for the marshalling/unmarshalling.
import com.spingo.op_rabbit.SprayJsonSupport._
import com.spingo.op_rabbit.stream.RabbitSource
import com.spingo.op_rabbit.{Directives, RabbitControl}

object Boot extends App with Config with BootedCore with ApiService {
  this: ApiService with Core =>

  implicit val materializer = ActorMaterializer()
  Http().bindAndHandle(routes, httpInterface, httpPort)
  log.info("Http Server started")

  implicit val rabbitControl = system.actorOf(Props[RabbitControl])

  import Directives._
  RabbitSource(
    rabbitControl,
    channel(qos = 3),
    consume(queue(
      "such-queue",
      durable = true,
      exclusive = false,
      autoDelete = false)),
    body(as[User])).
    runForeach { user =>
      log.info(user)
    } // after each successful iteration the message is acknowledged
}
In a separate file:
case class User(id: Long, name: String)

object JsonFormat extends DefaultJsonProtocol {
  implicit val format = jsonFormat2(User)
}
The error I am getting is:
could not find implicit value for parameter um: akka.http.scaladsl.unmarshalling.FromRequestUnmarshaller[*.*.models.User]
[error] body(as[User])). // marshalling is automatically hooked up using implicits
[error] ^
[error]could not find implicit value for parameter um: com.spingo.op_rabbit.RabbitUnmarshaller[*.*.models.User]
[error] body(as[User])
[error] ^
[error] two errors found
I'm not sure how to get the op-rabbit spray-json support working properly.
Thanks for any help.
Try to provide an implicit marshaller for your User class, like they do for Int (in RabbitTestHelpers.scala):
implicit val simpleIntMarshaller = new RabbitMarshaller[Int] with RabbitUnmarshaller[Int] {
  val contentType = "text/plain"
  val contentEncoding = Some("UTF-8")

  def marshall(value: Int) =
    value.toString.getBytes

  def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) = {
    new String(value).toInt
  }
}
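Adapting that pattern to User with the spray-json format from the question might look like the sketch below (my adaptation, not code from the op-rabbit project):
import com.spingo.op_rabbit.{RabbitMarshaller, RabbitUnmarshaller}
import spray.json._
import JsonFormat._ // brings the implicit jsonFormat2(User) into scope

implicit val userMarshaller = new RabbitMarshaller[User] with RabbitUnmarshaller[User] {
  val contentType = "application/json"
  val contentEncoding = Some("UTF-8")

  def marshall(value: User) =
    value.toJson.compactPrint.getBytes("UTF-8")

  def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) =
    new String(value, charset.getOrElse("UTF-8")).parseJson.convertTo[User]
}
Alternatively, since SprayJsonSupport derives marshallers from an implicit spray-json format in scope, simply importing JsonFormat._ at the call site of body(as[User]) may already be enough.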