JSON4S does not serialize internal case class members - scala

I have a case class inheriting from a trait:
trait Thing {
  val name: String
  val created: DateTime = DateTime.now
}
case class Door(override val name: String) extends Thing
This is akka-http, and I'm trying to return JSON in response to a GET request:
...
~
path("get" / Segment) { id =>
  get {
    onComplete(doorsManager ? ThingsManager.Get(id)) {
      case Success(d: Door) => {
        complete(200, d)
      }
      case Success(_) => {
        complete(404, s"door $id not found")
      }
      case Failure(reason) => complete(500, reason)
    }
  }
} ~
...
but I only get JSON containing name. I do have the implicit Joda serializers in scope.
If I override the 'created' timestamp in the constructor of the case class, it does get serialized, but that defeats the purpose, as I don't need (or want) the user to provide the timestamp. I've tried moving the timestamp into Door (either as an override or by skipping the trait entirely) and the result is the same (that is, no 'created').
How do I tell JSON4S to serialize internal members (and inherited ones) too?

You have to define a custom format.
import org.json4s.{FieldSerializer, DefaultFormats}
import org.json4s.native.Serialization.write
import org.joda.time.DateTime

trait Thing {
  val name: String
  val created: DateTime = DateTime.now
}
case class Door(override val name: String) extends Thing

implicit val formats = DefaultFormats + FieldSerializer[Door with Thing]()

val obj = new Door("dooor")
write(obj)
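If the Joda serializers aren't already folded into that formats value, combining them with the FieldSerializer (a sketch assuming the json4s-ext module is on the classpath) should get both name and created into the output:
import org.json4s.ext.JodaTimeSerializers

implicit val formats = DefaultFormats ++ JodaTimeSerializers.all + FieldSerializer[Door with Thing]()

write(new Door("dooor"))
// expected to contain both fields, e.g. {"name":"dooor","created":...}
// (the exact date rendering depends on the date formatter in use)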

Related

spray-json Cannot find JsonWriter or JsonFormat type class for Class

I still get the same error even though I have defined the marshaller (and imported it); it appears that the case class is not in scope when the function is polymorphic, and this throws a "Cannot find JsonWriter or JsonFormat type class" error for the case class. Is there a reason why spray-json cannot find the implicit marshaller for the case class, even when it is defined? Is the case class in scope in this situation? Link to marshaller
import spray.json._
import queue.MedusaJsonProtocol._

object MysqlDb {
  ...
}

case class UserDbEntry(
  id: Int,
  username: String,
  countryId: Int,
  created: LocalDateTime
)

trait MysqlDb {
  implicit lazy val pool = MysqlDb.pool
}

trait HydraMapperT extends MysqlDb {
  val FetchAllSql: String
  def fetchAll(currentDate: String): Future[List[HydraDbRow]]
  def getJson[T](row: T): String
}

object UserHydraDbMapper extends HydraMapperT {
  override val FetchAllSql = "SELECT * FROM user WHERE created >= ?"

  override def fetchAll(currentDate: String): Future[List[UserDbEntry]] = {
    pool.sendPreparedStatement(FetchAllSql, Array(currentDate)).map { queryResult =>
      queryResult.rows match {
        case Some(rows) =>
          rows.toList map (x => rowToModel(x))
        case None => List()
      }
    }
  }

  override def getJson[UserDbEntry](row: UserDbEntry): String = {
    HydraQueueMessage(
      tableType = HydraTableName.UserTable,
      payload = row.toJson.toString()
    ).toJson.toString()
  }

  private def rowToModel(row: RowData): UserDbEntry = {
    UserDbEntry(
      id = row("id").asInstanceOf[Int],
      username = row("username").asInstanceOf[String],
      countryId = row("country_id").asInstanceOf[Int],
      created = row("created").asInstanceOf[LocalDateTime]
    )
  }
}
The error is raised at payload = row.toJson.toString(): Can't find marshaller for UserDbEntry.
You have defined UserDbEntry locally and there is no JSON marshaller for that type. Add the following:
implicit val userDbEntryFormat = Json.format[UserDbEntry]
I'm not sure how you can call row.toJson given UserDbEntry is a local case class. There must be a macro in there somewhere, but it's fairly clear that it's not in scope for the local UserDbEntry.
Edit
Now that I see your Gist, it looks like you have a package dependency problem. As designed, it'll be circular. You have defined the JSON marshaller in package com.at.medusa.core.queue, which imports UserDbEntry, which depends on package com.at.medusa.core.queue for marshalling.
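If you stay with spray-json rather than Play's Json.format, breaking the cycle usually means declaring the format in a module that does not import the queue package. A minimal sketch, assuming the LocalDateTime here is Joda-Time's (which the async MySQL driver returns); the object name and the date format are made up for illustration:
import spray.json._
import org.joda.time.LocalDateTime

// hypothetical protocol object; keep it in a package that does not import com.at.medusa.core.queue
object UserDbEntryJsonProtocol extends DefaultJsonProtocol {

  // spray-json has no built-in LocalDateTime support, so supply one
  implicit object LocalDateTimeFormat extends JsonFormat[LocalDateTime] {
    def write(dt: LocalDateTime): JsValue = JsString(dt.toString) // ISO-8601 string
    def read(json: JsValue): LocalDateTime = json match {
      case JsString(s) => LocalDateTime.parse(s)
      case other       => deserializationError(s"Expected an ISO date string, got $other")
    }
  }

  implicit val userDbEntryFormat = jsonFormat4(UserDbEntry)
}

// then, at the call site:
// import UserDbEntryJsonProtocol._
// row.toJson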

Multiple slick `column`s for the same DB column break projection

I'm new to Slick, so I'm not sure whether the problem is caused by incorrect usage of implicits or by Slick not allowing what I'm trying to do.
In short, I use the slick-pg extension for JSONB support in Postgres. I also use spray-json to deserialize JSONB fields into case classes.
To automagically convert columns into objects, I wrote the generic implicit JsonColumnType that you can see below. It allows any case class for which I have defined a JSON formatter to be mapped to a jsonb field.
On the other hand, I also want a JsValue-typed alias for the same column so that I can use the JSONB operators.
import com.github.tminglei.slickpg._
import com.github.tminglei.slickpg.json.PgJsonExtensions
import org.bson.types.ObjectId
import slick.ast.BaseTypedType
import slick.jdbc.JdbcType
import spray.json.{JsValue, RootJsonWriter, RootJsonReader}

import scala.reflect.ClassTag

trait MyPostgresDriver extends ExPostgresDriver with PgArraySupport with PgDate2Support with PgRangeSupport with PgHStoreSupport with PgSprayJsonSupport with PgJsonExtensions with PgSearchSupport with PgNetSupport with PgLTreeSupport {
  override def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"

  override val api = MyAPI

  private val plainAPI = new API with SprayJsonPlainImplicits

  object MyAPI extends API with DateTimeImplicits with JsonImplicits with NetImplicits with LTreeImplicits with RangeImplicits with HStoreImplicits with SearchImplicits with SearchAssistants { //with ArrayImplicits
    implicit val ObjectIdColumnType = MappedColumnType.base[ObjectId, Array[Byte]](
      { obj => obj.toByteArray }, { arr => new ObjectId(arr) }
    )

    implicit def JsonColumnType[T: ClassTag](implicit reader: RootJsonReader[T], writer: RootJsonWriter[T]) = {
      val columnType: JdbcType[T] with BaseTypedType[T] = MappedColumnType.base[T, JsValue]({ obj => writer.write(obj) }, { json => reader.read(json) })
      columnType
    }
  }
}

object MyPostgresDriver extends MyPostgresDriver
Here is how my table is defined (minimized version)
case class Article(id: ObjectId, ids: Ids)
case class Ids(doi: Option[String], pmid: Option[Long])

class ArticleRow(tag: Tag) extends Table[Article](tag, "articles") {
  def id = column[ObjectId]("id", O.PrimaryKey)
  def idsJson = column[JsValue]("ext_ids")
  def ids = column[Ids]("ext_ids")

  private val fromTuple: ((ObjectId, Ids)) => Article = {
    case (id, ids) => Article(id, ids)
  }
  private val toTuple = (v: Article) => Option((v.id, v.ids))

  def * = ProvenShape.proveShapeOf((id, ids) <> (fromTuple, toTuple))(MappedProjection.mappedProjectionShape)
}

private val articles = TableQuery[ArticleRow]
Finally, I have a function that looks up articles by the value of a JSON field:
def getArticleByDoi(doi: String): Future[Article] = {
  val query = (for (a <- articles if (a.idsJson +>> "doi").asColumnOf[String] === doi) yield a).take(1).result
  slickDb.run(query).map { items =>
    items.headOption.getOrElse(throw new RuntimeException(s"Article with doi $doi is not found"))
  }
}
Sadly, I get the following exception at runtime:
java.lang.ClassCastException: spray.json.JsObject cannot be cast to server.models.db.Ids
The problem is in SpecializedJdbcResultConverter.base, where ti.getValue is called with the wrong ti. It should be slick.driver.JdbcTypesComponent$MappedJdbcType, but instead it's com.github.tminglei.slickpg.utils.PgCommonJdbcTypes$GenericJdbcType. As a result, the wrong type is passed into my tuple converter.
What makes Slick choose a different type for the column even though the projection is defined explicitly in the table row class?
A sample project that demonstrates the issue is here.
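One possible workaround, a sketch under the assumption that a spray-json format for Ids is available (this is not part of the original post), is to declare ext_ids only once, as JsValue, and do the conversion inside the * projection, so Slick resolves a single column type for that column:
import spray.json._
import DefaultJsonProtocol._
import MyPostgresDriver.api._

// assumed: a spray-json format for Ids
implicit val idsFormat: RootJsonFormat[Ids] = jsonFormat2(Ids)

class ArticleRowSingleColumn(tag: Tag) extends Table[Article](tag, "articles") {
  def id = column[ObjectId]("id", O.PrimaryKey)
  def idsJson = column[JsValue]("ext_ids") // the only definition of ext_ids

  // convert JsValue <-> Ids in the projection instead of declaring a second column[Ids]
  def * = (id, idsJson) <> (
    { t: (ObjectId, JsValue) => Article(t._1, t._2.convertTo[Ids]) },
    { a: Article => Option((a.id, a.ids.toJson)) }
  )
}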

Serializing/Deserializing case objects with Gson in Scala

I am using Gson to serialize and deserialize objects and saving the results in Redis; i.e., an object is serialized into a JSON string and put in Redis, and when the object is retrieved it comes back as a string, which I then deserialize into an object with Gson.fromJson(str, className).
I am a beginner with Scala, so I assume my usage is incorrect.
I have the following class:
case class Status(id: String, state: State)
where State is the following:
sealed trait State {}
case object COMPLETED_SUCCESSFULLY extends State {}
case object FINISHED_POLLING extends State {}
case object CURRENTLY_DOWNLOADING extends State {}
case object FINISHED_DOWNLOADING extends State {}
case object CURRENTLY_UPLOADING extends State {}
case object FINISHED_UPLOADING extends State {}
I want to serialize Status into a json string then deserialize it back into an object.
But, when I serialize Status using Gson, I get:
"{\"id\":\"foo\",\"state\":{}}"
Why is that?
For example:
val status = new Status("foo", COMPLETED_SUCCESSFULLY)
I expect the serialized output to be
"{\"id\":\"foo\",\"state\":\"COMPLETED_SUCCESSFULLY\"}"
By default, case objects are serialized by Gson to empty JSON objects: {}. You have to write a custom serializer to get the expected behaviour:
object StateSerializer extends JsonSerializer[State] {
  override def serialize(t1: State, t2: Type, jsonSerializationContext: JsonSerializationContext): JsonElement = {
    val res = new JsonObject()
    res.add("name", new JsonPrimitive(t1.toString))
    res
  }
}

val gson = new GsonBuilder().registerTypeHierarchyAdapter(classOf[State], StateSerializer)
  .registerTypeHierarchyAdapter(classOf[State], StateDeserializer).setPrettyPrinting().create()
println(gson.toJson(COMPLETED_SUCCESSFULLY))
Will print:
{
  "name": "COMPLETED_SUCCESSFULLY"
}
Also, if you want to transform JSON back into a case object, you have to implement a JsonDeserializer:
object StateDeserializer extends JsonDeserializer[State] {
  override def deserialize(json: JsonElement, typeOfT: Type, context: JsonDeserializationContext): State = {
    val res = json match {
      case o: JsonObject if o.has("name") && o.entrySet().size() == 1 =>
        val name = o.get("name").getAsString
        name match {
          case "FINISHED_POLLING" => FINISHED_POLLING
          case "FINISHED_DOWNLOADING" => FINISHED_DOWNLOADING
          case "FINISHED_UPLOADING" => FINISHED_UPLOADING
          case "CURRENTLY_DOWNLOADING" => CURRENTLY_DOWNLOADING
          case "CURRENTLY_UPLOADING" => CURRENTLY_UPLOADING
          case "COMPLETED_SUCCESSFULLY" => COMPLETED_SUCCESSFULLY
          case _ => null
        }
      case _ => null
    }
    Option(res).getOrElse(throw new JsonParseException(s"$json can't be parsed to State"))
  }
}
println(gson.fromJson("{\"name\": \"COMPLETED_SUCCESSFULLY\"}", classOf[State]))
Will print:
COMPLETED_SUCCESSFULLY
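Note that with the hierarchy adapter above, serializing the wrapping Status yields the nested form {"id":"foo","state":{"name":"COMPLETED_SUCCESSFULLY"}} rather than the flat string the question asked for. If the flat shape is preferred, a variant pair of adapters along the same lines (a sketch, not part of the original answer) can emit and read a bare JSON string:
import java.lang.reflect.Type
import com.google.gson._

object FlatStateSerializer extends JsonSerializer[State] {
  // emit the case object's name as a plain JSON string
  override def serialize(src: State, typeOfSrc: Type, ctx: JsonSerializationContext): JsonElement =
    new JsonPrimitive(src.toString)
}

object FlatStateDeserializer extends JsonDeserializer[State] {
  // map the string back onto the matching case object; unknown names fail fast
  override def deserialize(json: JsonElement, typeOfT: Type, ctx: JsonDeserializationContext): State =
    json.getAsString match {
      case "FINISHED_POLLING"       => FINISHED_POLLING
      case "FINISHED_DOWNLOADING"   => FINISHED_DOWNLOADING
      case "FINISHED_UPLOADING"     => FINISHED_UPLOADING
      case "CURRENTLY_DOWNLOADING"  => CURRENTLY_DOWNLOADING
      case "CURRENTLY_UPLOADING"    => CURRENTLY_UPLOADING
      case "COMPLETED_SUCCESSFULLY" => COMPLETED_SUCCESSFULLY
      case other                    => throw new JsonParseException(s"$other can't be parsed to State")
    }
}

val flatGson = new GsonBuilder()
  .registerTypeHierarchyAdapter(classOf[State], FlatStateSerializer)
  .registerTypeHierarchyAdapter(classOf[State], FlatStateDeserializer)
  .create()

println(flatGson.toJson(Status("foo", COMPLETED_SUCCESSFULLY)))
// expected: {"id":"foo","state":"COMPLETED_SUCCESSFULLY"}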

required: spray.httpx.marshalling.ToResponseMarshallable Error

Hey, I'm pretty new to Spray and ReactiveMongo.
I'm trying to return a list of results as JSON, but I'm having some issues converting the results to a list of JSON objects.
This is my model:
import reactivemongo.bson.BSONDocumentReader
import reactivemongo.bson.BSONObjectID
import reactivemongo.bson.Macros

case class Post(id: BSONObjectID, likes: Long, message: String, object_id: String, shares: Long)

object Post {
  implicit val reader: BSONDocumentReader[Post] = Macros.reader[Post]
}
The Mongo method:
def getAll(): Future[List[Post]] = {
  val query = BSONDocument(
    "likes" -> BSONDocument(
      "$gt" -> 27))
  collection.find(query).cursor[Post].collect[List]()
}
And this is the route:
val route1 =
  path("posts") {
    val res: Future[List[Post]] = mongoService.getAll()
    onComplete(res) {
      case Success(value) => complete(value)
      case Failure(ex) => complete(ex.getMessage)
    }
  }
The error:
type mismatch; found : List[com.example.model.Post] required: spray.httpx.marshalling.ToResponseMarshallable
thanks,
miki
You'll need to define how a Post will be serialized, which you can do via a spray-json protocol (see the docs for more detailed information). It's quite easy to do so, but before that, you'll also need to define a format for the BSONObjectID type, since there's no built-in support for that type in spray-json (alternatively, if object_id is a string representation of the BSONObjectID, think about removing the id property from your Post class or changing it to a String):
// necessary imports
import spray.json._
import spray.httpx.SprayJsonSupport._

implicit object BSONObjectIdProtocol extends RootJsonFormat[BSONObjectID] {
  override def write(obj: BSONObjectID): JsValue = JsString(obj.stringify)
  override def read(json: JsValue): BSONObjectID = json match {
    case JsString(id) => BSONObjectID.parse(id) match {
      case Success(validId) => validId
      case _ => deserializationError("Invalid BSON Object Id")
    }
    case _ => deserializationError("BSON Object Id expected")
  }
}
Now, we're able to define the actual protocol for the Post class:
object PostJsonProtocol extends DefaultJsonProtocol {
  implicit val format = jsonFormat5(Post.apply)
}
Furthermore, we'll also need to make sure that we have the defined format in scope:
import PostJsonProtocol._
Now, everything will compile as expected.
One more thing: have a look at the docs about the DSL structure of spray. Your mongoService.getAll() isn't within a complete block, which might not reflect your intentions. This isn't an issue yet, but it probably will be once your route gets more complex. To fix it, simply put the future into the onComplete call or make it lazy:
val route1 =
  path("posts") {
    onComplete(mongoService.getAll()) {
      case Success(value) => complete(value)
      case Failure(ex) => complete(ex.getMessage)
    }
  }

How to represent optional fields in spray-json?

I have an optional field on my requests:
case class SearchRequest(url: String, nextAt: Option[Date])
My protocol is:
object SearchRequestJsonProtocol extends DefaultJsonProtocol {
  implicit val searchRequestFormat = jsonFormat(SearchRequest, "url", "nextAt")
}
How do I mark the nextAt field optional, such that the following JSON objects will be correctly read and accepted:
{"url":"..."}
{"url":"...", "nextAt":null}
{"url":"...", "nextAt":"2012-05-30T15:23Z"}
I actually don't really care about the null case, but if you have details, it would be nice. I'm using spray-json, and was under the impression that using an Option would skip the field if it was absent on the original JSON object.
Works for me (spray-json 1.1.1 scala 2.9.1 build)
import cc.spray.json._
import cc.spray.json.DefaultJsonProtocol._

// string instead of date for simplicity
case class SearchRequest(url: String, nextAt: Option[String])

// btw, you could use the jsonFormat2 method here
implicit val searchRequestFormat = jsonFormat(SearchRequest, "url", "nextAt")

assert {
  List(
    """{"url":"..."}""",
    """{"url":"...", "nextAt":null}""",
    """{"url":"...", "nextAt":"2012-05-30T15:23Z"}""")
    .map(_.asJson.convertTo[SearchRequest]) == List(
      SearchRequest("...", None),
      SearchRequest("...", None),
      SearchRequest("...", Some("2012-05-30T15:23Z")))
}
You might have to create an explicit format (warning: pseudocode-ish):
object SearchRequestJsonProtocol extends DefaultJsonProtocol {
  implicit object SearchRequestJsonFormat extends JsonFormat[SearchRequest] {
    def read(value: JsValue) = value match {
      case JsObject(List(
        JsField("url", JsString(url)),
        JsField("nextAt", JsString(nextAt)))) =>
        SearchRequest(url, Some(new Instant(nextAt)))
      case JsObject(List(JsField("url", JsString(url)))) =>
        SearchRequest(url, None)
      case _ =>
        throw new DeserializationException("SearchRequest expected")
    }
    def write(obj: SearchRequest) = obj.nextAt match {
      case Some(nextAt) =>
        JsObject(JsField("url", JsString(obj.url)),
          JsField("nextAt", JsString(nextAt.toString)))
      case None => JsObject(JsField("url", JsString(obj.url)))
    }
  }
}
Use NullOptions trait to disable skipping nulls:
https://github.com/spray/spray-json#nulloptions
Example:
https://github.com/spray/spray-json/blob/master/src/test/scala/spray/json/ProductFormatsSpec.scala
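A minimal sketch of the NullOptions approach, assuming the newer spray.json package name and the String-based SearchRequest from the first answer:
import spray.json._

// Mixing NullOptions into the protocol makes a None field render as null instead of being omitted;
// reading still maps both missing and null values to None.
object SearchRequestJsonProtocol extends DefaultJsonProtocol with NullOptions {
  implicit val searchRequestFormat = jsonFormat2(SearchRequest)
}

// SearchRequest("...", None).toJson  ==>  {"url":"...","nextAt":null}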
I don't know if this will help you, but you can give that field a default value in the case class definition, so if the field is not in the JSON, the default value will be assigned to it.
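The syntax is just a default value on the constructor parameter (shown here with the String-based variant used earlier); note, though, that whether a jsonFormat-derived reader actually consults the default depends on the spray-json version, since a missing Option field is normally read as None regardless:
case class SearchRequest(url: String, nextAt: Option[String] = None)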
Easy.
import cc.spray.json._
import org.scalatest.FunSuite

trait MyJsonProtocol extends DefaultJsonProtocol {
  implicit val searchFormat = new JsonWriter[SearchRequest] {
    def write(r: SearchRequest): JsValue = {
      JsObject(
        "url" -> JsString(r.url),
        // nextAt's element type needs a JsonFormat in scope (e.g. the String-based SearchRequest above)
        "next_at" -> r.nextAt.toJson
      )
    }
  }
}

class JsonTest extends FunSuite with MyJsonProtocol {
  test("JSON") {
    val search = new SearchRequest("www.site.ru", None)
    val marshalled = search.toJson
    println(marshalled)
  }
}
For anyone who is chancing upon this post and wants an update to François Beausoleil's answer for newer versions of Spray (circa 2015+?), JsField is deprecated as a public member of JsValue; you should simply supply a list of tuples instead of JsFields. Their answer is spot-on, though.