Using spray-json, how can I serialize the following class?
case class Vegetable(name: String, color: String, seller: ISeller)
Here ISeller is a Java interface. I am new to spray-json and not sure how this can be serialized and deserialized.
I tried this, but it gives a runtime error:
implicit val VegetableFormat = jsonFormat3(Vegetable)
Any pointers here would be great.
You need to define a way to convert to/from JSON for your ISeller object.
In addition to the code you supplied, you need to define a formatter for ISeller, like the following:
import spray.json._

implicit object ISellerJsonFormat extends RootJsonFormat[ISeller] {
  // Always writes the seller as JSON null
  def write(c: ISeller) = JsNull
  // Always yields a null reference on read
  def read(value: JsValue) = null
}
The snippet above just ignores ISeller, so vegetable.toJson would produce:
{"name":"Onion","color":"red","seller":null}
If you want to read/write something more meaningful, you can implement more complex logic. See the "Providing JsonFormats for other Types" section in https://github.com/spray/spray-json.
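For example, here is a rough sketch of a round-tripping format. It assumes a hypothetical concrete implementation SimpleSeller and that ISeller exposes a getName method; adjust both to your actual interface:

import spray.json._

// Hypothetical: assumes ISeller declares getName(): String
case class SimpleSeller(name: String) extends ISeller {
  override def getName: String = name
}

implicit object ISellerNameFormat extends RootJsonFormat[ISeller] {
  // Serialize a seller as {"name": "..."}
  def write(s: ISeller): JsValue = JsObject("name" -> JsString(s.getName))

  // Deserialize back into the simple implementation above
  def read(value: JsValue): ISeller = value.asJsObject.getFields("name") match {
    case Seq(JsString(name)) => SimpleSeller(name)
    case _                   => deserializationError("Expected a seller object with a name field")
  }
}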
No instance of play.api.libs.json.Format is available for akka.actor.typed.ActorRef[org.knoldus.eventSourcing.UserState.Confirmation] in the implicit scope (Hint: if declared in the same file, make sure it's declared before)
[error] implicit val userCommand: Format[AddUserCommand] = Json.format
I am getting this error even though I have defined an implicit instance of Format for AddUserCommand.
Here is my code:
trait UserCommand extends CommandSerializable

object AddUserCommand {
  implicit val format: Format[AddUserCommand] = Json.format[AddUserCommand]
}

final case class AddUserCommand(user: User, reply: ActorRef[Confirmation]) extends UserCommand
Can anyone please help me with this error and how to solve it?
As Gael noted, you need to provide a Format for ActorRef[Confirmation]. The complication is that the natural serialization, using the ActorRefResolver, requires that an ExtendedActorSystem be present, which means that the usual approach of defining a Format in a companion object won't quite work.
Note that because of the way Lagom does dependency injection, this approach doesn't really work in Lagom: commands in Lagom basically can't use Play JSON.
import akka.actor.ExtendedActorSystem
import akka.actor.typed.{ ActorRef, ActorRefResolver }
import akka.actor.typed.scaladsl.adapter.ClassicActorSystemOps
import play.api.libs.json._

class PlayJsonActorRefFormat(system: ExtendedActorSystem) {
  def reads[A] = new Reads[ActorRef[A]] {
    def reads(jsv: JsValue): JsResult[ActorRef[A]] =
      jsv match {
        case JsString(s) => JsSuccess(ActorRefResolver(system.toTyped).resolveActorRef(s))
        case _ => JsError(Seq(JsPath() -> Seq(JsonValidationError(Seq("ActorRefs are strings")))))
      }
  }

  def writes[A] = new Writes[ActorRef[A]] {
    def writes(a: ActorRef[A]): JsValue = JsString(ActorRefResolver(system.toTyped).toSerializationFormat(a))
  }

  def format[A] = Format[ActorRef[A]](reads, writes)
}
You can then define a format for AddUserCommand as
object AddUserCommand {
  def format(arf: PlayJsonActorRefFormat): Format[AddUserCommand] = {
    implicit def arfmt[A]: Format[ActorRef[A]] = arf.format
    Json.format[AddUserCommand]
  }
}
Since you're presumably using JSON to serialize the messages sent around a cluster (otherwise, the ActorRef shouldn't be leaking out like this), you would then construct an instance of the format in your Akka Serializer implementation.
(NB: I've only done this with Circe, not Play JSON, but the basic approach is common)
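For illustration, a rough sketch of what that serializer wiring might look like with Play JSON; the class name UserCommandSerializer and the identifier value are made-up placeholders, not part of the original answer:

import akka.actor.ExtendedActorSystem
import akka.serialization.SerializerWithStringManifest
import play.api.libs.json.{ Format, Json }

// Hypothetical serializer that Akka constructs with the ExtendedActorSystem,
// which is what the ActorRef format needs.
class UserCommandSerializer(system: ExtendedActorSystem) extends SerializerWithStringManifest {
  private val actorRefFormat = new PlayJsonActorRefFormat(system)
  private implicit val addUserFormat: Format[AddUserCommand] = AddUserCommand.format(actorRefFormat)

  override def identifier: Int = 9001 // pick a unique, stable id for your project
  override def manifest(o: AnyRef): String = o.getClass.getName

  override def toBinary(o: AnyRef): Array[Byte] = o match {
    case cmd: AddUserCommand => Json.toBytes(Json.toJson(cmd))
    case other               => throw new IllegalArgumentException(s"Cannot serialize $other")
  }

  override def fromBinary(bytes: Array[Byte], manifest: String): AnyRef =
    Json.parse(bytes).as[AddUserCommand]
}

You would still need to bind this serializer to your command types in the Akka serialization configuration.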
The error says that it cannot construct a Format for AddUserCommand because there's no Format for ActorRef[Confirmation].
When using Json.format[X], all the members of the case class X must have a Format defined.
In your case, you probably don't want to define a formatter for this case class (serializing an ActorRef doesn't make much sense) but rather build another case class with data only.
Edit: See Levi's answer on how to provide a formatter for ActorRef if you really want to send out there the actor reference.
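For illustration, a minimal sketch of the data-only approach mentioned above; AddUserData is a made-up name, and an implicit Format[User] is assumed to exist:

import play.api.libs.json.{ Format, Json }

// Hypothetical data-only message: no ActorRef inside, so Json.format can derive
// everything (assuming an implicit Format[User] is already in scope).
final case class AddUserData(user: User)

object AddUserData {
  implicit val format: Format[AddUserData] = Json.format[AddUserData]
}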
I was reading about Cats and encountered the following code snippet, which is about serializing objects to JSON.
It starts with a trait like this:
trait JsonWriter[A] {
  def write(value: A): Json
}
After this, there are some instances for our domain objects:
final case class Person(name: String, email: String)
object JsonWriterInstances {
  implicit val stringWriter: JsonWriter[String] =
    new JsonWriter[String] {
      def write(value: String): Json =
        JsString(value)
    }

  implicit val personWriter: JsonWriter[Person] =
    new JsonWriter[Person] {
      def write(value: Person): Json =
        JsObject(Map(
          "name" -> JsString(value.name),
          "email" -> JsString(value.email)
        ))
    }

  // etc...
}
So far so good! I can then use this like this:
import JsonWriterInstances._
Json.toJson(Person("Dave", "dave@example.com"))
Later on I come across something called the interface syntax, which uses extension methods to extend existing types with interface methods like below:
object JsonSyntax {
  implicit class JsonWriterOps[A](value: A) {
    def toJson(implicit w: JsonWriter[A]): Json =
      w.write(value)
  }
}
This then simplifies the call to serializing a Person as:
import JsonWriterInstances._
import JsonSyntax._
Person("Dave", "dave#example.com").toJson
What I don't understand is how the Person is wrapped in JsonWriterOps so that I can call toJson directly, as though toJson were defined on the Person case class itself. I like this magic, but I fail to understand this last step about JsonWriterOps. So what is the idea behind this interface syntax, and how does it work? Any help?
This is actually a standard Scala feature: since JsonWriterOps is marked implicit and is in scope, the compiler can apply it at compile time when needed.
Hence scalac will do the following transformations:
Person("Dave", "dave@example.com").toJson
new JsonWriterOps(Person("Dave", "dave@example.com")).toJson
new JsonWriterOps[Person](Person("Dave", "dave@example.com")).toJson
Side note:
It's much more efficient to define implicit classes as value classes, like this:
implicit class JsonWriterOps[A](value: A) extends AnyVal
This makes the compiler also optimize away the new object construction, if possible, compiling the whole implicit conversion + method call to a simple function call.
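For reference, a sketch of what the full value-class version of the syntax object above might look like (note that the wrapped constructor parameter must be a val for AnyVal to apply):

object JsonSyntax {
  // Same extension method as before, but the wrapper now extends AnyVal,
  // so the JsonWriterOps instance is usually not allocated at runtime.
  implicit class JsonWriterOps[A](val value: A) extends AnyVal {
    def toJson(implicit w: JsonWriter[A]): Json =
      w.write(value)
  }
}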
Our Finatra application uses json4s to serialize objects to JSON in our controller responses. However, I noticed that when trying to serialize enums, it creates an empty object.
I saw this response that would resolve my issue but would have to be replicated for each enum:
https://stackoverflow.com/a/35850126/2668545
class EnumSerializer[E <: Enum[E]](implicit ct: Manifest[E]) extends CustomSerializer[E](format ⇒ ({
case JString(name) ⇒ Enum.valueOf(ct.runtimeClass.asInstanceOf[Class[E]], name)
}, {
case dt: E ⇒ JString(dt.name())
}))
// first enum I could find
case class X(a: String, enum: java.time.format.FormatStyle)
implicit val formats = DefaultFormats + new EnumSerializer[java.time.format.FormatStyle]()
// {"a":"test","enum":"FULL"}
val jsonString = Serialization.write(X("test", FormatStyle.FULL))
Serialization.read[X](jsonString)
Is there a way to make a generic custom serializer that would handle all java enum instances by grabbing their .name() value when serializing to json?
I don't think there is a clean solution because of the type-safety constraints. Still, if you are OK with a hacky solution that relies on the fact that Java uses type erasure, here is one that seems to work:
import org.json4s._

class EnumSerializer() extends Serializer[Enum[_]] {
  override def deserialize(implicit format: Formats): PartialFunction[(TypeInfo, JValue), Enum[_]] = {
    // using Json4sFakeEnum is a huge HACK here but it seems to work
    case (TypeInfo(clazz, _), JString(name)) if classOf[Enum[_]].isAssignableFrom(clazz) =>
      Enum.valueOf(clazz.asInstanceOf[Class[Json4sFakeEnum]], name)
  }

  override def serialize(implicit format: Formats): PartialFunction[Any, JValue] = {
    case v: Enum[_] => JString(v.name())
  }
}
where Json4sFakeEnum is really a fake enum defined in Java (actually any enum should work, but I prefer to make it explicitly fake):
enum Json4sFakeEnum {
}
With such a definition, an example similar to yours
// first enum I could find
case class X(a: String, enum: java.time.format.FormatStyle)

def js(): Unit = {
  implicit val formats = DefaultFormats + new EnumSerializer()
  val jsonString = Serialization.write(X("test", FormatStyle.FULL))
  println(s"jsonString '$jsonString'")
  val r = Serialization.read[X](jsonString)
  println(s"res ${r.getClass} '$r'")
}
produces the following output:
jsonString '{"a":"test","enum":"FULL"}'
res class so.Main$X 'X(test,FULL)'
Update: how does it work, and why do you need Json4sFakeEnum?
There are two important things:
Extending Serializer instead of CustomSerializer. This is important because it allows creating a single non-generic instance that can handle all Enum types. This works because the function created by Serializer.deserialize receives TypeInfo as an argument, so it can analyze the runtime class.
Json4sFakeEnum hack. From a high-level point of view, it is enough to have just the Class of the given enum to get all the names, because they are stored in the Class object. However, at the implementation level the simplest way to access them is the Enum.valueOf method, which has the following signature:
public static <T extends Enum<T>> T valueOf(Class<T> enumType, String name)
The unlucky part is that it has a generic signature with the restriction T extends Enum<T>. This means that even though we have a proper Class object, the best type we know is still Enum[_], and that doesn't fit the self-referencing restriction of extends Enum<T>. On the other hand, Java uses type erasure, so valueOf is actually compiled to something like
public static Enum<?> valueOf(Class<Enum<?>> enumType, String name)
It means that if we just trick the compiler into allowing us to call valueOf, at runtime everything will be alright. And this is where Json4sFakeEnum comes on the scene: we just need some specific subclass of Enum, known at compile time, to make the valueOf call.
I have a simple type hierarchy like the following:
sealed abstract class Config
object Config {
  case class Valid(name: String, traits: List[String]) extends Config
  case class Invalid(error: String) extends Config
}

implicit val validFormat = jsonFormatFor(Config.Valid)
implicit val invalidFormat = jsonFormatFor(Config.Invalid)
I also have client code that does the following:
newHttpServer().addHandler("/config", extractConfig)
The extractConfig method performs some computations and returns either a Config.Valid or a Config.Invalid, which the server will automatically convert to JSON by using the implicit JSON format objects. My problem is that there is a compiler error because extractConfig returns a Config:
type mismatch; found : Config
required: spray.httpx.marshalling.ToResponseMarshallable
If I change the return type of extractConfig to Config.Valid then the server code compiles, because jsonFormatFor(...) supplies the necessary automatic type conversion to make the response a ToResponseMarshaller (though I admit I don't fully understand this automatic conversion, being somewhat new to Scala). Is there a simple way to solve this by declaring that any subclass of Config must be a ToResponseMarshaller, given that ToResponseMarshaller is a trait that seems to be supplied via implicit conversions?
If you only have Config.Valid and Config.Invalid, it should be sufficient for extractConfig to return an Either[Config.Valid, Config.Invalid]. Then your formats above should work.
Another possibility is to write your own JsonWriter for Config (see this thread from the mailing list); a sketch of that approach follows.
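Assuming you are on spray-json, a rough sketch of such a hand-written format for the whole Config hierarchy might look like this. It uses the standard jsonFormat2/jsonFormat1 helpers (rather than your jsonFormatFor) and treats the presence of an "error" field as the discriminator, which is an assumption about your JSON shape:

import spray.json._

object ConfigJsonProtocol extends DefaultJsonProtocol {
  implicit object ConfigFormat extends RootJsonFormat[Config] {
    private val validFormat   = jsonFormat2(Config.Valid)
    private val invalidFormat = jsonFormat1(Config.Invalid)

    def write(c: Config): JsValue = c match {
      case v: Config.Valid   => validFormat.write(v)
      case i: Config.Invalid => invalidFormat.write(i)
    }

    def read(value: JsValue): Config =
      // crude discriminator: an "error" field marks the Invalid case
      if (value.asJsObject.fields.contains("error")) invalidFormat.read(value)
      else validFormat.read(value)
  }
}

With a single RootJsonFormat[Config] in scope, a handler returning plain Config can be marshalled without narrowing the return type.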
I am trying to wrap Argonaut (http://argonaut.io) in order to serialize/deserialize JSON in a Scala project. We were using Jerkson before, but as it has been discontinued, we are looking for an alternative.
This is the basic JSON wrapper
import argonaut._, Argonaut._

object Json {
  def Parse[T](input: String): T = {
    input.decodeOption[T].get
  }
}
When I try to compile this, I get the following errors:
could not find implicit value for evidence parameter of type argonaut.DecodeJson[T]
input.decodeOption[T]
^
not enough arguments for method decodeOption: (implicit evidence$6: argonaut.DecodeJson[T]) Option[T].
Unspecified value parameter evidence$6.
input.decodeOption[T]
^
Any suggestions on how to fix this, or pointers on what I am doing wrong, would be most appreciated.
Also, suggestions on alternative JSON frameworks are very welcome.
I'm kind of new to Scala/Java and how generics work there, but I have been writing .NET/C# for many years.
In order to make your code work, you will need to redefine the Json object like so:
object Json {
  def Parse[T](input: String)(implicit decode: DecodeJson[T]): Option[T] = {
    input.decodeOption[T]
  }
}
The thing you were missing was the implicit DecodeJson instance that the decodeOption function needs in order to figure out how to decode. You also need to define the return type as Option[T] instead of just T. A full example of this all working would look like this:
import argonaut._, Argonaut._
case class User(id: Long, email: String, name: String)

object Json {
  def Parse[T](input: String)(implicit decode: DecodeJson[T]): Option[T] = {
    input.decodeOption[T]
  }
}

object JsonTest {
  implicit val userDecode = casecodec3(User.apply, User.unapply)("id", "email", "name")

  def main(args: Array[String]): Unit = {
    val json = """{
      "id": 1,
      "email": "foo@test.com",
      "name": "foo bar"
    }"""

    val userOpt = Json.Parse[User](json)
    println(userOpt)
  }
}
As for other JSON frameworks, you could look into the following (a short, hypothetical Play JSON sketch of the same parse follows the list):
Play Json
json4s
spray-json
Jackson Scala Module
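For comparison, a minimal Play JSON version of the same parse; the object name and helper are illustrative, not prescribed by the library:

import play.api.libs.json._

case class User(id: Long, email: String, name: String)

object PlayJsonExample {
  // Reads derived from the case class; Json.format would also give you Writes
  implicit val userReads: Reads[User] = Json.reads[User]

  def parse(input: String): Option[User] =
    Json.parse(input).asOpt[User]
}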
It seems that Argonaut, like pretty much all Scala serialization libraries, uses the type class pattern. This sounds like a fancy thing, but it just means that when serializing/deserializing an object of type T, you need to implicitly pass an instance of another object to which part or all of the process is deferred.
Specifically, when you do decodeOption[T], you need to have in scope an instance of argonaut.DecodeJson[T] (which decodeOption will use during the deserialization).
What you should do is simply require this implicit value to be passed to Parse (it will then automatically be passed along to decodeOption):
def Parse[T](input: String)(implicit decoder: argonaut.DecodeJson[T]): Option[T] = {
  input.decodeOption[T]
}
Scala even provides some syntactic sugar to make the declaration shorter (this is called a "context bound"):
def Parse[T: argonaut.DecodeJson](input: String): Option[T] = {
  input.decodeOption[T]
}
Now, when calling Parse, you'll need to bring in scope an implicit value of argonaut.DecodeJson, or the call will fail to compile. Apparently the Argonaut object already defines decoders for many standard types, so for those types you won't have anything special to do.
For other types (such as custom types of yours), you'll have to define decoders and import them, as in the sketch below.
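For instance, a sketch of a codec for a hypothetical custom type, using the same casecodec helpers as in the other answer (the type and field names are made up):

import argonaut._, Argonaut._

// Hypothetical custom type
case class Fruit(name: String, color: String)

object FruitCodecs {
  // A CodecJson gives Argonaut both the encoding and the decoding half
  implicit val fruitCodec: CodecJson[Fruit] =
    casecodec2(Fruit.apply, Fruit.unapply)("name", "color")
}

object FruitExample {
  import FruitCodecs._

  // Compiles because a DecodeJson[Fruit] can now be found in implicit scope
  val parsed: Option[Fruit] =
    """{"name":"apple","color":"red"}""".decodeOption[Fruit]
}

With such a codec imported, the context-bound Parse method above would also compile for Fruit.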