spray-json and list marshalling - scala

I'm using spray-json to marshal lists of custom objects into JSON. I have the following case class and its JsonProtocol.
case class ElementResponse(name: String, symbol: String, code: String, pkwiu: String, remarks: String, priceNetto: BigDecimal, priceBrutto: BigDecimal, vat: Int, minInStock: Int, maxInStock: Int)

object JollyJsonProtocol extends DefaultJsonProtocol with SprayJsonSupport {
  implicit val elementFormat = jsonFormat10(ElementResponse)
}
When I try to put it in a route like this one:
get {
  complete {
    List(new ElementResponse(...), new ElementResponse(...))
  }
}
I get an error saying that:
could not find implicit value for evidence parameter of type spray.httpx.marshalling.Marshaller[List[pl.ftang.scala.polka.rest.ElementResponse]]
Perhaps you know what the problem is?
I'm using Scala 2.10.1 with spray 1.1-M7 and spray-json 1.2.5

This is an old question, but I feel like giving my 2c since I was looking at similar issues today.
Marcin, it seems your issue was not actually solved (as far as I can tell) - why did you accept an answer?
Did you try adding import spray.json.DefaultJsonProtocol._ in the relevant places? That import is what makes things such as Seqs, Maps, Options and Tuples work, so I would assume this is the cause of your problem, since it's the List that is not getting converted.
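For illustration, here is a minimal sketch at the spray-json level (the sample field values are made up). Because JollyJsonProtocol extends DefaultJsonProtocol, importing it also brings the collection formats into scope, so the List conversion resolves automatically:

import spray.json._
import JollyJsonProtocol._

// listFormat is inherited from DefaultJsonProtocol, so a List[ElementResponse]
// converts as soon as elementFormat is in scope:
val json: JsValue = List(
  ElementResponse("Iron", "Fe", "FE-01", "24.10", "", BigDecimal(10), BigDecimal(12.3), 23, 1, 10)
).toJson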

The easiest way to do this is to make a String from your list, or you'll have to deal with ChunkedMessages:
implicit def ListMarshaller[T](implicit m: Marshaller[T]) =
  Marshaller[List[T]] { (value, ctx) =>
    value match {
      case Nil => ctx.marshalTo(EmptyEntity)
      case v   => v.foreach(m(_, ctx)) // marshal each element to the same context
    }
  }
The second way is to convert your list into a Stream[ElementResponse] and let spray chunk it for you:
get {
  complete {
    List(new ElementResponse(...), new ElementResponse(...)).toStream
  }
}

You also need to import the format you defined into the route's scope:
import JollyJsonProtocol._

get {
  complete {
    List(new ElementResponse(...), new ElementResponse(...))
  }
}


No instance of play.api.libs.json.Format is available for akka.actor.typed.ActorRef[org.knoldus.eventSourcing.UserState.Confirmation]

No instance of play.api.libs.json.Format is available for akka.actor.typed.ActorRef[org.knoldus.eventSourcing.UserState.Confirmation] in the implicit scope (Hint: if declared in the same file, make sure it's declared before)
[error] implicit val userCommand: Format[AddUserCommand] = Json.format
I am getting this error even though I have defined an implicit instance of Json Format for AddUserCommand.
Here is my code:
trait UserCommand extends CommandSerializable

object AddUserCommand {
  implicit val format: Format[AddUserCommand] = Json.format[AddUserCommand]
}

final case class AddUserCommand(user: User, reply: ActorRef[Confirmation]) extends UserCommand
Can anyone please help me with this error and how to solve it?
As Gael noted, you need to provide a Format for ActorRef[Confirmation]. The complication is that the natural serialization, using the ActorRefResolver, requires an ExtendedActorSystem to be present, which means that the usual approach of defining a Format in a companion object won't quite work.
Note that because of the way Lagom does dependency injection, this approach doesn't really work in Lagom: commands in Lagom basically can't use Play JSON.
import akka.actor.ExtendedActorSystem
import akka.actor.typed.{ ActorRef, ActorRefResolver }
import akka.actor.typed.scaladsl.adapter.ClassicActorSystemOps
import play.api.libs.json._

class PlayJsonActorRefFormat(system: ExtendedActorSystem) {
  def reads[A] = new Reads[ActorRef[A]] {
    def reads(jsv: JsValue): JsResult[ActorRef[A]] =
      jsv match {
        case JsString(s) => JsSuccess(ActorRefResolver(system.toTyped).resolveActorRef(s))
        case _ => JsError(Seq(JsPath() -> Seq(JsonValidationError(Seq("ActorRefs are strings")))))
      }
  }

  def writes[A] = new Writes[ActorRef[A]] {
    def writes(a: ActorRef[A]): JsValue =
      JsString(ActorRefResolver(system.toTyped).toSerializationFormat(a))
  }

  def format[A] = Format[ActorRef[A]](reads, writes)
}
You can then define a format for AddUserCommand as
object AddUserCommand {
  def format(arf: PlayJsonActorRefFormat): Format[AddUserCommand] = {
    implicit def arfmt[A]: Format[ActorRef[A]] = arf.format
    Json.format[AddUserCommand]
  }
}
Since you're presumably using JSON to serialize the messages sent around a cluster (otherwise, the ActorRef shouldn't be leaking out like this), you would then construct an instance of the format in your Akka Serializer implementation.
(NB: I've only done this with Circe, not Play JSON, but the basic approach is common)
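To make the last step concrete, here is a minimal, hypothetical sketch of such a serializer (the class name, identifier, and manifest scheme are illustrative, not from the original answer); it would then be bound to the command type through the usual akka.actor.serializers and serialization-bindings configuration:

import akka.actor.ExtendedActorSystem
import akka.serialization.SerializerWithStringManifest
import play.api.libs.json._

class AddUserCommandSerializer(system: ExtendedActorSystem)
    extends SerializerWithStringManifest {

  private val arf = new PlayJsonActorRefFormat(system)
  private implicit val fmt: Format[AddUserCommand] = AddUserCommand.format(arf)

  override def identifier: Int = 20230101 // any id unique within the ActorSystem
  override def manifest(o: AnyRef): String = o.getClass.getName

  override def toBinary(o: AnyRef): Array[Byte] = o match {
    case cmd: AddUserCommand => Json.toBytes(Json.toJson(cmd))
    case other => throw new IllegalArgumentException(s"Can't serialize $other")
  }

  override def fromBinary(bytes: Array[Byte], manifest: String): AnyRef =
    Json.parse(bytes).as[AddUserCommand]
}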
The error says that it cannot construct a Format for AddUserCommand because there's no Format for ActorRef[Confirmation].
When using Json.format[X], all the members of the case class X must have a Format defined.
In your case, you probably don't want to define a formatter for this case class (serializing an ActorRef doesn't make much sense) but rather build another case class with data only.
Edit: See Levi's answer on how to provide a formatter for ActorRef if you really want to send out there the actor reference.
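For example, a minimal sketch of that data-only approach (AddUserData is a hypothetical name, and this assumes an implicit Format[User] already exists):

final case class AddUserData(user: User)

object AddUserData {
  // No ActorRef member, so Json.format can derive the whole Format:
  implicit val format: Format[AddUserData] = Json.format[AddUserData]
}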

Type-safe generic case class updates in Scala

I'm attempting to write some code that tracks changes to a record and applies them at a later date. In a dynamic language I'd do this by keeping a log of (String, Any) pairs, i.e. a List[(String, Any)], and then applying these as an update to the original record when I finally decide to commit the changes.
I need to be able to introspect over the updates, so a list of update functions isn't appropriate.
In Scala this is fairly trivial using reflection; however, I'd like to implement a type-safe version.
My first attempt was to try with shapeless. This works well if we know specific types.
import shapeless._
import record._
import syntax.singleton._
case class Person(name: String, age: Int)
val bob = Person("Bob", 31)
val gen = LabelledGeneric[Person]
val updated = gen.from( gen.to(bob) + ('age ->> 32) )
// Result: Person("Bob", 32)
However I can't figure out how to make this work generically.
trait Record[T] {
  def update( ??? ): T
}
Given the way shapeless handles this, I'm not sure if this would even be possible?
If I accept a lot of boilerplate, as a poor man's version I could do something along the lines of the following.
object Contact {
  sealed trait Field[T]
  case object Name extends Field[String]
  case object Age extends Field[Int]
}

// A typeclass would be cleaner, but too verbose for this simple example.
case class Contact(...) extends Record[Contact, Contact.Field] {
  def update[T](field: Contact.Field[T], value: T) = field match {
    case Contact.Name => copy(name = value)
    case Contact.Age  => copy(age = value)
  }
}
However this isn't particularly elegant and requires a lot of boilerplate. I could probably write my own macro to handle this, however it seems like a fairly common thing - is there a way to handle this with Shapeless or a similar macro library already?
How about using the whole instance of the class as an update?
case class Contact(name: String, age: Int)
case class ContactUpdate(name: Option[String] = None, age: Option[Int] = None)

object Contact {
  def update(target: Contact, delta: ContactUpdate) = Contact(
    delta.name.getOrElse(target.name),
    delta.age.getOrElse(target.age)
  )
}

// also, optionally this:
object ContactUpdate {
  def apply(name: String): ContactUpdate = ContactUpdate(name = Option(name))
  def apply(age: Int): ContactUpdate = ContactUpdate(age = Option(age))
}
I think if you want the really type-safe solution, this is the cleanest and most readable, and also possibly the least painful to implement, as you don't need to deal with Records, lenses and individual field descriptors: ContactUpdate(name = "foo") creates an update, and folding a list of updates over the target (rather than mapping, which would apply each update to the original) applies them all in sequence.
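A minimal sketch of applying a whole change log with the definitions above:

val target  = Contact("Bob", 31)
val updates = List(ContactUpdate(name = "Alice"), ContactUpdate(age = 32))

// foldLeft threads each updated contact into the next delta:
val result = updates.foldLeft(target)(Contact.update)
// result == Contact("Alice", 32)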

How best to keep a cached list of member fields, one each for a family of case classes in Scala

This is a follow up to the following question: Fastest way to get the names of the fields of a case class in Scala
I'm trying to find a simple way to provide fast custom serialization (let's say to a list of (String, Object) tuples, which can be converted into a db row in production or an in-memory map in unit testing) for a family of case classes in Scala, and it seems that keeping a cached list of the fields of the class may be a promising way of doing this. However, I'm not sure about the cleanest way to do it. I know I can do something like the following:
case class Playlist(
    val id: Option[Long],
    val title: Option[String],
    val album: Option[String],
    val artist: Option[String],
    val songId: Option[UUID]) {
  def serialize = Playlist.fields.map(f => (f.getName, f.get(this)))
}

object Playlist {
  val empty = Playlist(None, None, None, None, None)
  val fields = Playlist.empty.getClass.getDeclaredFields.toList
  fields foreach { _.setAccessible(true) }
}
There are a couple of things I don't like about this, however:
I don't want to have to use empty from the companion object just to get a cached list of fields
I don't want to have to declare the serialization logic for each case class for which I want this serialization behavior. There are probably a few ways of getting around this, but I'm not sure of the cleanest way that will give correct behavior (worried about mixing reflection and inheritance)
What's the cleanest way to achieve this in Scala?
I think it would be simplest to keep a cache map of Class[_] -> fields separately from any individual case class, such as in a global singleton with a method serialize(instance). This way you don't have to write any extra code in the classes you wish to serialize.
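A minimal sketch of that idea (FieldSerializer is an illustrative name, not from the question): a concurrent map caches the reflected fields per class, so any case class instance can be serialized with no per-class code:

import java.lang.reflect.Field
import scala.collection.concurrent.TrieMap

object FieldSerializer {
  private val cache = TrieMap.empty[Class[_], List[Field]]

  // Reflect over the class once, then reuse the cached fields.
  private def fieldsOf(cls: Class[_]): List[Field] =
    cache.getOrElseUpdate(cls, {
      val fs = cls.getDeclaredFields.toList
      fs.foreach(_.setAccessible(true))
      fs
    })

  def serialize(instance: AnyRef): List[(String, AnyRef)] =
    fieldsOf(instance.getClass).map(f => (f.getName, f.get(instance)))
}

// Usage: FieldSerializer.serialize(somePlaylist)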
Another way could be to create a trait to mix into the case classes' companion objects, with the cached list of fields, and an implicit wrapper class to add the serialize method. You can use an implicit ClassTag to initialize fields:
import scala.reflect.ClassTag

abstract class MyCompanion[T](implicit ctag: ClassTag[T]) {
  private val fields = ctag.runtimeClass.getDeclaredFields.toList
  fields foreach { _.setAccessible(true) }

  implicit class AddSerializeMethod(obj: T) {
    def serialize = fields.map(f => (f.getName, f.get(obj)))
  }
}

case class C(...) { ... }
object C extends MyCompanion[C]
Unfortunately, it seems you can't make AddSerializeMethod a value class this way.

getClass out of String and using within generics

Because I'll be getting Strings from my websocket, I must convert each String to an actual type. Is it possible to do something like this?:
def createThing(cls: String) = {
  List[cls.getClass]() // or create actors or something like that
}

createThing("Int")    // should produce List[Int]
createThing("Double") // should produce List[Double]
Is it possible to achieve this? I'm new to reflection, so I haven't been able to find a solution.
No. The static type can't depend on runtime data in the way you want. E.g. should
createThing("Foo")
fail to compile if class Foo is not defined? However, you can do a lot of things without this. If you specify your problem better in a separate question, you may get answers.
You can solve this without mucking about with reflection. A minimalist class hierarchy can solve the problem quite effectively. Here is an example:
trait Message

case class StringList(strings: List[String]) extends Message
case class IntList(ints: List[Int]) extends Message

object Create {
  def createThing(cls: String): Option[Message] = cls match {
    case "strings" => Some(StringList(List[String]("a", "b")))
    case "ints"    => Some(IntList(List[Int](3, 4, 5)))
    case _         => None
  }
}

object Main extends App {
  val thing = Create.createThing("ints")

  thing match {
    case Some(a: StringList) => println(s"It was a StringList containing ${a.strings}")
    case Some(b: IntList)    => println(s"It was an IntList containing ${b.ints}")
    case _                   => println("Nothing we know about")
  }
}

Define custom serialization with Casbah / Salat - or delegate serialization to member?

I'm in the process of learning Scala for a new project having come from Rails. I've defined a type that is going to be used in a number of my models which can basically be thought of as collection of 'attributes'. It's basically just a wrapper for a hashmap that delegates most of its responsibilities to it:
case class Description(attributes: Map[String, String]) {
  override def hashCode: Int = attributes.hashCode

  override def equals(other: Any) = other match {
    case that: Description => this.attributes == that.attributes
    case _ => false
  }
}
So I would then define a model class with a Description, something like:
case class Person(val name: String, val description: Description)
However, when I persist a Person with a SalatDAO I end up with a document that looks like this:
{
  name: "Russell",
  description: {
    attributes: {
      hair: "brown",
      favourite_color: "blue"
    }
  }
}
When in actual fact I don't need the nesting of the attributes tag in the description tag - what I actually want is this:
{
  name: "Russell",
  description: {
    hair: "brown",
    favourite_color: "blue"
  }
}
I haven't tried, but I reckon I could get that to work if I made Description extend a Map rather than contain one, but I'd rather not, because a Description isn't a type of Map, it's something which has some of the behaviour of a Map as well as other behaviour of its own I'm going to add later. Composition over inheritance and so on.
So my question is, how can I tell Salat (or Casbah, I'm actually a bit unclear as to which is doing the conversion as I've only just started using them) how to serialize and deserialize the Description class? In the casbah tutorial here it says:
It is also possible to create your own custom type serializers and
deserializers. See Custom Serializers and Deserializers.
But this page doesn't seem to exist. Or am I going about it the wrong way? Is there actually a really simple way to indicate this is what I want to happen, an annotation or something? Or can I simply delegate the serialization to the attributes map in some way?
EDIT: After having a look at the source for the JodaTime conversion helper I've tried the following but have had no luck getting it to work yet:
import org.bson.{ BSON, Transformer }
import com.mongodb.casbah.commons.conversions.MongoConversionHelper

object RegisterCustomConversionHelpers extends Serializers with Deserializers {
  def apply() = {
    super.register()
  }
}

trait Serializers extends MongoConversionHelper with DescriptionSerializer {
  override def register() = {
    super.register()
  }

  override def unregister() = {
    super.unregister()
  }
}

trait Deserializers extends MongoConversionHelper {
  override def register() = {
    super.register()
  }

  override def unregister() = {
    super.unregister()
  }
}

trait DescriptionSerializer extends MongoConversionHelper {
  private val transformer = new Transformer {
    def transform(o: AnyRef): AnyRef = o match {
      case d: Description => d.attributes.asInstanceOf[AnyRef]
      case _ => o
    }
  }

  override def register() = {
    BSON.addEncodingHook(classOf[Description], transformer)
    super.register()
  }
}
When I call RegisterCustomConversionHelpers() and then save a Person I don't get any errors; it just has no effect, saving the document the same way as ever. This also seems like quite a lot to have to do for what I want.
Salat maintainer here.
I don't understand the value of Description as a wrapper here. It wraps a map of attributes, overrides the default equals and hashCode impl of a case class - which seems unnecessary, since the impl is delegated to the map anyhow and that is exactly what the case class does anyway - and introduces an additional layer of indirection in the serialized object.
Have you considered just:
case class Person(val name: String, val description: Map[String, String])
This will do exactly what you want out of box.
In another situation I would recommend a simple type alias but unfortunately Salat can't support type aliases right now due to some issues with how they are depicted in pickled Scala signatures.
(You probably omitted this from your example for brevity, but it is best practice for your Mongo model to have an _id field - if you don't, the Mongo Java driver will supply one for you.)
There is a working example of a custom BSON hook in the salat-core test package (it handles java.net.URL). It could be that your hook is not working simply because you are not registering it in the right place? But still, I would recommend getting rid of Description unless it is adding some value that is not evident from your example above.
Based on @prasinous' answer I decided this wasn't going to be that easy, so I've changed my design a bit to the following, which pretty much gets me what I want. Rather than persisting the Description as a field, I persist a vanilla map and then mix a Described trait into the model classes I want to have a description, which automatically converts the map to a Description when the object is created. I'd appreciate it if anyone can point out any obvious problems with this approach or any suggestions for improvement.
class Description(val attributes: Map[String, String]) {
  // rest of class omitted
}

trait Described {
  val attributes: Map[String, String]
  val description = new Description(attributes)
}

case class Person(name: String, attributes: Map[String, String]) extends Described
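A quick usage sketch (the sample values mirror the document above); note this works because attributes is a constructor parameter of Person, so it is already initialized by the time the Described trait's initializer builds the Description:

val p = Person("Russell", Map("hair" -> "brown", "favourite_color" -> "blue"))
p.description.attributes // Map(hair -> brown, favourite_color -> blue)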