Serializing case class with trait mixin using json4s - scala

I've got a case class Game which I have no trouble serializing/deserializing using json4s.
case class Game(name: String, publisher: String, website: String, gameType: GameType.Value)
In my app I use mapperdao as my ORM. Because Game uses a Surrogate Id, I do not have id as part of its constructor.
However, when mapperdao returns an entity from the DB, it supplies the id of the persisted object using a trait.
Game with SurrogateIntId
The code for the trait is
trait SurrogateIntId extends DeclaredIds[Int]
{
  def id: Int
}

trait DeclaredIds[ID] extends Persisted

trait Persisted
{
  @transient
  private var mapperDaoVM: ValuesMap = null
  @transient
  private var mapperDaoDetails: PersistedDetails = null

  private[mapperdao] def mapperDaoPersistedDetails = mapperDaoDetails

  private[mapperdao] def mapperDaoValuesMap = mapperDaoVM

  private[mapperdao] def mapperDaoInit(vm: ValuesMap, details: PersistedDetails) {
    mapperDaoVM = vm
    mapperDaoDetails = details
  }

  .....
}
When I try to serialize Game with SurrogateIntId I get empty parentheses returned; I assume this is because json4s doesn't know how to deal with the attached trait.
I need a way to serialize Game with only id added to its properties, and, almost as importantly, a way to do this for any T with SurrogateIntId, as I use these for all of my domain objects.
Can anyone help me out?

So, this is an extremely specific solution, since the origin of my problem comes from the way mapperDao returns DOs; however, it may be helpful for general use, since I'm delving into custom serializers in json4s.
The full discussion on this problem can be found on the mapperDao google group.
First, I found that calling copy() on any persisted Entity (returned from mapperDao) returned the clean copy (just the case class) of my DO, which is then serializable by json4s. However, I did not want to have to remember to call copy() any time I wanted to serialize a DO, or to deal with mapping lists, etc., as this would be unwieldy and prone to errors.
So I created a CustomSerializer that wraps around the returned Entity (case class DO + traits as an object) and gleans the class from the generic type with an implicit manifest. Using this approach I then pattern match my domain objects to determine what was passed in, and then use Extraction.decompose(myDO.copy()) to serialize and return the clean DO.
// Entity[Int, Persisted, Class[T]] is how my DOs are returned by mapperDao
class EntitySerializer[T: Manifest] extends CustomSerializer[Entity[Int, Persisted, Class[T]]](formats => (
  { PartialFunction.empty }, // This PF is for extracting from JSON and is not needed here
  {
    case g: Game => // Each type is one of my DOs
      implicit val formats: Formats = DefaultFormats // include primitive formats for serialization
      Extraction.decompose(g.copy()) // get the plain DO and then serialize it with json4s
    case u: User =>
      implicit val formats: Formats = DefaultFormats + new LinkObjectEntitySerializer // See below for explanation on LinkObject
      Extraction.decompose(u.copy())
    case t: Team =>
      implicit val formats: Formats = DefaultFormats + new LinkObjectEntitySerializer
      Extraction.decompose(t.copy())
    ...
  }
))
The only need for a separate serializer is in the event that you have non-primitives as parameters of a case class being serialized, because the serializer can't use itself to serialize them. In that case you create a serializer for each basic class (i.e. one with only primitives) and then include it into the next serializer, for objects that depend on those basic classes.
class LinkObjectEntitySerializer[T: Manifest] extends CustomSerializer[Entity[Int, Persisted, Class[T]]](formats => (
  { PartialFunction.empty },
  {
    // Team and User have Set[TeamUser] parameters; this "dependency" needs to be defined
    // so it can be included in formats
    case tu: TeamUser =>
      implicit val formats: Formats = DefaultFormats
      ("Team" -> // Using a custom-built representation of the object
        ("name" -> tu.team.name) ~
        ("id" -> tu.team.id) ~
        ("resource" -> "/team/") ~
        ("isCaptain" -> tu.isCaptain)) ~
      ("User" ->
        ("name" -> tu.user.globalHandle) ~
        ("id" -> tu.user.id) ~
        ("resource" -> "/user/") ~
        ("isCaptain" -> tu.isCaptain))
  }
))
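For reference, a sketch of how these serializers might be combined into a Formats instance and used; the explicit type arguments, the gameFromDb value and the Serialization.write call are my assumptions, not part of the original code:
// Sketch only: register the basic serializer first, then the one that depends on it
implicit val formats: Formats =
  DefaultFormats +
    new LinkObjectEntitySerializer[TeamUser] +
    new EntitySerializer[Game]

// gameFromDb: Game with SurrogateIntId, as returned by mapperDao;
// the custom serializer matches on Game and emits the plain case class
val json: String = org.json4s.native.Serialization.write(gameFromDb)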
This solution is hardly satisfying. Eventually I will need to replace mapperDao or json4s (or both) to find a simpler solution. However, for now, it seems to be the fix with the least amount of overhead.

Related

Dynamically checking subclass relationship in Scala 3

I am trying to port a solution for DomainEventHandlers and -Dispatcher from PHP8 to Scala 3. Handlers should specify a list of events they can handle (in a type-safe way, preferably, by their classes). Handlers are dynamically registered with the Dispatcher, which should aggregate a map from the elements of the lists from each Handler to a List of Handlers for those events.
When an event is raised with the Dispatcher, it should check the class of the current event against the keys from the map, and pass the event to each Handler in each list of Handlers for a key if and only if the event's class is identical to or a subclass of the class specified by the key.
In dynamically typed OOP languages like PHP8, this is easy - a Handler stores a list of class-names, which can be reified simply by [ClassName]::class, then the Dispatcher gets the event's class via $event::class and performs an is_a-check for each HashMap-key, which checks both exact match and subclass-relationship.
In Scala 3, I can't seem to find a good way to do this. Working with underlying Java-reflections via getClass or Class[?] produces problems due to the mismatch between the Scala and Java type-systems (specifically, trailing $ being either present or not). In Scala 2, Tags would probably have been the way to go - but Scala 3 reflection is a different beast, and I have not found a way to utilize it to implement the above, and would appreciate advice.
Concretely, let's say we have
trait DomainEvent[D1 <: Serializable, D2 <: Serializable, A <: Aggregate]
    extends Event[D1, D2]:
  type AggregateType = A
  val aggregateIdentifier: (String, UUID)
  def applyAsPatch(aggregate: AggregateType): AggregateType

trait DomainEventHandler:
  val handles: List[???]
  def handle(event: DomainEvent[?, ?, ?]): ZIO[Any, Throwable, Unit]

object DomainEventDispatcher:
  val registeredHandlers: scala.collection.mutable.Map[???, List[DomainEventHandler]] =
    scala.collection.mutable.Map()

  def registerHandler(handler: DomainEventHandler): Unit = ???
  def raiseEvent(event: DomainEvent[?, ?, ?]): ZIO[Any, Throwable, Unit] = ???
I am unsure what to use in place of ??? in the DomainEventHandler's List and the Dispatcher's Map - the registerHandler and raiseEvent-implementations will follow from that.
Well, if your concrete event classes that you match aren't parametrized, it's pretty simple:
trait Event[A]

case class IntEvent(x: Int) extends Event[Int]
case class StringEvent(x: String) extends Event[String]

object Dispatcher {
  var handlers = List.empty[PartialFunction[Event[_], String]]

  def register(h: PartialFunction[Event[_], String]): Unit = { handlers = h :: handlers }

  def dispatch(event: Event[_]) = handlers.flatMap { _.lift(event) }
}

Dispatcher.register { case e: IntEvent => s"Handled $e" }
Dispatcher.register {
  case e: IntEvent => s"Handled ${e.x}"
  case e: StringEvent => s"Handled ${e.x}"
}

Dispatcher.dispatch(new IntEvent(1))        // List(Handled 1, Handled IntEvent(1))
Dispatcher.dispatch(new StringEvent("foo")) // List(Handled foo)
But if you want to match on things like Event[Int], that makes things significantly more difficult. I wasn't able to find a good way to do it (though I am by no means an expert in Scala 3 features).
Not sure why they dropped ClassTag support ... I am taking it as a sign that matching on type parameters like this is no longer considered good practice, and the "proper" solution to your problem is now to name all the classes you want to match, without type parameters.
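For what it's worth, here is a minimal sketch of that last suggestion: keying the dispatcher on plain Class[?] values and using isAssignableFrom for the subclass check, which mirrors the is_a behaviour from the question. All names are illustrative; ZIO and the event hierarchy from the question are omitted for brevity.
import scala.collection.mutable

trait Handler:
  // concrete event classes this handler can deal with
  val handles: List[Class[?]]
  def handle(event: Any): Unit

object ClassKeyedDispatcher:
  private val registered = mutable.Map.empty[Class[?], List[Handler]]

  def register(h: Handler): Unit =
    h.handles.foreach(c => registered.update(c, h :: registered.getOrElse(c, Nil)))

  // pass the event to every handler registered for its class or a superclass of it
  def raise(event: Any): Unit =
    for
      (clazz, handlers) <- registered
      if clazz.isAssignableFrom(event.getClass)
      h <- handlers
    do h.handle(event)

A handler would then declare something like val handles = List(classOf[SomeConcreteEvent]) (a hypothetical class name); since classOf is only used on classes, not companion objects, the trailing-$ mismatch mentioned in the question doesn't come up.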

Advice on Case Classes/Objects/Matching

I am trying to model (in my Scala application) a list of options presented on my web page, and am scratching my head coming up with a solution for mapping a String value posted from the client to its corresponding object in the backend.
e.g. Let's say it is a list of Animals and the user can choose one, which gets posted to the backend.
Animals
Polar Bear
Rabbit
Great White Shark
When a request comes in, I want to convert the "Great White Shark" String to an Animal, but I'm not sure how best to match the String to the appropriate type in the backend.
So far I have this.
sealed abstract class Animal(val name: String)
case object GreatWhite extends Animal("Great White Shark")
case object PolarBear extends Animal("Polar Bear")
Which allows me to do this to match the String from the UI to its corresponding case object in my Scala application.
def matcher(animal: String) = animal match {
  case GreatWhite.name => GreatWhite
  case PolarBear.name => PolarBear
}
Problem
If the List of Animals grows long, however, this matcher is going to be very cumbersome, since I need to have a case expression for every Animal.
I would much appreciate any experienced Scala guys giving me a pointer on a more elegant solution.
It looks like what you need is simply a hash table of String to Animal.
Such an approach gives you the ability to get a result in constant time O(1), even with an extensively growing list.
val mapping = Map[String, Animal]("Rabbit" -> Rabbit, "Polar Bear" -> PolarBear /* ... */ )
// matcher
mapping.get(animal)
UPD.
Some useful comments below.
sealed abstract class Animal(val name: String)
case object GreatWhite extends Animal("Great White Shark")
case object PolarBear extends Animal("Polar Bear")
val mapping: Map[String, Animal] = Seq(GreatWhite, PolarBear).map(x => x.name -> x).toMap
mapping
Have you looked at Enums? If they are usable for you, Enums have a .withName method http://yefremov.net/blog/scala-enum-by-name/
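For example, a minimal sketch with scala.Enumeration, using the same animal names as above (whether an Enumeration fits your model is a separate question):
object Animal extends Enumeration {
  val GreatWhite = Value("Great White Shark")
  val PolarBear  = Value("Polar Bear")
  val Rabbit     = Value("Rabbit")
}

Animal.withName("Great White Shark")  // Animal.GreatWhite
// Animal.withName("Unicorn")         // throws NoSuchElementException for unknown names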

Using overloaded constructors from the superclass

I'm writing a message parser. Suppose I have a superclass Message with two auxiliary constructors, one that accepts String raw messages and one that accepts a Map with datafields mapped out in key-value pairs.
class Message {
  def this(s: String)
  def this(m: Map[String, String])
  def toRaw = { ... } // call third party lib to return the generated msg
  def map             // call third party lib to return the parsed message
  def something1      // something common for all messages which would be overridden in child classes
  def something2      // something common for all messages which would be overridden in child classes
  ...
}
There's good reason to do this, as the library that does the parsing/generating is kind of awkward, and moving the complexity of interfacing with it into a separate class makes sense. The child class would look something like this:
class SomeMessage extends Message {
  def something1 // ...
  def something2 // ...
}
and the idea is to use the overloaded constructors in the child class, for example:
val msg = new SomeMessage(rawMessage) // or
val msg = new SomeMessage("fld1" -> ".....", "fld2" -> "....")
// and then be able to call
msg.something1
msg.something2 // ...
However, given the way auxiliary constructors and inheritance seem to behave in Scala, this pattern has proven to be pretty challenging, and the simplest solution I have found so far is to create a method called constructMe, which does the work of the constructors in the above case:
val msg = new SomeMessage
msg.constructMe(rawMessage) // or
msg.constructMe("fld1" -> ".....", "fld2" -> "....")
which seems crazy to need a method called constructMe.
So, the question:
is there a way to structure the code so to simply use the overloaded constructors from the superclass? For example:
val msg = new SomeMessage(rawMessage) // or
val msg = new SomeMessage("fld1" -> ".....", "fld2" -> "....")
or am I simply approaching the problem the wrong way?
Unless I'm missing something, you are calling the constructor like this:
val msg = new SomeMessage(rawMessage)
But the Message class doesn't take a parameter; your class should be defined like so:
class Message(val message: String) {
  def this(m: Map[String, String]) = this("some value from mapping")
}
Also note that an auxiliary constructor in Scala must call another constructor of the class as its first action; see this question for more info.
And then the class extending the Message class should be like this:
class SomeMessage(val someString: String) extends Message(someString) {
  def this(m: Map[String, String]) = this("this is a SomeMessage")
}
Note that the constructor needs a body, otherwise your code won't compile; you can't have a definition like def this(someString: String) without providing an implementation.
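With those definitions, both constructors are usable from the outside; the values below are just illustrative:
val fromRaw = new SomeMessage("raw message text")
val fromMap = new SomeMessage(Map("fld1" -> ".....", "fld2" -> "....."))

fromRaw.message // "raw message text"
fromMap.message // "this is a SomeMessage" (whatever the auxiliary constructor forwarded)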
Edit:
To be honest, I don't quite get why you want to use Maps in your architecture; your class's main point is to contain a String, and having complex types in constructors can lead to problems. Let's say you have some class which can take a Map[String, String] as a constructor parameter: what will you do with it? As I said, a constructor must call another constructor as its first instruction, so what you could do is something like this:
class A(someString: String) {
  def this(map: Map[String, String]) = this(map.toString)
}
And that's it; the restrictions in Scala don't allow you to do anything more. Say you want to do some validation, for example always taking the second element of the map: this could throw exceptions, since the user is not forced to provide a map with more than one value, and is not even forced to provide a non-empty map, unless you start filling your class with requires.
In your case I would probably keep a String as the class parameter, or maybe a List[String] on which you can call mkString or toString.
Anyway, if you are satisfied with calling map.toString, you have to give both constructor implementations to the parent and the child class; this is one of Scala's constructor restrictions (in Java you could approach the problem in a different way). I hope somebody will prove me wrong, but as far as I know there's no other way to do it.
As a side note, I personally find this kind of restriction to be correct (most of the time), since it forces you to structure your code more rigorously and have a better architecture; allowing people to do whatever they want in a constructor (as in Java) obfuscates its true purpose, which is to return a new instance of a class.

Designing serialization library in Scala with type classes

I have a system where I need to serialize different kinds of objects to JSON and XML. Some of them are Lift MetaRecords, some are case classes. I wanted to use type classes and create something like:
trait Serializable[T] {
  def serialize[T](obj: T): T
}
And the usual implementations for JSON and XML, open for extension.
The problem I'm facing now is serialization itself. Currently there are different contexts in which objects are serialized. Imagine a news feed system. There are three objects: User, Post (feed element) and Photo. Those objects have some properties and can reference each other. Now, in some cases I want to serialize an object alone (user settings, preferences, etc.); in other cases I need other objects to be serialized as well, e.g. a feed: List[Post] + related photos. In order to do that I need to provide the referenced objects.
My current implementation is bloated with functions taking optional parameters.
def feedAsJson(post: MPost, grp: Option[PrivateGroup], commentsBox: Option[List[MPostComment]] = Empty): JObject
I thought about implementing some kind of context solution: overload feedAsJson with an implicit context parameter that will provide the necessary data. I don't know how I'd like to implement it yet, as it touches the database, maybe with the cake pattern. Any suggestions are very much appreciated.
Can't you put the implicits in scope that will create the right kind of serializers that you need? Something to that effect:
def doNothingSerializer[T]: Serializable[T] = ???

implicit def mpostToJson(implicit pgs: Serializable[PrivateGroup],
                         cmts: Serializable[List[MPostComment]]) =
  new Serializable[MPost] {
    def serialize(mpost: MPost): JObject = {
      val privateGroupJSon = pgs.serialize(mpost.privateGroup)
      // make the mpost json with privateGroupJSon, which would be empty
      ???
    }
  }

// later where you need to serialize without the inner content:
implicit val privateGroupToJson = doNothingSerializer[PrivateGroup]
implicit val mpostCommentsToJson = doNothingSerializer[List[MPostComment]]
implicitly[Serializable[MPost]].serialize(mpost)
You would need to define default serializable instances in a trait that is then inherited (so that low priority implicits are in scope).
Note that I'm assuming that the trait for Serializable is:
trait Serializable[T] {
  def serialize(t: T): JObject
}
(no [T] method type argument and returns a JObject)
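Under that assumed trait, one plausible body for doNothingSerializer is to simply emit an empty JSON object; this is a sketch, not part of the original answer:
def doNothingSerializer[T]: Serializable[T] = new Serializable[T] {
  def serialize(t: T): JObject = JObject(Nil) // empty JSON object, i.e. {}
}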
Maybe "Scala Pickling" might help you:
http://lampwww.epfl.ch/~hmiller/pickling
I just watched the presentation.

Scala "update" immutable object best practices

With a mutable object I can write something like
var user = DAO.getUser(id)
user.name = "John"
user.email ="john#doe.com"
// logic on user
If user is immutable, then I need to clone/copy it on every change operation.
I know a few ways to perform this:
case class copy
method (like changeName) that creates a new object with the new property
What is the best practice?
And one more question: is there any existing technique to get "changes" relative to the original object (for example, to generate an update statement)?
Both ways you've mentioned belong to the functional and OO paradigms respectively. If you prefer functional decomposition with abstract data types, which in Scala are represented by case classes, then choose the copy method. Using mutators is not a good practice in my opinion, because that will pull you back to the Java/C#/C++ way of life.
On the other hand, making an ADT case class like
case class Person(name: String, age: String)
is more concise than:
class Person(_name: String, _age: String) {
  var name = _name
  var age = _age

  def changeName(newName: String): Unit = { name = newName }
  // ... and so on
}
(not the best imperative code, can be shorter, but clear).
Of course there is another way, with mutator-style methods that just return a new object on each call:
class Person(val name: String,
             val age: String) {
  def changeName(newName: String): Person = new Person(newName, age)
  // ... and so on
}
But still, the case class way is more concise.
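For instance, with the original User example as a case class, copy gives you the whole update in one expression (field names and values are illustrative):
case class User(name: String, email: String)

val user    = User("Jane", "jane@example.com")
val updated = user.copy(name = "John", email = "john@doe.com") // new instance; user itself is unchanged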
And if you go further, to concurrent/parallel programming, you'll see that functional concepts with immutable values are much better than trying to guess what state your object is currently in.
Update
Thanks to senia; I forgot to mention two things.
Lenses
At the most basic level, lenses are sort of getters and setters for immutable data, and look like this:
case class Lens[A, B](get: A => B, set: (A, B) => A) {
  def apply(a: A) = get(a)
  // ...
}
That is it. A lens is an object that contains two functions: get and set. get takes an A and returns a B; set takes an A and a B and returns a new A. It's easy to see that the type B is a value contained in A. When we pass an instance to get, we get that value back. When we pass an A and a B to set, we update the value B in A and return a new A reflecting the change. For convenience, get is aliased to apply. There is a good intro to the Scalaz Lens case class.
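A tiny usage sketch of the Lens case class defined above, reusing the Person example (field values are illustrative):
case class Person(name: String, age: String)

val nameLens = Lens[Person, String](
  get = _.name,
  set = (p, newName) => p.copy(name = newName)
)

val p = Person("Ann", "30")
nameLens(p)             // "Ann" (apply delegates to get)
nameLens.set(p, "John") // Person("John", "30"); p itself is unchanged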
Records
This one, of course, comes from the shapeless library and is called Records: an implementation of extensible records modelled as HLists of associations. Keys are encoded using singleton types and fully determine the types of their corresponding values (example from GitHub):
object author extends Field[String]
object title extends Field[String]
object price extends Field[Double]
object inPrint extends Field[Boolean]

val book =
  (author -> "Benjamin Pierce") ::
  (title -> "Types and Programming Languages") ::
  (price -> 44.11) ::
  HNil

// Read price field
val currentPrice = book.get(price) // Inferred type is Double
currentPrice == 44.11

// Update price field, relying on static type of currentPrice
val updated = book + (price -> (currentPrice + 2.0))

// Add a new field
val extended = updated + (inPrint -> true)