Define custom serialization with Casbah / Salat - or delegate serialization to member?

I'm in the process of learning Scala for a new project, having come from Rails. I've defined a type that is going to be used in a number of my models, which can basically be thought of as a collection of 'attributes'. It's basically just a wrapper for a hashmap that delegates most of its responsibilities to it:
case class Description(attributes: Map[String, String]) {
  override def hashCode: Int = attributes.hashCode
  override def equals(other: Any) = other match {
    case that: Description => this.attributes == that.attributes
    case _ => false
  }
}
So I would then define a model class with a Description, something like:
case class Person(val name: String, val description: Description)
However, when I persist a Person with a SalatDAO I end up with a document that looks like this:
{
  name: "Russell",
  description: {
    attributes: {
      hair: "brown",
      favourite_color: "blue"
    }
  }
}
When in actual fact I don't need the attributes field nested inside the description field; what I actually want is this:
{
  name: "Russell",
  description: {
    hair: "brown",
    favourite_color: "blue"
  }
}
I haven't tried, but I reckon I could get that to work if I made Description extend a Map rather than contain one, but I'd rather not, because a Description isn't a type of Map, it's something which has some of the behaviour of a Map as well as other behaviour of its own I'm going to add later. Composition over inheritance and so on.
So my question is, how can I tell Salat (or Casbah, I'm actually a bit unclear as to which is doing the conversion as I've only just started using them) how to serialize and deserialize the Description class? In the casbah tutorial here it says:
It is also possible to create your own custom type serializers and deserializers. See Custom Serializers and Deserializers.
But this page doesn't seem to exist. Or am I going about it the wrong way? Is there actually a really simple way to indicate this is what I want to happen, an annotation or something? Or can I simply delegate the serialization to the attributes map in some way?
EDIT: After having a look at the source for the JodaTime conversion helper I've tried the following but have had no luck getting it to work yet:
import org.bson.{ BSON, Transformer }
import com.mongodb.casbah.commons.conversions.MongoConversionHelper

object RegisterCustomConversionHelpers extends Serializers
  with Deserializers {
  def apply() = {
    super.register()
  }
}

trait Serializers extends MongoConversionHelper
  with DescriptionSerializer {
  override def register() = {
    super.register()
  }
  override def unregister() = {
    super.unregister()
  }
}

trait Deserializers extends MongoConversionHelper {
  override def register() = {
    super.register()
  }
  override def unregister() = {
    super.unregister()
  }
}

trait DescriptionSerializer extends MongoConversionHelper {
  private val transformer = new Transformer {
    def transform(o: AnyRef): AnyRef = o match {
      case d: Description => d.attributes.asInstanceOf[AnyRef]
      case _ => o
    }
  }

  override def register() = {
    BSON.addEncodingHook(classOf[Description], transformer)
    super.register()
  }
}
When I call RegisterCustomConversionHelpers() then save a Person I don't get any errors, it just has no effect, saving the document the same way as ever. This also seems like quite a lot to have to do for what I want.

Salat maintainer here.
I don't understand the value of Description as a wrapper here. It wraps a map of attributes, overrides the default equals and hashCode of a case class (which seems unnecessary, since the implementation it supplies just delegates to the map, and that is exactly what the generated case class implementation does anyway), and introduces an additional layer of indirection in the serialized object.
Have you considered just:
case class Person(val name: String, val description: Map[String, String])
This will do exactly what you want out of the box.
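A minimal sketch of what that serializes to, using the standard Salat grater API with the default global context (imports as in a typical Salat setup, not taken from your project):

import com.novus.salat._
import com.novus.salat.global._  // default implicit Salat Context

// assuming the Person definition above
val dbo = grater[Person].asDBObject(
  Person("Russell", Map("hair" -> "brown", "favourite_color" -> "blue")))

// dbo renders as:
// { "name" : "Russell", "description" : { "hair" : "brown", "favourite_color" : "blue" } }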
In another situation I would recommend a simple type alias but unfortunately Salat can't support type aliases right now due to some issues with how they are depicted in pickled Scala signatures.
(You probably omitted this from your example for brevity, but it is best practice for your Mongo model to have an _id field; if you don't supply one, the Mongo Java driver will supply one for you.)
There is a working example of a custom BSON hook in the salat-core test package (it handles java.net.URL). It could be that your hook is not working simply because you are not registering it in the right place? But still, I would recommend getting rid of Description unless it is adding some value that is not evident from your example above.

Based on @prasinous's answer I decided this wasn't going to be that easy, so I've changed my design a bit to the following, which pretty much gets me what I want. Rather than persisting the Description as a field, I persist a vanilla map and then mix a Described trait into the model classes I want to have a description, which automatically converts the map to a Description when the object is created. I would appreciate it if anyone can point out any obvious problems with this approach or suggest improvements.
class Description(val attributes: Map[String, String]) {
  // rest of class omitted
}

trait Described {
  val attributes: Map[String, String]
  val description = new Description(attributes)
}

case class Person(name: String, attributes: Map[String, String]) extends Described
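A quick usage sketch of that design (hypothetical values); since attributes is a constructor parameter of the case class, it is initialized before the Described trait's initializer runs, so description should see the populated map:

val p = Person("Russell", Map("hair" -> "brown", "favourite_color" -> "blue"))

p.description.attributes("hair")           // "brown"
p.attributes == p.description.attributes   // true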

Related

Scala Type Classes Understanding Interface Syntax

I was reading about Cats and encountered the following code snippet, which is about serializing objects to JSON.
It starts with a trait like this:
trait JsonWriter[A] {
  def write(value: A): Json
}
After this, there are some type class instances for our domain objects:
final case class Person(name: String, email: String)

object JsonWriterInstances {
  implicit val stringWriter: JsonWriter[String] =
    new JsonWriter[String] {
      def write(value: String): Json =
        JsString(value)
    }

  implicit val personWriter: JsonWriter[Person] =
    new JsonWriter[Person] {
      def write(value: Person): Json =
        JsObject(Map(
          "name" -> JsString(value.name),
          "email" -> JsString(value.email)
        ))
    }

  // etc...
}
So far so good! I can then use this like this:
import JsonWriterInstances._

Json.toJson(Person("Dave", "dave@example.com"))
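(These snippets assume a small Json algebra and a Json interface object that the book defines separately; roughly the following, treated here as an assumption rather than part of the question:)

sealed trait Json
final case class JsObject(get: Map[String, Json]) extends Json
final case class JsString(get: String) extends Json
final case class JsNumber(get: Double) extends Json
case object JsNull extends Json

object Json {
  // interface object: summon a JsonWriter[A] and delegate to it
  def toJson[A](value: A)(implicit w: JsonWriter[A]): Json =
    w.write(value)
}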
Later on I come across something called the interface syntax, which uses extension methods to extend existing types with interface methods like below:
object JsonSyntax {
  implicit class JsonWriterOps[A](value: A) {
    def toJson(implicit w: JsonWriter[A]): Json =
      w.write(value)
  }
}
This then simplifies the call to serializing a Person as:
import JsonWriterInstances._
import JsonSyntax._

Person("Dave", "dave@example.com").toJson
What I don't understand is how the Person is wrapped in JsonWriterOps so that I can call toJson directly, as though toJson were defined on the Person case class itself. I like this magic, but I fail to understand this last step about JsonWriterOps. So what is the idea behind this interface syntax, and how does it work? Any help?
This is actually a standard Scala feature: since JsonWriterOps is marked implicit and is in scope, the compiler can apply the conversion at compile time when needed.
Hence scalac will do the following transformations:
Person("Dave", "dave#example.com").toJson
new JsonWriterOps(Person("Dave", "dave#example.com")).toJson
new JsonWriterOps[Person](Person("Dave", "dave#example.com")).toJson
Side note:
It's much more efficient to define implicit classes as value classes, like this:
implicit class JsonWriterOps[A](val value: A) extends AnyVal
This also lets the compiler optimize away the object construction where possible, compiling the whole implicit conversion plus method call down to a simple function call.
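Putting it together, a minimal sketch of the value-class version of the syntax object (same names as above, with only the val and AnyVal parts added):

object JsonSyntax {
  // the wrapped parameter must be a val for a value class
  implicit class JsonWriterOps[A](val value: A) extends AnyVal {
    def toJson(implicit w: JsonWriter[A]): Json =
      w.write(value)
  }
}

// usage is unchanged:
// import JsonWriterInstances._
// import JsonSyntax._
// Person("Dave", "dave@example.com").toJson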

Guice and Injection within Objects or Case Classes

This is probably not a good design I guess, but I have inherited it nonetheless.
The code pattern is as follows
case class XYZ(id: String, name: String) {
  import XYZ._

  def someUserDef: String = {
    preprocess(id, name)
  }
}

object XYZ {
  val someConfiguration = GlobalContext.injector.instanceOf[ApplicationConfigurationWrapper].someProperty

  def preprocess(id: String, name: String) = {
    // Do some stuff using `someConfiguration` value.
  }
}
The GlobalContext here is a pattern I used from this answer here.
Question:
The core issue is how to cleanly inject dependencies into objects or case classes. As depicted here, the case class depends on some configuration-level item.
In my opinion this approach has drawbacks; in particular, it makes it hard to write clean specs.
It seems to me that it is better to remove this dependency/logic and move it to an external class or object similar to GlobalContext.
Is there a better approach than this?
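One direction, in line with moving the logic out of the companion object, is to resolve the configuration once where injection is naturally available and pass it in explicitly, so specs can supply a plain value. A rough sketch under that assumption (the service name is hypothetical; with Guice its constructor would typically carry an @Inject annotation):

// hypothetical refactoring sketch
class XYZPreprocessor(config: ApplicationConfigurationWrapper) {
  private val someConfiguration = config.someProperty

  def preprocess(id: String, name: String): String = {
    // do some stuff using someConfiguration, id and name
    s"$someConfiguration:$id:$name"
  }
}

// the case class no longer reaches out to GlobalContext
case class XYZ(id: String, name: String) {
  def someUserDef(preprocessor: XYZPreprocessor): String =
    preprocessor.preprocess(id, name)
}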

How to design immutable model classes when using inheritance

I'm having trouble finding an elegant way of designing some simple classes to represent HTTP messages in Scala.
Say I have something like this:
abstract class HttpMessage(headers: List[String]) {
  def addHeader(header: String) = ???
}

class HttpRequest(path: String, headers: List[String])
  extends HttpMessage(headers)

new HttpRequest("/", List("foo")).addHeader("bar")
How can I make the addHeader method return a copy of itself with the new header added? (and keep the current value of path as well)
Thanks,
Rob.
It is annoying, but the solution to implement the pattern you need is not trivial.
The first point to notice is that if you want to preserve your subclass type, you need to add an abstract type member. Without this, you cannot give addHeader a return type that is still unknown in HttpMessage:
abstract class HttpMessage(headers: List[String]) {
  type X <: HttpMessage
  def addHeader(header: String): X
}
Then you can implement the method in your concrete subclasses where you will have to specify the value of X:
class HttpRequest(path: String, headers: List[String])
  extends HttpMessage(headers) {
  type X = HttpRequest
  def addHeader(header: String): HttpRequest = new HttpRequest(path, headers :+ header)
}
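A quick usage sketch with the classes above, showing that the concrete type is preserved:

// addHeader returns HttpRequest (the value of X), not just HttpMessage,
// so path is carried over into the copy
val withHeader: HttpRequest = new HttpRequest("/", List("foo")).addHeader("bar")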
A better, more scalable solution is to use implicits for this purpose.
trait HeaderAdder[T <: HttpMessage] {
  def addHeader(httpMessage: T, header: String): T
}
and now you can define your method on the HttpMessage class like the following:
abstract class HttpMessage(headers: List[String]) {
  type X <: HttpMessage
  // `this` is only known to be an HttpMessage here, so a cast to X is needed
  // (alternatively, tie X to the concrete subclass with a self-type)
  def addHeader(header: String)(implicit headerAdder: HeaderAdder[X]): X =
    headerAdder.addHeader(this.asInstanceOf[X], header)
}
This latter approach is based on the type class concept and scales much better than inheritance. The idea is that you are not forced to have a valid HeaderAdder[T] for every T in your hierarchy, and if you try to call the method on a class for which no implicit is available in scope, you will get a compile-time error.
This is great, because it saves you from having to implement addHeader = sys.error("This is not supported") for the classes in the hierarchy where it makes no sense, or from refactoring the hierarchy to avoid that.
The best way to manage implicits is to put them in a trait like the following:
trait HeaderAdders {
  implicit val httpRequestHeaderAdder: HeaderAdder[HttpRequest] = new HeaderAdder[HttpRequest] { ... }
  implicit val httpWhatHeaderAdder: HeaderAdder[HttpWhat] = new HeaderAdder[HttpWhat] { ... }
}
and then you also provide an object, in case the user cannot mix the trait in (for example, if you have frameworks that inspect the properties of the object through reflection, you don't want extra members added to your current instance); see the selfless trait pattern: http://www.artima.com/scalazine/articles/selfless_trait_pattern.html
object HeaderAdders extends HeaderAdders
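As a rough sketch of what one of those elided instance bodies could look like (this assumes HttpRequest exposes its constructor parameters as vals, which the snippets above do not show):

implicit val httpRequestHeaderAdder: HeaderAdder[HttpRequest] =
  new HeaderAdder[HttpRequest] {
    def addHeader(httpMessage: HttpRequest, header: String): HttpRequest =
      new HttpRequest(httpMessage.path, httpMessage.headers :+ header)
  }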
So for example you can write things such as
// mixing example
class MyTest extends HeaderAdders // who cares about having two extra values in the object

// import example
import HeaderAdders._
class MyDomainClass // implicits are in scope, but not mixed into MyDomainClass, so reflection from Hibernate will still work correctly
By the way, this design problem is the same one Scala collections have, with the only difference that there the role of your HttpMessage is played by TraversableLike. Have a look at this question: Calling map on a parallel collection via a reference to an ancestor type

How to update a mongo record using Rogue with MongoCaseClassField when case class contains a scala Enumeration

I am upgrading existing code from Rogue 1.1.8 to 2.0.0 and lift-mongodb-record from 2.4-M5 to 2.5.
I'm having difficulty writing a MongoCaseClassField that contains a Scala enum, and I could really use some help with it.
For example,
object MyEnum extends Enumeration {
  type MyEnum = Value
  val A = Value(0)
  val B = Value(1)
}

case class MyCaseClass(name: String, value: MyEnum.MyEnum)

class MyMongo extends MongoRecord[MyMongo] with StringPk[MyMongo] {
  def meta = MyMongo

  class MongoCaseClassFieldWithMyEnum[OwnerType <: net.liftweb.record.Record[OwnerType], CaseType](rec: OwnerType)(implicit mf: Manifest[CaseType])
    extends MongoCaseClassField[OwnerType, CaseType](rec)(mf) {
    override def formats = super.formats + new EnumSerializer(MyEnum)
  }

  object myCaseClass extends MongoCaseClassFieldWithMyEnum[MyMongo, MyCaseClass](this)
  // ...
}
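(For reference, a sketch of the imports the snippet above presumably relies on; EnumSerializer is the lift-json-ext serializer that stores the enumeration by its id:)

import net.liftweb.json.ext.EnumSerializer
import net.liftweb.mongodb.record.MongoRecord
import net.liftweb.mongodb.record.field.{ MongoCaseClassField, StringPk }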
When we try to write to this field, we get the following error:
could not find implicit value for evidence parameter of type
com.foursquare.rogue.BSONType[MyCaseClass]
.and(_.myCaseClass setTo myCaseClass)
We used to have this working in Rogue 1.1.8, by using our own version of the MongoCaseClassField, which made the #formats method overridable. But that feature was included in lift-mongodb-record in 2.5-RC6, so we thought this should just work now?
Answer coming from: http://grokbase.com/t/gg/rogue-users/1367nscf80/how-to-update-a-record-with-mongocaseclassfield-when-case-class-contains-a-scala-enumeration#20130612woc3x7utvaoacu7tv7lzn4sr2q
Reproduced here on Stack Overflow for convenience:
Sorry, I should have chimed in here sooner.
One of the long-standing problems with Rogue was that it was too easy to accidentally make a field that was not serializable as BSON, and have it fail at runtime (when you try to add that value to a DBObject) rather than at compile time.
I introduced the BSONType type class to try to address this. The upside is it catches BSON errors at compile time. The downside is you need to make a choice when it comes to case classes.
If you want to do this the "correct" way, define your case class plus a BSONType "witness" for that case class. To define a BSONType witness, you need to provide serialization from that type to a BSON type. Example:
case class TestCC(v: Int)

implicit object TestCCIsBSONType extends BSONType[TestCC] {
  override def asBSONObject(v: TestCC): AnyRef = {
    // Create a BSON object
    val ret = new BasicBSONObject

    // Serialize all the fields of the case class
    ret.put("v", v.v)
    ret
  }
}
That said, this can be quite burdensome if you're doing it for each case class. Your second option is to define a generic witness that works for any case class, if you have a generic serialization scheme:
implicit def CaseClassesAreBSONTypes[CC <: CaseClass]: BSONType[CC] =
  new BSONType[CC] {
    override def asBSONObject(v: CC): AnyRef = {
      // your generic serialization code here, maybe involving formats
    }
  }
Hope this helps,
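For what such a generic serialization scheme might look like with lift-json (a sketch only: CaseClass stands for whatever marker trait your case classes share, and the choice of DefaultFormats is an assumption; you would add your EnumSerializer to it as in the field above):

import net.liftweb.json.{ DefaultFormats, Extraction, Formats }
import net.liftweb.json.JsonAST.JObject
import net.liftweb.mongodb.JObjectParser

implicit def CaseClassesAreBSONTypes[CC <: CaseClass]: BSONType[CC] =
  new BSONType[CC] {
    implicit val formats: Formats = DefaultFormats

    override def asBSONObject(v: CC): AnyRef =
      // decompose the case class to a JValue, then turn it into a DBObject
      JObjectParser.parse(Extraction.decompose(v).asInstanceOf[JObject])
  }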

How to share behavior over case classes in scala

Implementing my domain model in Scala using case classes, I got
abstract class Entity {
  val _id: Option[BSONObjectID]
  val version: Option[BSONLong]
}
and several case classes defining the different entities like
case class Person(
  _id: Option[BSONObjectID],
  name: String,
  version: Option[BSONLong]
) extends Entity
What I need is a way to set the _id and version later on from a generic method which operates on an Entity, because I have to share this behavior across all Entities and want to avoid writing it down hundreds of times ;-). I would love to be able to do:
def createID(entity: Entity): Entity = {
  entity.copy(_id = ..., version = ...)
}
...but of course this does not compile, since an Entity has no copy method; it is generated for each individual case class by the compiler...
What is the best way to achieve this in scala?
To preempt the question: I have to use case classes since this is what the third-party library extracts for me from the requests I get, and the case class instances are what is serialized back to BSON / MongoDB later on...
Indeed one can find a way to implement something like this at Create common trait for all case classes supporting copy(id=newId) method, but since it is quite complicated for my use case I preferred to just create two new classes:
class MongoId(var id: BSONObjectID = null) {
  def generate = {
    id = BSONObjectID.generate
  }
}

class MongoVersion(var version: Long = 0) {
  def update = {
    version = System.currentTimeMillis
  }
}
and implement the shared behavior regarding these fields there. Of course you have to change the definition of your base class accordingly:
abstract class Entity {
  def _id: MongoId
  def version: MongoVersion
}
To be clear: this works only if the behavior you want to share across several case classes affects (in my case, changes) only one attribute...
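A small sketch of the kind of shared behavior this enables (the helper name is hypothetical); because MongoId and MongoVersion are mutable, the generic code mutates them in place instead of copying the case class:

def prepareForPersist(entity: Entity): Entity = {
  entity._id.generate      // assigns a fresh BSONObjectID via the shared MongoId
  entity.version.update    // stamps the current time via the shared MongoVersion
  entity
}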
Would implementing a trait work?
trait MongoIdHandler {
  def createId(entity: Entity): Option[BSONObjectID] = { ... }
  def setVersion(version: String): Unit = { ... }
}
case class Person(...) extends MongoIdHandler
If any of the instances require specialized versions of the id generator they can override the 'default' impl provided by the trait.