ReactiveMongo Macros.handler crashes after new field is added - mongodb

I recently added a new field to my Scala case class, since I want to start keeping track of it in my MongoDB.
Let's say the case class is like this:
case class MyItem(
  var existingField: String,
  var newlyAddedField: String
)
I use this to serialize/deserialize JSON and BSON to my object:
object MyItem {
  import play.api.libs.json.Json
  import reactivemongo.bson.Macros

  // generate readers and writers:
  implicit val jsonFormats = Json.format[MyItem]
  implicit val bsonFormats = Macros.handler[MyItem]
}
Since all the existing documents in my DB don't have newlyAddedField, I get a runtime exception:
reactivemongo.bson.exceptions.DocumentKeyNotFound: The key 'newlyAddedField' could not be found in this document or array
Could anyone help? I've read about writing my own serialization/deserialization, but I'm not sure how to do that, as I am using the Play Framework, whose syntax and conventions differ across versions. I hope there is a simpler way, as adding a field should be common in a NoSQL DB. Btw, I am using Play Framework 2.5.
Thanks in advance!

AFAIU, Macros.handler is null-safe, i.e. it does not set the value to null when the field is missing (it fails instead). I think the simplest and cleanest fix in Scala is to declare your new field as Option[String], so that everyone (including the macro code generator) can see that this field might be absent. And this seems to be what the doc suggests as well.
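For illustration, here is a minimal sketch of the case class with the new field made optional (plain Scala, no ReactiveMongo on the classpath; the Demo object is just to show how absence is handled downstream):

```scala
// Declaring the new field as Option[String] lets it be absent:
// old documents without the field deserialize to None.
case class MyItem(
  existingField: String,
  newlyAddedField: Option[String] // absent in old documents => None
)

object Demo extends App {
  val oldDoc = MyItem("foo", None)        // old document: field missing
  val newDoc = MyItem("bar", Some("baz")) // new document: field present

  // Downstream code handles absence explicitly, e.g. via a default:
  println(oldDoc.newlyAddedField.getOrElse("<default>"))
  println(newDoc.newlyAddedField.getOrElse("<default>"))
}
```

With the field declared this way, Macros.handler[MyItem] derives a handler that tolerates old documents instead of throwing DocumentKeyNotFound.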


Scala Twenty Two Again

Scala case classes can have more than 22 properties these days, but AFAIA the compiler does not generate apply/unapply methods in that case.
Is there a way to generate apply/unapply by means of a plugin at compile time, or at least generate the methods using an IDE, etc.?
Note
Please don't ask why I need this: it is for mapping an existing JSON schema from MongoDB using ReactiveMongo.
Please don't advise grouping properties into smaller case classes, etc. The schema was created by someone else and already exists in production.
Thank you for your answers in advance.
Yes, Scala supports more than 22 fields from version 2.11. However, there are certain limitations: the case class will no longer have unapply, unapplySeq, or tupled (so you can no longer convert the case class to a tuple), because Scala still doesn't support tuples with more than 22 values.
val tup = (1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22) //will compile
val tup = (1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23) //will fail
Because of this, the case class behaves much more like a regular class, and many other libraries, such as JSON serializers, are unable to fully utilize it.
I faced this issue when I tried to use the macro reads/writes to serialize a case class to JSON and vice versa in a Play Framework project: it wouldn't compile because the case class no longer contains an unapply() method. The workaround is to provide custom implicit reads/writes for the case class instead of using the macros.
case class Person(name: String, age: Int, lovesChocolate: Boolean)

import play.api.libs.json._
import play.api.libs.functional.syntax._

// implicit val personReads = Json.reads[Person] // won't compile for 22+ fields; write a custom Reads as below:

implicit val personReads: Reads[Person] = (
  (__ \ "name").read[String] and
  (__ \ "age").read[Int] and
  (__ \ "lovesChocolate").read[Boolean]
)(Person.apply _)
Please don't ask why I need this: it is for mapping an existing JSON
schema from MongoDB using ReactiveMongo.
I'm assuming yours is the same situation: you are using the ReactiveMongo macros for JSON to/from case class serialization.
implicit val personReader: BSONDocumentReader[Person] = Macros.reader[Person]
implicit val personWriter: BSONDocumentWriter[Person] = Macros.writer[Person]
//or Handler
Macros.handler[Person]
Therefore, I would suggest you use custom BSON readers and writers for the case class, as documented here.
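As a rough illustration of the shape of a hand-rolled reader/writer pair (a sketch only: it uses a plain Map[String, Any] to stand in for a BSONDocument so it compiles without ReactiveMongo, and the PersonCodec name is made up):

```scala
case class Person(name: String, age: Int, lovesChocolate: Boolean)

object PersonCodec {
  // Stand-in for a BSONDocumentWriter[Person]: case class -> document.
  def write(p: Person): Map[String, Any] = Map(
    "name" -> p.name,
    "age" -> p.age,
    "lovesChocolate" -> p.lovesChocolate
  )

  // Stand-in for a BSONDocumentReader[Person]: document -> case class,
  // reading each field by key. This is roughly what the macro generates,
  // but written by hand it works even when apply/unapply are missing.
  def read(doc: Map[String, Any]): Person = Person(
    name = doc("name").asInstanceOf[String],
    age = doc("age").asInstanceOf[Int],
    lovesChocolate = doc("lovesChocolate").asInstanceOf[Boolean]
  )
}
```

The real versions implement BSONDocumentReader[Person] and BSONDocumentWriter[Person] against BSONDocument, field by field, in the same way.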

Scala immutability in persistent storage with Squeryl

So as I am reading the Play for Scala book, I came across something odd that was explained in the book. This is the relevant snippet:
There's something strange going on, though. If you're using immutable
classes—which vanilla case classes are—you might be worried when you
discover that Squeryl updates your object's supposedly immutable id
field when you insert the object. That means that if you execute the
following code,
val myImmutableObject = Product(0, 5010255079763,
"plastic coated blue", "standard paperclip, coated with blue plastic")
Database.productsTable.insert(myImmutableObject)
println(myImmutableObject)
the output will unexpectedly be something like: Product(13,
5010255079763, "plastic coated blue", "standard paperclip, coated with
blue plastic"). This can lead to bad situations if the rest of your
code expects an instance of one of your model classes to never change.
In order to protect yourself from this sort of stuff, we recommend you
change the insert methods we showed you earlier into this:
def insert(product: Product): Product = inTransaction {
  val defensiveCopy = product.copy()
  productsTable.insert(defensiveCopy)
}
My question is, given that the product class is defined like this:
import org.squeryl.KeyedEntity
case class Product(
id: Long,
ean: Long,
name: String,
description: String) extends KeyedEntity[Long]
Database object is defined like this:
import org.squeryl.Schema
import org.squeryl.PrimitiveTypeMode._
object Database extends Schema {
val productsTable = table[Product]("products")
...
on(productsTable) { p => declare {
p.id is(autoIncremented)
}}
}
How then is it possible that an instance of a case class, assigned to a val, can have one of its fields changed? Is Squeryl using reflection of some sort to change the field, or is the book mistaken somehow?
I am not able to run the examples to verify what the case might be, but someone who has used Squeryl can perhaps give an answer?
You can check the definition of table method for yourself:
https://github.com/squeryl/squeryl/blob/master/src/main/scala/org/squeryl/Schema.scala#L345
It's a generic function which does use reflection to instantiate the Table object bound to the given case class. Functions are first-class citizens in Scala, so they can be assigned to a val just like anything else.
The last fragment is an anonymous function, which maps a given argument to some modifications declared for it.
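To see that reflection can indeed mutate a val field of a case class instance, here is a standalone sketch (not Squeryl's actual internals; the Item class and bumpId helper are hypothetical):

```scala
// A case class with an immutable-looking field, like the book's Product.
case class Item(id: Long, name: String)

object ReflectionDemo {
  def bumpId(item: Item, newId: Long): Item = {
    // Scala's `val id` compiles to a private final JVM field named "id".
    // Java reflection can still write to a non-static final field after
    // setAccessible(true); this is the kind of trick an ORM can use to
    // set a database-generated key on an existing instance.
    val field = classOf[Item].getDeclaredField("id")
    field.setAccessible(true)
    field.setLong(item, newId)
    item // the very same instance, with its "immutable" id changed
  }
}
```

After ReflectionDemo.bumpId(myItem, 13L), myItem.id reports 13, even though id was declared as a val; this matches the surprising behavior the book describes.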

When to use object and when to use class in Scala

I have a service object called MyService with functions defined that are used by my Play application's controllers. One particular function in MyService is parsing some text, and turning it into a JSON object. So my process will be:
Parse some text containing unstructured book info (title, author etc) into some Scala objects (Book objects)
Convert the Book objects into JSON format
Return the JSON
What I am wondering is, in the step where I parse the text and create my Scala objects, how should I define them? If this was Java I would just have an inner class named 'Book', but with Scala I don't know whether I should define an inner object or inner class inside my MyService object, and I don't know why/when I would choose one over the other.
Could someone explain when to use an object and when to use a class?
You use object when you want EXACTLY ONE instance of your class.
Objects can't have constructor parameters, and their methods and values are accessed via MyObject.myMethod.
In Java you would use the singleton pattern to achieve what object gives you in Scala.
There you would have something like MyObject.getInstance().myMethod.
In your case you want to parse information into a class. I would make the parser an object (assuming the parsing process is static).
The result, however, is not static, since it depends on the parsed data, so Book should definitely be a class. (If there were exactly one Book, parsing would not make much sense, would it?)
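A minimal sketch of that split (the names and the naive parsing format are illustrative, not from the question):

```scala
// Many books: a class (here a case class); one instance per parsed book.
case class Book(title: String, author: String)

// One shared parser: an `object` is a single instance, like a Java singleton.
object BookParser {
  // Deliberately naive parsing of a "title;author" string, for illustration.
  def parse(raw: String): Book = {
    val parts = raw.split(";")
    Book(parts(0).trim, parts(1).trim)
  }
}
```

Usage: BookParser.parse("Moby-Dick; Herman Melville") returns a fresh Book instance, while BookParser itself is the one and only parser.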
@Kigyo's answer is fine, but there's something else that should be addressed here. You're working within an MVC framework, and the Book you're describing sounds exactly like a model. The MyService object you describe is starting to sound very bloated, and serializing a Book as JSON is something you can do entirely within a Play model.
package models
import play.api.libs.json._
case class Book(title: String, author: String, pages: Int)
object Book {
/** Use one of Play's json macros to define an implicit serializer/deserializer */
implicit val jsonFormat: Format[Book] = Json.format[Book]
def parse(rawText: String): Book = {
  // Move your book-parsing logic here and construct the new Book instance.
  ???
}
}
Then in your controller function you would do something like:
import play.api.libs.json._
val rawText: String = ... // The raw text you're parsing the book data from.
val book: Book = Book.parse(rawText) // The parsed Book
val json: JsValue = Json.toJson(book) // The Book as a Play JSON object (can be returned as a JSON response)
val jsonString: String = Json.stringify(json) // The Book JSON as a String object, in case you need that..
This is a little more verbose than it needs to be, but I separated it out line by line to illustrate what's happening at each step. If all you wanted was the JSON as a string, Json.stringify(Json.toJson(Book.parse(rawText))) would suffice.
Now the Book logic is self-contained, and not cluttering up another object.

Case classes for formatting json - with and without the object id

I am writing a play2 app that gets data via rest/json and stores it in mongodb using reactivemongo.
I am using a model built from case classes and implicit val myFormat = Json.format[myCaseClass]
Currently I have a case class for objects coming from MongoDB. They contain the _id field and everything works. New objects coming in naturally don't have this id field, so the Json.fromJson[myCaseClass](req.body) validator fails.
Do I really have to create another case class for new objects, or is there a more DRY and elegant solution that avoids duplicating the class just to remove the _id?
I would use the JSON combinator API and create a JSON format, or maybe even just a Reads[T], that handles incoming, possibly id-less fields. Something like:
implicit val readsMyClass: Reads[MyClass] = (
  (__ \ "id").readNullable[Id] and
  (__ \ "someProperty").read[String]
)(create _)

def create(maybeId: Option[Id], someProperty: String) =
  MyClass(maybeId.getOrElse(...generate id...), someProperty)
See the docs for more info: http://www.playframework.com/documentation/2.2.x/ScalaJsonCombinators
I followed the suggestions, and _id: Option[BSONObjectID] does the trick.
It was not necessary to implement a reader, because the macro behind implicit val userFormat = Json.format[User] handles the Option.

squeryl date to long conversion

I want to store java.util.Date (or Timestamp) as an int (or Long) in my db using Squeryl.
I'd like to control how the date is transformed to the number and vice versa.
How can I achieve this?
I'm pretty new to scala/squeryl, coming from java/hibernate.
Back in Java/Hibernate, I could create user types and either register them globally or use them locally on a field with an annotation. Such a user type defined how to persist the object type to the DB and how to load it back.
I read some of the Squeryl and Scala docs and noticed two things:
there are custom types
there is an implicit function mechanism that is called for conversions
I know one of these can help me, but I didn't find any good, full examples to understand how.
Any help is appreciated!
Please see this example:
https://github.com/max-l/squeryl-extended-field-types-example/blob/master/src/main/scala/example/MyCustomTypes.scala
In your case, you substitute TLong for TTimestamp (the backing JDBC type) and Date for DateTime (though you might want to consider using the Joda date).
implicit val dateAsLongTEF = new NonPrimitiveJdbcMapper[Long, Date, TLong](longTEF, this) {
  def convertFromJdbc(v: Long) = new Date(v)
  def convertToJdbc(v: Date) = v.getTime
}

implicit val optionDateAsLongTEF =
  new TypedExpressionFactory[Option[Date], TOptionLong]
    with DeOptionizer[Long, Date, TLong, Option[Date], TOptionLong] {
      val deOptionizer = dateAsLongTEF
    }
Note: the fact that you use TLong and TOptionLong means that you'll be able to
compare a number column to a long-backed column in the DSL.
Update: there is a limitation that prevents re-registering a primitive type,
so you'll need a wrapper type; I updated the example in the GitHub project.