How can I store information about the class of an object when I save it to Mongo? I'm using Scala and Play.
More details: let's say we have a trait User and two implementations, Admin and Member, and we try to save them to a single Mongo collection.
class UserDao {
  private def collection = ReactiveMongoPlugin.db.collection[JSONCollection]("users")

  def save(user: User): Future[User] = {
    collection.save(user) // Fail: no implicit Writes for the User trait in scope
  }
}
And we get an error, because we need Reads and Writes for the trait, which are really ugly and complicated things to write.
I previously wrote the application in Java with Spring Data MongoDB and there were no problems: Spring automagically adds a _class field to the BSON object stored in Mongo, and when reading from the collection Spring knows which class it should create. In Play there is nothing like that.
Please, help me..
I found information that this is impossible in the Play ReactiveMongo driver, and in Play's Mongo support generally. So there is only one way: write your own driver, or think about a different solution.
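One workaround is to hand-write a Format for the trait that emulates Spring's _class discriminator: the Writes adds the field before the document is stored, and the Reads dispatches on it. A rough sketch, assuming Admin and Member are simple case classes (the field names and the "_class" values here are just illustrative):

import play.api.libs.json._

sealed trait User
case class Admin(name: String) extends User
case class Member(name: String) extends User

object User {
  val adminFormat = Json.format[Admin]
  val memberFormat = Json.format[Member]

  // Emulate Spring's _class field: write a discriminator, dispatch on it when reading.
  implicit val userFormat: Format[User] = new Format[User] {
    def writes(user: User): JsValue = user match {
      case a: Admin  => Json.toJson(a)(adminFormat).as[JsObject] + ("_class" -> JsString("Admin"))
      case m: Member => Json.toJson(m)(memberFormat).as[JsObject] + ("_class" -> JsString("Member"))
    }
    def reads(json: JsValue): JsResult[User] = (json \ "_class").asOpt[String] match {
      case Some("Admin")  => Json.fromJson(json)(adminFormat)
      case Some("Member") => Json.fromJson(json)(memberFormat)
      case _              => JsError("missing or unknown _class discriminator")
    }
  }
}

With a Format like this in scope, the JSONCollection can serialize a User, and documents come back as the right subtype based on the stored _class field.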
Related
I'm trying to use the Play-ReactiveMongo plugin to read/write simple records in MongoDB with Play and Angular. The plugin seems like a nice option as it allows you to use simple case classes and regular JSON instead of explicitly converting between BSON and JSON. But the few examples of using the plugin don't seem to cover how to map the MongoDB Object ID to/from JSON within the same framework. This all seems to work with a load of implicit (= magic to me) Reads/Writes in the background, but they don't seem to handle the Object ID.
My code is based on Alex Lashford's Modern Web Template, and very similar to Stéphane Godbillon's example using JSON Reads/Writes, but neither Alex nor Stéphane shows anything to do with the MongoDB object ID.
I need some kind of unique ID for my data records, so I can fetch and update them etc, and it makes sense to use the one MongoDB provides, but I can't seem to find a way to use this cleanly within the Play ReactiveMongo plugin.
Does anybody know of an example that shows how to use Play ReactiveMongo plugin with JSON collections and some way to map the Object ID to/from JSON without having to convert all my processing to use BSON?
I've solved this issue by creating another case class:
case class Id($oid: String)
then use it as follows:
case class User(_id: Id, ...)
You have to have the Json converters in scope:
implicit val idFormat = Json.format[Id]
implicit val userFormat = Json.format[User]
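For illustration, a quick sketch of the round trip with those formats (the extra name field and the id value are made up for the example):

import play.api.libs.json._

case class Id($oid: String)
case class User(_id: Id, name: String)

implicit val idFormat = Json.format[Id]
implicit val userFormat = Json.format[User]

val json = Json.toJson(User(Id("530e9d2bc9f0a5d7f4a1b2c3"), "alice"))
// {"_id":{"$oid":"530e9d2bc9f0a5d7f4a1b2c3"},"name":"alice"}
val user = json.as[User] // back to User(Id(530e9d2bc9f0a5d7f4a1b2c3),alice)

The nested {"$oid": ...} shape is what Play-ReactiveMongo's JSON/BSON converters map to a real ObjectId in the stored document.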
I don't know why the ReactiveMongo team decided to have an ObjectId in BSON but not in JSON. Anyway, you can construct the JSON representation of the MongoDB ObjectId as follows:
import play.api.libs.json._
def objectId(id: String) = Json.obj("$oid" -> id)
yourCollection.find(Json.obj("_id" -> objectId(id))).cursor()...
The code below gives an error saying that the School class must implement the DBObject interface. The problem is that this interface has tons of methods. I have nearly 100 classes and I don't want to write millions of methods. Is there an easy way to save an object?
DBCollection table = db.getCollection("school");
School document = new School();
table.insert(document);
Instead of implementing DBObject or extending one of the existing implementations like BasicDBObject, you could have all objects which can be saved in the database have a method public DBObject toDBObject() which creates and returns a DBObject representation of the object. The BasicDBObject is a Map<String, Object> which handles the object data as key/value pairs, so it is a good candidate for this.
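For illustration, a minimal sketch of such a toDBObject method (sketched in Scala against the same Java driver; the School fields are made up):

import com.mongodb.{BasicDBObject, DBObject}

class School(val name: String, val studentCount: Int) {
  // Builds a key/value representation of this object that the driver can insert.
  def toDBObject: DBObject =
    new BasicDBObject("name", name).append("studentCount", studentCount)
}

// usage: table.insert(new School("Springfield High", 300).toDBObject)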
For a more generic solution, you could use reflection to create a method which can convert any Java object into a DBObject. To have more control over this, you could make up some annotations, add them to your classes and have your conversion method check them.
Now you have created your own object mapping framework for MongoDB. But why reinvent the wheel when others have already done it? So before you do this, check whether existing mapping frameworks like Morphia fulfill your use case - they likely do, and will save you hours of programming and weeks of debugging.
[opinion]
I usually despise object-relational mappers in the context of relational databases because of the impedance mismatch problem, but for schemaless databases like MongoDB they make a lot more sense, because you can store objects which have the same base class but also some different class-specific fields in the same collection without any ugly workarounds.
[/opinion]
I am using Spring Data for MongoDB to persist my domain objects. I was wondering if there is a way (perhaps with an Annotation?) to prevent Spring Data from persisting certain fields into MongoDB?
Does someone know how to do that or do I have to write my own Mapper?
Thanks.
In this case use the @Transient annotation for the field you need to ignore.
Look for more details here - Transient
In case you are looking for the actual package like I was, this one will work:
import org.springframework.data.annotation.Transient;
Which is from the Spring framework API documentation.
But this one, which is a JPA annotation, will not work with Spring Data MongoDB:
import javax.persistence.Transient;
Which is part of the Java Persistence API.
I'm working on a project with Scala, Salat, Casbah, Mongo, Play2, BackboneJS... But it's quite a lot of new things to learn at the same time... I'm OK with Scala, but I find my code crappy and I don't really know how to improve it.
Basically my use case is:
A MongoDB object is sent to the browser's JS code by Play2
The JS code updates the object data (through a Backbone model)
The JS sends the updated JSON back to the server (via Backbone's save method, received by Play with a JSON body parser)
The JSON received by Play should update the object in MongoDB
Some fields should not be updatable for security reasons (object id, creationDate...)
My problem is the last part.
I'm using case classes with Salat as a representation of the objects stored in MongoDB.
I don't really know how to handle the JSON I receive from the JS code.
Should I bind the JSON into the Salat case class and then ask Mongo to override the previous object data by the full new case class object?
If so is there a way with Play2 or Salat to automatically create back the case class from the received JSON?
Should I handle my JSON fields individually with $set for the fields I want to update?
Should I make the fields of my case class mutable? That's what we do in Java with Hibernate, for example: get the object from the DB, change its state, and save it. But that doesn't seem to be the appropriate way to do things in Scala...
If someone could give me some advice for my use case it would be nice, because I really don't know what to do :(
Edit: I asked a related question here: Should I represent database data with immutable or mutable data structures?
Salat handles JSON using lift-json - see https://github.com/novus/salat/wiki/SalatWithPlay2.
Play itself uses Jerkson, which is another way to decode your model objects - see http://blog.xebia.com/2012/07/22/play-body-parsing-with-jerkson/ for an example.
Feel free to make a small sample Github project that demonstrates your issue and post to the Salat mailing list at https://groups.google.com/group/scala-salat for help.
There are really two problems in your question:
How do I use Play Salat.
How do I prevent updates to certain fields.
The answer to your first question lies in the Play Salat documentation. Your second question could be answered a few ways.
a. When the update is pushed to the server from Backbone, you could grab the object id and find it in the database. At that point you have both copies of the object, and you can apply a business rule to make sure the sender didn't attempt to change those fields.
or
b. You could put some of your fields in another document or an embedded document. The client would have access to them for rendering purposes, but your API wouldn't allow them to be pushed back to Mongo.
or
c. You could write a custom update query that ignores the fields you don't want changed.
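For option (c), a minimal sketch with Casbah, whitelisting only the fields the client may change (the collection, field names, and incoming values are assumptions):

import com.mongodb.casbah.Imports._

val users = MongoClient()("mydb")("users")

// Only the whitelisted fields go into the $set; _id and creationDate are never touched.
def updateFromClient(id: ObjectId, name: String, email: String) =
  users.update(
    MongoDBObject("_id" -> id),
    MongoDBObject("$set" -> MongoDBObject("name" -> name, "email" -> email))
  )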
Actually the answer is pretty simple: I didn't know there was a built-in copy method on case classes that lets you copy an immutable case class while changing some of its data.
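For the copy approach, a rough sketch assuming a Salat-style case class and a DAO lookup (all names are illustrative):

import org.bson.types.ObjectId

case class Article(_id: ObjectId, creationDate: java.util.Date, title: String, body: String)

// The stored object comes from the DAO; only the client-editable fields are replaced.
def applyClientUpdate(stored: Article, newTitle: String, newBody: String): Article =
  stored.copy(title = newTitle, body = newBody) // _id and creationDate stay as they were in Mongo

The resulting object can then be saved back with the DAO, overwriting the stored document without letting the client touch the protected fields.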
I don't have nested case class structures but the Tony Morris suggestion of using Lenses seems nice too.
I've yet to use Morphia, but I'm considering it for a current project.
Suppose I have a POJO with a number of @Reference annotations and I ask Morphia to fetch the object graph from the database. If I then make another DAO or Datastore call and ask Morphia to fetch some object that was already instantiated in the first graph, would Morphia return a reference to the already-instantiated object or would it create a new instance?
If Morphia returns a new instance of the object each time, does anyone have a recommendation of how to best approach creating a Morphia-backed repository that won't duplicate already-instantiated objects?
As I see it, Morphia will re-read every reference.
This is one of the reasons why I created Morphium. I integrated a caching layer there, so if you read a reference it won't be read again (at least if you search by ID...)
We use Morphia in production, and there are two ways to make sure you don't load the references, which is something we came across too.
One is to use the lazy loading option when you define the @Reference element in your main class. This of course means that this behavior is 'global' to that object.
The better way to do this is to not define an @Reference using Morphia and instead manage the references yourself. Let me know if you need a code sample.
I've stopped using @Reference too and instead declare something like:
ObjectId itemId
rather than having an item field. This has two benefits: (1) it lets me define a getter through a helper getObject(...) method which I have written with object caching, and (2) it stores a simple ObjectId in the Mongo document rather than a full DBRef, which includes the collection name and is thus about twice the data size.
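A rough sketch of such a cached getObject helper (sketched in Scala; the load function standing in for the actual datastore lookup is an assumption):

import org.bson.types.ObjectId
import scala.collection.mutable

// A tiny cache wrapped around whatever lookup the datastore provides.
class CachedLookup[T](load: ObjectId => Option[T]) {
  private val cache = mutable.Map.empty[ObjectId, T]

  def getObject(id: ObjectId): Option[T] = cache.get(id) match {
    case hit @ Some(_) => hit               // already instantiated, reuse it
    case None =>
      val loaded = load(id)                 // first time: hit the database
      loaded.foreach(cache.update(id, _))
      loaded
  }
}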