I am refactoring an application to use the mongo-scala driver version 2.0 instead of reactivemongo. From working with reactivemongo and casbah I have come to expect to be able to update a document by providing a query to find the target document and a new document to update it with.
The method in Casbah looks like this (copied from here):
def update[A, B](q: A, o: B, <...some other stuff...>): TypeImports.WriteResult
Performs an update operation.
q: search query for old object to update
o: object with which to update q
It seems I cannot do this using the mongo-scala driver but instead must provide each field to be updated and its new value. The problem is that the only reliable way to update everything that may have changed is to pass in a new document.
So I wonder whether I am just missing something obvious or there really is no way to do what I want with the mongo-scala driver? Has anyone found a reasonable work-around for this missing functionality?
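Edit: for reference, the closest thing I have found so far is replaceOne, which takes a filter plus a whole replacement document rather than per-field updates. A minimal sketch of how I understand it (the database, collection and field names below are made up):

import org.mongodb.scala._
import org.mongodb.scala.model.Filters.equal

val client = MongoClient()
val coll: MongoCollection[Document] = client.getDatabase("mydb").getCollection("users")

// Swap the whole matched document for the new one, instead of $set-ing individual fields.
val newDoc = Document("_id" -> 1, "name" -> "Alice", "age" -> 30)
val result = coll.replaceOne(equal("_id", 1), newDoc).toFuture()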
I am writing a CRUD app using Quarkus and Mongo, and thus am using a MongoCollection to implement this.
I am utilizing Hibernate Validators for validation to ensure my data is as it should be.
The issue I am running into is that MongoCollection only provides updates using Bson (collection.updateOne(Bson search, Bson update)), not for the entire object. This would be fine, but it keeps me from properly using validators to ensure data integrity.
Until I hit this block, my idea for updating was to:
Ingest generic JSON, in the form of an ObjectNode, along with the object's id
Get the object to update
Use Jackson's built-in updating features to apply the updates from the given ObjectNode to the object
Validate the resulting state
Save the object to Mongo
However, this doesn't work when I can't update the whole object at once. Am I attacking this from the right angle? I've found a lot on how to do updates, but not a lot related to validation. I also see that I can specify validation rules on the Mongo side, but since I am fairly 'hands off' when using Mongo in this way, needing to apply special Bson validation isn't ideal.
Is it possible for me to just re-insert the updated object into Mongo using `collection.insertOne(object)`? This assumes that the object would have the same `_id` as the original. Would this update the object as intended, or are there side effects?
Edit: no, it is not. Mongo throws a duplicate key error.
Found it: what I wanted was `collection.findOneAndReplace()`.
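A rough sketch of the call, shown here with the MongoDB Scala driver; the Java MongoCollection used with Quarkus exposes the equivalent method, and the names below are made up:

import org.mongodb.scala._
import org.mongodb.scala.model.Filters.equal

val coll: MongoCollection[Document] =
  MongoClient().getDatabase("mydb").getCollection("things")

// Replaces the whole matched document (its _id stays the same) and
// returns the document that was replaced.
val replacement = Document("_id" -> "some-id", "name" -> "updated name")
val previous = coll.findOneAndReplace(equal("_id", "some-id"), replacement).toFuture()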
I'm trying to use the Play-ReactiveMongo plugin to read/write simple records in MongoDB with Play and Angular. The plugin seems like a nice option as it allows you to use simple case classes and regular JSON instead of explicitly converting between BSON and JSON. But the few examples of using the plugin don't seem to cover how to map the MongoDB Object ID to/from JSON within the same framework. This all seems to work with a load of implicit (= magic to me) Reads/Writes in the background, but they don't seem to handle the Object ID.
My code is based on Alex Lashford's Modern Web Template, and very similar to Stéphane Godbillon's example using JSON Reads/Writes, but neither shows anything to do with the MongoDB ObjectId.
I need some kind of unique ID for my data records, so I can fetch and update them etc, and it makes sense to use the one MongoDB provides, but I can't seem to find a way to use this cleanly within the Play ReactiveMongo plugin.
Does anybody know of an example that shows how to use Play ReactiveMongo plugin with JSON collections and some way to map the Object ID to/from JSON without having to convert all my processing to use BSON?
I've solved this issue by creating another case class:
case class Id($oid: String)
then use it as follows:
case class User(_id: Id, ...)
You have to have the JSON formats in implicit scope:
implicit val idFormat = Json.format[Id]
implicit val userFormat = Json.format[User]
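With those formats in scope, the nested $oid shape maps straight onto the case class. A small round-trip sketch (the name field and the id value are just placeholders):

import play.api.libs.json._

case class Id($oid: String)
case class User(_id: Id, name: String)

implicit val idFormat = Json.format[Id]
implicit val userFormat = Json.format[User]

// Reading a document as Mongo returns it...
val json = Json.parse("""{"_id": {"$oid": "5254f55c3900000000000000"}, "name": "Alice"}""")
val user = json.validate[User]   // yields a JsSuccess wrapping User(Id(...), "Alice")

// ...and writing it back produces the same {"$oid": ...} structure.
val out = Json.toJson(User(Id("5254f55c3900000000000000"), "Alice"))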
I don't know why the reactivemongo team decided to have an ObjectId in BSON, but not in JSON. Anyway, you can construct the json representation of the MongoDB ObjectId as follows:
import play.api.libs.json._
def objectId(id: String) = Json.obj("$oid" -> id)
yourCollection.find(Json.obj("_id" -> objectId(id))).cursor()...
I'm working on a project with Scala, Salat, Casbah, Mongo, Play2, BackboneJS... It's quite a lot of new things to learn at the same time. I'm OK with Scala, but I find my code crappy and I don't really know how to improve it.
Basically my usecase is:
A MongoDB object is sent to the browser's JS code by Play2
The JS code updates the object data (through a Backbone model)
The JS sends the updated JSON back to the server (sent by Backbone's save method, and received by Play with a json bodyparser)
The JSON received by Play should update the object in MongoDB
Some fields should not be updatable for security reasons (object id, creationDate...)
My problem is the last part.
I'm using case classes with Salat as a representation of the objects stored in MongoDB.
I don't really know how to handle the JSON I receive from the JS code.
Should I bind the JSON to the Salat case class and then ask Mongo to overwrite the previous object data with the full new case class object?
If so is there a way with Play2 or Salat to automatically create back the case class from the received JSON?
Should I handle my JSON fields individually with $set for the fields I want to update?
Should I make the elements of my case class mutable? That's what we do in Java with Hibernate, for example: get the object from the DB, change its state, and save it. But it doesn't seem to be the appropriate way to do things in Scala...
If someone can give me some advices for my usecase it would be nice because I really don't know what to do :(
Edit: I asked a related question here: Should I represent database data with immutable or mutable data structures?
Salat handles JSON using lift-json - see https://github.com/novus/salat/wiki/SalatWithPlay2.
Play itself uses Jerkson, which is another way to decode your model objects - see http://blog.xebia.com/2012/07/22/play-body-parsing-with-jerkson/ for an example.
Feel free to make a small sample Github project that demonstrates your issue and post to the Salat mailing list at https://groups.google.com/group/scala-salat for help.
There are really two problems in your question:
1. How do I use Play Salat?
2. How do I prevent updates to certain fields?
The answer to your first question lies in the Play Salat documentation. Your second question could be answered a few ways.
a. When the update is pushed to the server from Backbone, you could grab the object id and find it in the database. At that point you have both copies of the object and can fire a business rule to make sure the sender didn't attempt to change those fields.
or
b. You could put some of your fields in another document or an embedded document. The client would have access to them for rendering purposes, but your API wouldn't allow them to be pushed back to Mongo.
or
c. You could write a custom update query that ignores the fields you don't want changed.
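For what option (c) could look like, here is a rough Casbah sketch that only $sets a whitelist of fields (the collection and field names are made up); _id and creationDate simply never appear in the update, so the client can't touch them:

import com.mongodb.casbah.Imports._

// Only the fields the client is allowed to change are pushed back.
def updateArticle(coll: MongoCollection, id: ObjectId, title: String, body: String) =
  coll.update(
    MongoDBObject("_id" -> id),
    $set("title" -> title, "body" -> body)
  )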
Actually the answer is pretty simple: I didn't know there was a built-in copy method on case classes that lets you copy an immutable case class while changing some of its data.
I don't have nested case class structures but the Tony Morris suggestion of using Lenses seems nice too.
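For reference, the copy approach looks roughly like this (the case class and its fields are made up); the protected fields always come from the stored object, never from the client:

import java.util.Date
import org.bson.types.ObjectId

case class Article(_id: ObjectId, creationDate: Date, title: String, body: String)

// 'incoming' is the object bound from the client's JSON,
// 'stored' is the current version loaded from MongoDB.
def merge(stored: Article, incoming: Article): Article =
  incoming.copy(_id = stored._id, creationDate = stored.creationDate)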
I need to store a Scala class in Morphia. With annotations it works well unless I try to store a collection of _ <: Enumeration.
Morphia complains that it does not have serializers for that type, and I am wondering how to provide one. For now I changed the type of the collection to Seq[String] and fill it by invoking toString on every item in the collection.
That works well; however, I'm not sure if that is the right way.
This problem is common to several of the available layers of abstraction on top of MongoDB. It all comes back to one basic reason: there is no enum equivalent in JSON/BSON. Salat, for example, has the same problem.
In fact, the MongoDB Java driver does not support enums, as you can read in the discussion going on here: https://jira.mongodb.org/browse/JAVA-268 where you can see the problem is still open. Most of the frameworks I have seen for using MongoDB with Java do not implement low-level functionality such as this. I think this choice makes a lot of sense, because they leave you the choice of how to deal with data structures not handled by the low-level driver, instead of imposing one on you.
In general I feel that the absence of support comes not from a technical limitation but rather from a design choice. For enums, there are multiple ways to map them, each with its pros and cons, while for other data types it is probably simpler. I don't know the MongoDB Java driver in detail, but I guess supporting multiple "modes" would have required some refactoring (maybe that's why they are talking about a new version of serialization?).
These are two strategies I am thinking about:
If you want to index on an enum and minimize space usage, map the enum to an integer (not using the ordinal, please; see 'Can set enum start value in java').
If your concern is queryability in the mongo shell, because your data will be accessed by data scientists, you would rather store the enum using its string value.
To conclude, there is nothing wrong with adding an intermediate data structure between your native object and MongoDB. Salat supports this through CustomTransformers; with Morphia you may need to do the conversion explicitly. Go for it.
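As an illustration of that explicit conversion, a small sketch for a Scala Enumeration stored as strings (the enum itself is made up):

object Color extends Enumeration {
  val Red, Green, Blue = Value
}

// Store as plain strings...
def toDb(colors: Seq[Color.Value]): Seq[String] = colors.map(_.toString)

// ...and parse them back with withName when loading
// (withName throws NoSuchElementException on unknown values).
def fromDb(raw: Seq[String]): Seq[Color.Value] = raw.map(Color.withName)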
In Casbah, MongoDBObject has two methods, .getAs and .getAsOrElse, which return the relevant field's value as the type given in the type parameter.
val dbo:MongoDBObject = ...
dbo.getAs[String](param)
This must be using type casting, because we can get a Long as a String by giving String as the type parameter, which might cause a type cast exception at runtime. Is there any typesafe way to retrieve the value with its original type?
This should be possible, because the type information of the element is there in getAs's output.
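To make the problem concrete, a small sketch (the field is made up): getAs hands back whatever type I ask for because of erasure, so the only safe-ish alternative I see is matching on the raw value:

import com.mongodb.casbah.Imports._

val dbo: MongoDBObject = MongoDBObject("count" -> 42L)

// Compiles and "succeeds" due to type erasure; the cast only blows up
// later, when the value is actually used as a String.
val wrong: Option[String] = dbo.getAs[String]("count")

// Matching on the raw value recovers the real runtime type.
dbo.get("count") match {
  case Some(n: java.lang.Long) => println(s"long: $n")
  case Some(s: String)         => println(s"string: $s")
  case other                   => println(s"something else: $other")
}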
Check out this excellent presentation on Salat by its author. What you're looking for is Salat's grater, which can convert to and from DBObject.
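A rough sketch of what the grater gives you (assuming the default Salat context and a made-up case class):

import com.mongodb.casbah.Imports._
import com.novus.salat._
import com.novus.salat.global._

case class Company(name: String, yearFounded: Int)

// Case class -> DBObject and back, without touching individual fields.
val dbo: DBObject = grater[Company].asDBObject(Company("Acme", 1999))
val company: Company = grater[Company].asObject(dbo)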
Disclaimer: I am biased, as I'm the author of Subset.
I built this small library, "Subset", precisely to be able to work effectively with DBObject's fields (both scalars and sub-documents) in a type-safe manner. Look through the examples and see if it fits your needs.
The problem is that MongoDB can store multiple types in a single field, so I'm not sure what you mean by making this typesafe. There's no way to enforce it on the database side, so were you hoping there is a way to enforce it on the Casbah side? You could just do get("fieldName") and get back an Object, to be safest, but that's hardly an improvement, in my opinion.
I've been happy using Salat + Casbah, and when my database record doesn't match my Salat case class, I get a runtime exception. I just know that I have to run migration scripts when I change the types in my model, or create a new model for the new types (multiple models can be stored in the same collection). At least the Salat grater/DAO methods make it less of a hassle (you don't have to specify types every time you access a variable).
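For completeness, the DAO side of that workflow looks roughly like this (the database, collection and case class are made up):

import com.mongodb.casbah.Imports._
import com.novus.salat._
import com.novus.salat.global._
import com.novus.salat.dao.SalatDAO

case class User(_id: ObjectId = new ObjectId, name: String)

object UserDAO extends SalatDAO[User, ObjectId](
  collection = MongoClient()("mydb")("users"))

// No per-field type juggling: inserts and queries work in terms of the case class,
// and a mismatch between the stored record and the model surfaces as a runtime error.
UserDAO.insert(User(name = "Alice"))
val alice: Option[User] = UserDAO.findOne(MongoDBObject("name" -> "Alice"))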