Modified Count when updating an entity with MongoRepository - mongodb

I am using Spring Data MongoDB Repository to connect to my mongo database.
I want to update a document in a collection by passing the criteria and specific fields to update.
I do that by setting the fields to update directly on the entity object, along with the _id field, which is used as the criteria.
This is the code I am using,
Employee updatedEmployee = employeeRepository.save(employeeToUpdate);
When I use the save method, I do not get any return status indicating whether the update was successful or not.
I do not want to make another query to MongoDB to fetch the document and compare it with my updated document just to validate the changes.
Is there a way to do this with the repository, or should I use MongoTemplate for this specific use case alone?
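For reference, a minimal sketch of what the MongoTemplate route mentioned above could look like (the field name and criteria are assumptions for illustration, not taken from the actual Employee model); in recent Spring Data MongoDB versions updateFirst returns the driver's UpdateResult, which exposes the matched and modified counts:
import com.mongodb.client.result.UpdateResult;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

public class EmployeeUpdater {

    private final MongoTemplate mongoTemplate;

    public EmployeeUpdater(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public long updateFirstName(String id, String newFirstName) {
        // _id is the criteria, just like in the repository-based update above.
        Query query = new Query(Criteria.where("_id").is(id));
        // "firstName" is a hypothetical field used only for this sketch.
        Update update = new Update().set("firstName", newFirstName);

        // updateFirst reports how many documents were matched/modified,
        // so no second query is needed to validate the change.
        UpdateResult result = mongoTemplate.updateFirst(query, update, Employee.class);
        return result.getModifiedCount();
    }
}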

Related

Change document name at runtime in Spring Boot Mongo DB

I am fetching data from an API and storing it in the database. I have to delete the whole document, refresh the data from the API and store it again; on a live server this causes a delay before users can get the data back. Is there any possibility to change the
@Document(collection = "events")
@JsonIgnoreProperties(ignoreUnknown=true)
collection name at runtime? I have gone through the following link - Changing Table name dynamically in JPA/Hibernate - but that is for JPA and doesn't solve my query. Any help would be really appreciated.
Instead of specifying the collection name in the Document annotation, I would suggest using MongoTemplate#insert(Object document, String collectionName), similar to this example from the Spring Data MongoDB documentation.
This way you can specify the collection name at runtime using a property or environment variable.
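A rough sketch of what that could look like, assuming the collection name comes from a configuration property (the property name and the Event class name below are made up for illustration):
import org.springframework.beans.factory.annotation.Value;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Service;

@Service
public class EventWriter {

    private final MongoTemplate mongoTemplate;

    // Hypothetical property; falls back to "events" when not set.
    @Value("${events.collection.name:events}")
    private String collectionName;

    public EventWriter(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void save(Event event) {
        // The second argument overrides the collection that @Document would resolve to.
        mongoTemplate.insert(event, collectionName);
    }
}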

mongodb schema design - add collection or extend existing one

There's a collection with 100,000 documents. Only 10 of them need an additional property that is not necessary for the other documents (e.g. a list of departments where only the top ones have the property 'Location').
As far as I understand both approaches should work just fine, but which one is preferable in a NoSQL database:
add one more collection whose documents have two properties: DepartmentId and Location;
add the property 'Location' to only the selected documents, so the others won't have it.
The problem you are facing is well known. You have the same issue with source code, for example.
When you update a piece of code, do you save it as User.js, User2.js, User3.js ... ?
Or do you use a versioning system like git and keep a single User.js?
Translating the git analogy to your issue, you should update the current data.
In MongoDB you actually have two choices for performing the update:
Update the model in your code, and update every entry in the database to match the new model (see the sketch after this list).
Create a new model that applies to new entries, and keep the old model to handle data in the old format.
See also: use-more-than-one-schema-per-collection-on-mongodb
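To make that concrete for the 'Location' example, here is a minimal one-off update sketch; the use of Spring Data's MongoTemplate, the departments collection name and the department ids are assumptions, not taken from the question:
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.Arrays;
import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

public class DepartmentLocationMigration {

    private final MongoTemplate mongoTemplate;

    public DepartmentLocationMigration(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void addLocationToTopDepartments() {
        // Hypothetical ids of the ~10 top departments.
        List<String> topDepartmentIds = Arrays.asList("dep-1", "dep-2", "dep-3");

        // Only the selected documents get the new 'Location' property;
        // the other documents in the collection are left untouched.
        mongoTemplate.updateMulti(
                query(where("_id").in(topDepartmentIds)),
                new Update().set("Location", "Headquarters"),
                "departments");
    }
}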

Javascript function to save new fields from existing fields (migration) in mongodb collection

We have a design change in the application to accommodate a few new requirements. The design change forced us to migrate one of the MongoDB collections: instead of having individual fields, we have to create a derived JSON string as a field from the existing fields.
The migration process will be invoked by the end user performing an action in the UI (like saving a change). But that one action might update a few thousand documents. So we would like to write JavaScript code to be executed on the server side, so that we can avoid loading many records into the application.
The issue we are running into is that we cannot call the JavaScript function using eval, as the collection is sharded. The other option, making the collection un-sharded, is not something we can consider because the migration has to happen on a live system.
Please help us if you know of any alternate approach.
Example migration: ExampleDoc (collection) has fields a1, a2, b1 and b2. The migration will create a new field called fieldJSON : { a : "", b : "" }. Here a and b are derived from the existing fields a1, a2, b1 and b2.
OK, now I understand that:
you want to create a new field in the same collection, which is sharded;
the content of this new field is generated from existing fields;
you don't want to fetch these existing fields from the database into the application for processing, perhaps because of the large volume;
you can't invoke the eval database command because it's a sharded collection;
you can't un-shard the current collection.
Then, is it possible to fulfill the intent through mapReduce?
Query the exact documents you want to update;
map, reduce and then overwrite the original documents of this collection by specifying parameters such as {out:{merge:<collectionName>, sharded:true}}.
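A rough Java sketch of that map-reduce approach, assuming Spring Data's MongoTemplate is available (the question does not say which driver or framework is used) and with a simplified, made-up derivation of a and b:
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class ExampleDocMigration {

    private final MongoTemplate mongoTemplate;

    public ExampleDocMigration(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void migrate() {
        // Map: emit each document keyed by _id, with the derived fieldJSON attached.
        String map = "function() {"
                + "  this.fieldJSON = { a: this.a1 + '-' + this.a2, b: this.b1 + '-' + this.b2 };"
                + "  emit(this._id, this);"
                + "}";

        // Reduce: _id keys are unique, so each key carries a single value.
        String reduce = "function(key, values) { return values[0]; }";

        // Only process documents that have not been migrated yet.
        Query query = new Query(Criteria.where("fieldJSON").exists(false));

        mongoTemplate.mapReduce(query, "ExampleDoc", map, reduce,
                MapReduceOptions.options()
                        .outputCollection("ExampleDoc")
                        .outputTypeMerge()      // out: { merge: "ExampleDoc", ... }
                        .outputSharded(true),   // ... sharded: true
                Object.class);
    }
}
Note that map-reduce writes its results as { _id, value } pairs, so with the merge output the affected documents end up wrapped in a value field; the reading side (or a follow-up pass) has to account for that.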

Spring Data partial upsert not persisting type information

I am using Spring Data with MongoDB to store very dynamic config data in a toolkit. These Config objects consist of a few organizational fields, along with a data field of type Object. On some instances of Config, the data object refers to a more deeply nested subdocument (such as "data.foo.bar" within the database; this field name is set by getDataField() below). These Config objects are manipulated as they're sent to the database, so the storage code looks something like this:
MongoTemplate template; // This is autowired into the class.
Query query; // This is the same query which (successfully) finds the object.
Config myConfig; // The config to create or update in Mongo
Update update = new Update()
.set(getDataField(), myConfig.getData())
.set(UPDATE_TIME_FIELD, new Date())
.setOnInsert(CREATE_TIME_FIELD, new Date())
.setOnInsert(NAME_FIELD, myConfig.getName());
template.upsert(query, update, Config.class);
Spring recursively converts the data object into a DBObject correctly, but neither the data document nor any of its subdocuments have "_class" fields in the database. Consequently, they do not deserialize correctly.
These issues seem quite similar to those previously reported in DATAMONGO-392, DATAMONGO-407, and DATAMONGO-724. Those, however, have all been fixed. (I am using spring-data-mongodb 1.4.2.RELEASE.)
Am I doing something incorrectly? Is there a possibility that this is a Spring issue?
Came across a similar issue. One solution is to write your own Converter for Config.class.
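For what it's worth, a rough sketch of such a converter; the assumption that the dynamic data object is map-like is mine, not from the question, and the point is only that a hand-written converter decides explicitly what ends up in the document, including the type marker:
import java.util.Map;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

@WritingConverter
public class ConfigWriteConverter implements Converter<Config, DBObject> {

    @Override
    @SuppressWarnings("unchecked")
    public DBObject convert(Config source) {
        BasicDBObject root = new BasicDBObject();
        root.put("_class", Config.class.getName());
        root.put("name", source.getName());

        // Assumption: the dynamic payload can be viewed as a Map; a real converter
        // would convert it recursively or delegate to the MappingMongoConverter.
        BasicDBObject data = new BasicDBObject((Map<String, Object>) source.getData());
        data.put("_class", source.getData().getClass().getName());
        root.put("data", data);

        return root;
    }
}
The converter then has to be registered through CustomConversions on the MappingMongoConverter, and a matching reading converter can use the stored _class value to rebuild the payload.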

Changing the MongoDB collection on run time in symfony2 + doctrine

I'm using Symfony2 MongoDB + Doctrine and I want to tell Doctrine to save my objects in collections with a different name from the name of the class that defines the object. Also, the name of the new collection should be the id of an object in a different collection. For example, I have a class called Posts and I want to save the posts in a collection named after the ID of the user in the original User collection. This means that I need to tell Doctrine to save all new posts in a collection named e.g. User555, and I should be able to tell Doctrine to create this collection and save to it at runtime.
I can see that I can change the name of the collection statically with configuring it in the files like here: Storing a document in a different collection - Mongodb with symfony2
But I need to be able to create collections at runtime.
I will be thankful if someone points me in the right direction!
Cheers.
When you use the ORM you can do
$em->getClassMetadata('\AcmeBundle\Entity\Something')->setTableName('test')
Using the ODM you should be able to do
$dm->getClassMetadata('\AcmeBundle\Document\Something')->setCollection('test')
I looked through the Doctrine code and it's possible. However note that by doing this you're changing the collection used for that Document for the life of the script, unless you set it back.
I don't know of any reasonable way to do this for just one entity at a time. It would probably be best to wrap the ODM by creating your own persister service.