Spring Data Mongo Repository update operation - mongodb

I am using Spring Data Mongo repositories for persisting my entities. The parent class of all entities looks like this:
@Document
public abstract class AbstractEntity {

    @Id
    private String id;

    @CreatedDate
    private Date dateCreated;

    @LastModifiedDate
    private Date lastUpdated;

    @Version
    private Long version; // This is creating trouble during the 'update' operation
}
This is how I configure the Mongo repositories and auditing:
@Configuration
@EnableMongoRepositories(basePackages = { "x.y.z" })
@EnableMongoAuditing
@EnableAutoConfiguration
public class MongoRepositoryConfig {
}
I am able to save and update my entities in Mongo as long as I do not include the @Version field in my entity.
PROBLEM
If I include the @Version auditing field in my entity class, then updating an existing entity/document with the MongoRepository#save(entity) method fails with the following exception:
Caused by: com.mongodb.MongoException$DuplicateKey: { "serverUsed" : "localhost:27017" , "ok" : 1 , "n" : 0 , "err" : "insertDocument :: caused by :: 11000 E11000 duplicate key error index: test.MENU_ITEM.$_id_ dup key: { : ObjectId('541ed581f39d6f87787067e3') }" , "code" : 11000}
at com.mongodb.CommandResult.getWriteException(CommandResult.java:88)
at com.mongodb.CommandResult.getException(CommandResult.java:79)
at com.mongodb.DBCollectionImpl.translateBulkWriteException(DBCollectionImpl.java:314)
at com.mongodb.DBCollectionImpl.insert(DBCollectionImpl.java:189)
at com.mongodb.DBCollectionImpl.insert(DBCollectionImpl.java:165)
at com.mongodb.DBCollection.insert(DBCollection.java:93)
at com.mongodb.DBCollection.insert(DBCollection.java:78)
at com.mongodb.DBCollection.insert(DBCollection.java:120)
at org.springframework.data.mongodb.core.MongoTemplate$8.doInCollection(MongoTemplate.java:900)
at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:410)
Why is the upsert operation failing when the entity has @Version? From what I understand, the version field is used for optimistic locking during update operations.
The save method seems to be attempting an insert operation instead of an update. Is this the expected behaviour with @Version?
<spring.data.mongo.version>1.6.0.RELEASE</spring.data.mongo.version>

Cleaning the Mongo database fixed the problem.
Before I added the @Version Long version property to my domain class, there were a few existing Mongo documents in the collection without a version property. This was causing the problem.
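As an alternative to wiping the database, backfilling a version value on those old documents should also work. This is only a sketch: it assumes a MongoTemplate is available and uses the collection name MENU_ITEM from the stack trace above.

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

public void backfillVersion(MongoTemplate mongoTemplate) {
    // Give every pre-existing document (saved before @Version was introduced)
    // an initial version so subsequent saves are treated as updates, not inserts.
    mongoTemplate.updateMulti(
            Query.query(Criteria.where("version").exists(false)),
            new Update().set("version", 0L),
            "MENU_ITEM"); // collection name taken from the stack trace; adjust as needed
}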

It seems you are trying to save a document whose version does not match the document in the Mongo collection: the version is either not set or holds a stale value. Spring Data uses @Version for optimistic locking, which means it compares the version of the document in the database with the version on the entity being saved, and it throws an error if there is a mismatch.
Try the code below; it should get you past this error.
public void updateUser(String userId, String password) {
    // Before updating the document for the given userId, fetch the latest copy from the database
    Login userFromDB = loginRepository.findOne(userId);
    userFromDB.setPassword(password);   // set the field to be updated
    loginRepository.save(userFromDB);   // overwrites the document in the database with the new password
}
On a successful save, Spring Data will increment the version number.
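For completeness, here is a hedged sketch (not part of the original answer) of handling the failure case: when the version on the entity is stale, Spring Data throws an OptimisticLockingFailureException, which you can catch in order to re-read the document and retry.

import org.springframework.dao.OptimisticLockingFailureException;

public void updateUserWithRetry(String userId, String password) {
    try {
        // Same flow as above: read the latest document, modify it, save it back.
        Login userFromDB = loginRepository.findOne(userId);
        userFromDB.setPassword(password);
        loginRepository.save(userFromDB); // bumps the @Version field on success
    } catch (OptimisticLockingFailureException e) {
        // Someone else saved a newer version in the meantime: re-read and retry.
        updateUserWithRetry(userId, password);
    }
}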

Related

mongodb - compound unique index using spring data not working

I am trying to create a unique compound index in MongoDB using Spring Data.
But I see that the index is not created and duplicate documents are created in the DB.
My entity class:
@Document
@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
@CompoundIndexes({
    @CompoundIndex(name = "daybook_index", def = "{'date' : 1, 'vehicleNumber' : 1}", unique = true)
})
public class Daybook {
    private String date;
    private String vehicleNumber;
    private String unit;
}
I am using the repository.insert() method to create the document.
In mongo-express I see only one index, on _id; the index defined in the entity class is not created.
Is this a bug in Spring Data, or am I doing something wrong?
P.S.: I also tried deleting the collection before running the application, but that didn't help.
As of Spring Data MongoDB 3.0, automatic index creation is turned off by default.
To turn it back on, override the corresponding method from MongoConfigurationSupport:
public class MongoConfiguration extends AbstractMongoClientConfiguration {
    ...

    @Override
    protected boolean autoIndexCreation() {
        return true;
    }
}
Otherwise, you can create the index programmatically:
mongoOperations.indexOps(Daybook.class)
        .ensureIndex(new Index()
                .on("date", Direction.ASC)
                .on("vehicleNumber", Direction.ASC)
                .unique());
As @Saxon mentioned:
As of Spring Data MongoDB 3.0, automatic index creation is turned off by default.
Instead of adding code, I was able to create the index by adding the Spring Data configuration to application.properties:
spring.data.mongodb.auto-index-creation=true
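To double-check at runtime that the compound index really exists, something along these lines should work (a small sketch; mongoOperations is assumed to be an injected MongoOperations/MongoTemplate):

import java.util.List;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.IndexInfo;

public void printDaybookIndexes(MongoOperations mongoOperations) {
    // Expect "_id_" plus "daybook_index" once index creation is enabled.
    List<IndexInfo> indexes = mongoOperations.indexOps(Daybook.class).getIndexInfo();
    indexes.forEach(index -> System.out.println(index.getName()));
}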

Java Spring Data MongoDB

I am using Spring Data Mongo with Azure Cosmos DB. My structure looks like the code below.
I have an id field in my collection that is not annotated with @Id. I see both _id and id in the DB, but when I retrieve the document, the id field comes back with the value stored in _id.
@Document(collection = "mycollection")
class MyObject {
    private String id;
    ...
}
public interface MyRepository extends MongoRepository<MyObject, Void> {
}
Use @Field("id") to tell Spring Data to treat this field as a regular field, not as Mongo's _id/primary-key field.
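A hedged sketch of what that mapping could look like; the mongoId property name is illustrative and not part of the question:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;

@Document(collection = "mycollection")
class MyObject {

    @Id
    private String mongoId; // maps to Mongo's _id (illustrative name)

    @Field("id")
    private String id;      // explicitly bound to the "id" key instead of _id
}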

Spring MongoRepository not returning id field of nested objects [duplicate]

I have a document with an array field:
"chapters": [
{ "id" : "14031871223912313", ...}
...
]
I would like to query and return the ids with Spring Data's MongoTemplate using the following:
class Chapter {
    private String id;

    public String getId() {
        return id;
    }
}
This way the id is not populated. I have tried the different mapping options with @Field described here: http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mapping.conventions.id-field
What am I doing wrong? I know I can always go back to the Mongo Java driver, but I thought this should work.
Thanks in advance for any help.
Found a solution. It is populated via:
@Field("id")
private String chapterId;
In MongoDB, ids are stored as _id, and every document in Mongo has an _id. Per the documentation you linked, Spring maps a plain String id field to Mongo's _id field. You probably want a @Field("id") String id mapping to indicate that the field you want is id, not _id.

MongoDB with Spring data - Duplicated query from driver

I'm seeing a duplicated query when performing a simple findOne. The files:
SomeClass.java:
#Document(collection = "someCollection")
public class SomeClass {
private String _id;
private String someField;
//...
}
SomeClassRepository.java:
@Repository
public interface SomeClassRepository extends MongoRepository<SomeClass, String> {
}
Service.java:
@Autowired
private SomeClassRepository someClassRepository;

public SomeClass find(String id) {
    return someClassRepository.findOne(id);
}
application.properties:
logging.level.org.springframework.data.mongodb.core.MongoTemplate=DEBUG
Log file:
14:14:46.514 [qtp1658534033-19] DEBUG o.s.data.mongodb.core.MongoTemplate - findOne using query: { "_id" : "40c23743-afdb-45ca-9231-c467f8e8b320"} fields: null for class: class com.somepackage.SomeClass in collection: someCollection
14:14:46.534 [qtp1658534033-19] DEBUG o.s.data.mongodb.core.MongoTemplate - findOne using query: { "_id" : "40c23743-afdb-45ca-9231-c467f8e8b320"} in db.collection: someDatabase.someCollection
I also tried to:
1) use the @Id annotation with a field named "someId"
2) use the @Id annotation with a field named "id"
3) use a field named "id" (without the @Id annotation)
Unfortunately, I always have two queries to the database.
Does anyone know how to perform a single query?
Thanks!
It's only a single query that is sent to the database; your log messages are coming from two different places.
First place: the doFindOne method - link. Second place: the FindOneCallback class - link.
You can also confirm this by looking at the db logs. More info here
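If you want to confirm this on the database side, one option (a sketch, not from the original answer) is to enable MongoDB's profiler and then inspect the system.profile collection, e.g. from the mongo shell; only one findOne per request should show up there.

import org.springframework.data.mongodb.core.MongoTemplate;

public void enableProfiler(MongoTemplate mongoTemplate) {
    // Level 2 records every operation against the current database;
    // run "{ profile: 0 }" afterwards to switch profiling off again.
    mongoTemplate.executeCommand("{ \"profile\": 2 }");
}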

Eagerly load MongoDB #DBRef in Spring data's RepositoryRestResource

I'm trying to implement a REST API using RepositoryRestResource and RestTemplate.
It all works rather well, except for loading @DBRefs.
Consider this data model:
public class Order
{
    @Id
    String id;

    @DBRef
    Customer customer;

    ... other stuff
}

public class Customer
{
    @Id
    String id;

    String name;
    ...
}
And the following repository (there is a similar one for Customer):
@RepositoryRestResource(excerptProjection = OrderSummary.class)
public interface OrderRestRepository extends MongoRepository<Order, String> {}
The REST API returns the following JSON:
{
"id" : 4,
**other stuff**,
"_links" : {
"self" : {
"href" : "http://localhost:12345/api/orders/4"
},
"customer" : {
"href" : "http://localhost:12345/api/orders/4/customer"
}
}
}
Which, when loaded by the RestTemplate, will create a new Order instance with customer = null.
Is it possible to eagerly resolve the customer on the repository side and embed it in the JSON?
Eagerly resolving dependent entities in this case will most probably raise an N+1 database access problem.
I don't think there is a way to do that using the default Spring Data REST/Mongo repository implementation.
Here are some alternatives:
Construct your own custom @RestController method that accesses the database and builds the desired output
Use projections to populate fields from the related collection, e.g.
@Projection(name = "main", types = Order.class)
public interface OrderProjection {
    ...

    // either
    @Value("#{customerRepository.findById(target.customerId)}")
    Customer getCustomer();

    // or
    @Value("#{customerService.getById(target.customerId)}")
    Customer getCustomer();

    // or
    CustomerProjection getCustomer();
}

@Projection(name = "main", types = Customer.class)
public interface CustomerProjection {
    ...
}
The customerService.getById method can employ caching (e.g. using Spring's @Cacheable annotation) to mitigate the performance penalty of the additional database access for each record in the result set.
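A rough sketch of such a service (the CustomerRepository, the cache name, and the method shape are illustrative assumptions, and caching must be enabled elsewhere, e.g. with @EnableCaching):

import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service("customerService")
public class CustomerService {

    private final CustomerRepository customerRepository; // assumed Spring Data repository

    public CustomerService(CustomerRepository customerRepository) {
        this.customerRepository = customerRepository;
    }

    @Cacheable("customersById")
    public Customer getById(String id) {
        // Hits MongoDB only on a cache miss, which softens the N+1 access
        // pattern caused by resolving the customer for every order in a page.
        return customerRepository.findById(id).orElse(null);
    }
}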
Add redundancy to your data model and store copies of the Customer object fields in the Order collection on creation/update.
This kind of problem arises, in my opinion, because MongoDB doesn't support joining different document collections very well (its "$lookup" operator has significant limitations in comparison to the common SQL JOINs).
The MongoDB docs also do not recommend using @DBRef fields unless you are joining collections hosted on distinct servers:
Unless you have a compelling reason to use DBRefs, use manual references instead.
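In that spirit, a manual reference could look roughly like the sketch below; the class and field names are illustrative, not part of the original model:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
public class OrderWithManualRef {

    @Id
    private String id;

    // Manual reference: store only the customer's id instead of a @DBRef
    private String customerId;

    public String getCustomerId() {
        return customerId;
    }
}

// Resolved explicitly where it is needed, e.g. in a service or custom controller:
// Customer customer = customerRepository.findById(order.getCustomerId()).orElse(null);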
Here's also a similar question.