I am using LoopBack 4 with MongoDB.
I have a counter property in my model and would like to perform atomic operations to increment/decrement that counter.
My current implementation uses the extended operator $inc to add or subtract.
But I found that, despite my jsonSchema setting minimum: 0, the counter goes negative when $inc: {counter: -1} is applied while the counter is at 0.
I think I can use MongoDB document validation to set a value range constraint, but I can't find the right way to do this in the repository code.
Should I set it manually through the Mongo shell?
And if so, how can I do the error handling?
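(To clarify, by setting it manually through the Mongo shell I mean attaching a validator to the collection, roughly like the sketch below; the collection name users is just my assumption based on my repository.)
// Hypothetical shell sketch: add a $jsonSchema validator so that count must stay >= 0.
// The collection name 'users' is an assumption.
db.runCommand({
  collMod: 'users',
  validator: {
    $jsonSchema: {
      properties: {
        count: { bsonType: 'number', minimum: 0 }
      }
    }
  },
  validationAction: 'error'
});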
model
@property({
  type: 'number',
  jsonSchema: {
    minimum: 0,
  },
})
count: number;
controller
async incrementCountById(
  @param.path.string('id') id: string
): Promise<void> {
  await this.userRepository.updateById(id, {$inc: {count: -1}});
}
repository
const config = {
allowExtendedOperators: true,
};
Any advice would be appreciated:)
Related
I have a prop called duration which is declared as a number in the mongoose schema with the purpose of storing a duration in seconds:
const mySchema = new mongoose.Schema(
{
...
duration: { type: Number, required: true }
...
},
{ timestamps: true },
)
After using findOne() and applying the lean() method, the duration prop is returned as a timestamp even though it was set as a number. This happens when the number is greater than 1000.
const myVideo = await Models.Video.findOne({ _id: videoId })
.populate({ path: 'segment', populate: { path: 'type' } })
.lean()
.exec()
When I set: { "duration": 6000 } I get: { "duration": "1970-01-01T00:00:06.000Z" }
WHAT I'VE TRIED SO FAR
Besides trying to find the source of the issue, this is what I tried in the code:
I tried upgrading the Mongoose version from 5.9.15 to 5.12.7 to see if a fix had been added for this, but nothing changed.
I tried removing the { timestamps: true } from the schema; that didn't work either.
I also tried adding other props or options like { lean: true }, but in the end the result wasn't much better: I did stop getting the timestamp, but the returned object was a Mongoose document instead of a plain old JavaScript object.
MY TEMPORARY SOLUTION
The temporary solution that I found for this was removing the lean() from the chain, but I still couldn't understand what was causing this.
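In other words, the chain I ended up with is the original one minus the lean() call:
// Workaround: same query without .lean(), so duration comes back as a plain Number
const myVideo = await Models.Video.findOne({ _id: videoId })
  .populate({ path: 'segment', populate: { path: 'type' } })
  .exec()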
We upgraded (from MongoDB 3.4) to:
MongoDB: 4.2.8
Mongoose: 5.9.10
and now we receive the error below. For the smallest example, the models are:
[company.js]
'use strict';
const Schema = require('mongoose').Schema;
module.exports = new Schema({
name: {type: String, required: true},
}, {timestamps: true});
and
[target_group.js]
'use strict';
const Schema = require('mongoose').Schema;
const Company = require('./company'); // the schema exported by company.js above
module.exports = new Schema({
title: {
type: String,
required: true,
index: true,
},
minAge: Number,
maxAge: Number,
companies: [Company],
}, {timestamps: true});
and when I try to update the company within a target group
_updateTargetGroup(companyId, company) {
return this.targetGroup.update(
{'companies._id': companyId},
{$set: {'companies.$': company}},
{multi: true});
}
I receive
MongoError: Updating the path 'companies.$.updatedAt' would create a conflict at 'companies.$'
even if I prepend
delete company.updatedAt;
delete company.createdAt;
I get this error.
If I try something similar in a DB tool (Robo3T), everything works fine:
db.getCollection('targetgroups').update(
{'companies.name': "Test 1"},
{$set: {'companies.$': {name: "Test 2"}}},
{multi: true});
Of course I could use the workaround
_updateTargetGroup(companyId, company) {
return this.targetGroup.update(
{'companies._id': companyId},
{$set: {'companies.$.name': company.name}},
{multi: true});
}
(this does work indeed), but I'd like to understand the problem, and we also have bigger models in the project with the same issue.
Is this a problem with {timestamps: true}? I searched for an explanation but was not able to find anything... :-(
The issue originates from using the timestamps, as you mentioned, but I would not call it a "bug", as in this instance I could argue it's working as intended.
First, let's understand what using timestamps does in code. Here is a code sample of what Mongoose does to an array (the companies array) with timestamps (source):
for (let i = 0; i < len; ++i) {
if (updatedAt != null) {
arr[i][updatedAt] = now;
}
if (createdAt != null) {
arr[i][createdAt] = now;
}
}
This runs on every update/insert. As you can see, it sets the updatedAt and createdAt of each object in the array, meaning the update object changes from:
{$set: {'companies.$': company}}
To:
{
"$set": {
"companies.$": company.name,
"updatedAt": "2020-09-22T06:02:11.228Z", //now
"companies.$.updatedAt": "2020-09-22T06:02:11.228Z" //now
},
"$setOnInsert": {
"createdAt": "2020-09-22T06:02:11.228Z" //now
}
}
Now, the error occurs when you try to update the same field with two different values/operations. For example, if you were to $set and $unset the same field in the same update, Mongo does not know what to do, hence it throws the error.
In your case it happens due to the companies.$.updatedAt field. Because you're updating the entire object at companies.$, you are basically setting it to be {name: "Test 2"}; this also means you are "deleting" the updatedAt field (amongst others) while Mongoose is trying to set it to its own value, thus causing the error. This is also why your change to companies.$.name works: you are only setting the name field, not the entire object, so no conflict is created.
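If you really do need to replace the whole subdocument, one option worth trying (a sketch, assuming your Mongoose version supports the per-operation timestamps option on update calls) is to skip automatic timestamps for that particular update:
// Sketch: disable automatic timestamps for this one call so Mongoose does not
// also try to set the conflicting 'companies.$.updatedAt' path.
_updateTargetGroup(companyId, company) {
  return this.targetGroup.update(
    {'companies._id': companyId},
    {$set: {'companies.$': company}},
    {multi: true, timestamps: false});
}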
I have the following query:
const messageRules = await MessageRule.findOne({
reservationLength: {$exists: false}
});
on the following schema:
const MessageRule = new Schema(
{
...,
reservationLength: {type: Number, default: 1},
...
}
);
And the query returns a document with:
{
...,
reservationLength: 1,
...
}
I'm going crazy here. Does it have something to do with the default setting in my schema? Any other ideas?
It's a bug I've encountered with Mongoose several times already, and I did not find much information about it (granted, I decided not to waste time exploring it).
It occurs with all fields that have default values: Mongoose just automatically sets these values to their defaults on the returned document (if you check the actual document in the database, it will not have this field set).
One easy fix to ease the nerves is to add lean() to the call:
const messageRules = await MessageRule.findOne({
reservationLength: {$exists: false}
}).lean();
For some reason this ends up fixing the bug (debatably a feature?).
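If you want to confirm that the field really is absent in the stored document (rather than saved with the default), one way (a sketch that bypasses Mongoose defaults by querying the model's underlying collection directly) is:
// Sketch: query the underlying MongoDB collection, skipping Mongoose casting
// and defaults, to see the document exactly as it is stored.
const raw = await MessageRule.collection.findOne({
  reservationLength: {$exists: false}
});
console.log(raw); // should have no reservationLength key if it was never set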
I'm trying to check that an update command succeeded, but when I check nModified I get 0, although I do see that the field value changes from one value to another (it is not kept at the same value).
static async updateProfile(username, profileData) {
const usersCollection = db.dbConnection.collection(dbConfig.collectionNames.users);
const updateRes = await usersCollection.update({email: username},
{"$set": {
firstName: profileData.firstName,
lastName: profileData.lastName,
payment: profileData.payment,
}
});
return updateRes.result.nModified > 0;
}
Is there another way to verify the update?
One way is via the findAndModify method:
You can easily compare the whole new object and verify each key.
db.getCollection('usertests').findAndModify({
query: {"email":"xxx#xxx.com"},
update: {name: "HHH", "email":"xxx#xxx.com"},
new: true
})
new: true is responsible for returning the whole updated document. If the update fails, it will return null.
Take care here to pass the whole document when updating.
update() only returns the number of documents that were successfully modified, so your logic to check whether the update succeeded is also valid.
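With the native driver's updateOne (a sketch, assuming a driver version that exposes matchedCount/modifiedCount on the result), you can also tell "no matching document" apart from "matched but nothing changed":
// Sketch: updateOne reports how many documents matched the filter and how many
// were actually modified, which separates "not found" from "no change".
const res = await usersCollection.updateOne(
  {email: username},
  {"$set": {firstName: profileData.firstName}}
);
if (res.matchedCount === 0) {
  // no user with that email
} else if (res.modifiedCount === 0) {
  // user found, but the values were already the same
}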
I was using Sails.js 0.12. It supported index attributes on models, and I was also
using the npm package Sails-hooks-mongoat to create inverse indexes and so on.
It wasn't ideal, but it worked. Now the index attribute has been dropped, and mongoat is currently unsafe and pending updates to work on Sails.js 1.0.
I would like to know the best approach to:
Create Indexes on new deployments.
Migrate (ensure?) indexes on deployment updates.
Since you are not allowed to run 'migrate: alter' in production (even if you try), one option is to create those indexes in the bootstrap file ('config/bootstrap.js').
Imagine you have a User model like this:
var User = {
attributes: {
email : { type: 'string', unique: true, required: true },
username : { type: 'string', unique: true, required: true },
pin: { type: 'string'}
}
};
module.exports = User;
Then you can manually create the missing indexes like this in the bootstrap file:
module.exports.bootstrap = async function(done) {
console.log("Loading bootstrap...")
if (process.env.NODE_ENV === 'test') {
}
if (process.env.NODE_ENV === 'production') {
console.log("CREATING DATABASE INDEX BY HAND")
// USER MODEL
var db = User.getDatastore().manager;
await db.collection(User.tableName).createIndex( { email: 1 }, {unique: true} );
await db.collection(User.tableName).createIndex( { username: 1 }, {unique: true} );
// PANDA MODEL
// db = Panda.getDatastore().manager;
// await db.collection(Panda.tableName).createIndex( { name: 1 }, {unique: true} );
}
// await initializeDatabase() // custom DB initialization...
return done();
};
The indexes are created only once; subsequent runs will not recreate them.
ensureIndex was an alias for the createIndex function and has been deprecated.
References:
Waterline manager reference
MongoDB create index reference
In development mode you can specify custom indexes in the model within Sails; otherwise it will remove them when lifting the server and performing the migration.
My (preferred) alternative approach is to manage all indexes on my own within the DB. In order to do so, you have to change the "migrate" attribute from "alter" to "safe" in the models.js config file.
Note that in production mode the "migrate" attribute is always set to "safe".
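For reference, a minimal sketch of that config change (assuming a standard Sails 1.0 config/models.js):
// config/models.js (sketch): 'safe' tells Waterline never to auto-migrate,
// so indexes created directly in MongoDB are left alone when the server lifts.
module.exports.models = {
  migrate: 'safe',
};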