MongoDB $inc value by 1 and set max

I have a question: I need to increment a field in the DB by 1. That part is fine, I know how to do it, but I need to cap the value at 5, meaning if the value in the DB is already 5, it should not be updated to 6. I have this code:
collectionName.update({
  name: { $in: namesField },
}, { $inc: { fieldName: 1 } }, { multi: true });
I tried this one and many others, but it does not work:
collectionName.update({
  name: { $in: namesField },
}, { fieldName: { $lt: 6 }, { $inc: { fieldName: 1 } }, { multi: true });
Thank you for any ideas.

It is just a matter of matching only the relevant documents, i.e. those with fieldName smaller than 5, and then updating only them, as you tried. But the filter belongs in the first argument, since the update syntax expects query, update, options, as can be seen here:
collectionName.update({
  name: { $in: namesField }, fieldName: { $lt: 5 }
}, { $inc: { fieldName: 1 } }, { multi: true });
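Alternatively, if you prefer to hard-cap the value instead of filtering documents out, a pipeline update can compute the capped value with $min. This is a sketch, not part of the original answer; it assumes MongoDB 4.2+ (pipeline updates) and that fieldName already exists on the documents:
collectionName.update(
  { name: { $in: namesField } },
  [
    // Increment, but never past 5: keep the smaller of (fieldName + 1) and 5.
    { $set: { fieldName: { $min: [{ $add: ["$fieldName", 1] }, 5] } } }
  ],
  { multi: true }
);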

Related

MongoDB - Update the value of one field with the value of another nested field

I am trying to run a MongoDB query to update the value of one field with the value of another nested field. I have the following document:
{
  "name": "name",
  "address": "address",
  "times": 10,
  "snapshots": [
    { "dayTotal": 2, "dayHit": 2, "dayIndex": 2 },
    { "dayTotal": 3, "dayHit": 3, "dayIndex": 3 }
  ]
}
I am trying it like this:
db.netGraphMetadataDTO.updateMany(
  { },
  [{ $set: { times: "$snapshots.$[elem].dayTotal" } }],
  {
    arrayFilters: [{ "elem.dayIndex": { "$eq": 2 } }],
    upsert: false,
    multi: true
  }
);
but got an error:
arrayFilters may not be specified for pipeline-style updates
You can't use arrayFilters and an aggregation-pipeline update in the same query.
Instead, what you need to do:
1. Get the dayTotal field from the result of step 2.
2. Take the first matched document from the result of step 3.
3. Filter the matching elements from the snapshots array.
db.netGraphMetadataDTO.updateMany({},
  [
    {
      $set: {
        times: {
          $getField: {
            field: "dayTotal",
            input: {
              $first: {
                $filter: {
                  input: "$snapshots",
                  cond: { $eq: ["$$this.dayIndex", 2] }
                }
              }
            }
          }
        }
      }
    }
  ],
  {
    upsert: false,
    multi: true
  })
Demo: Mongo Playground
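Note that $getField requires MongoDB 5.0. On 4.4, a similar pipeline can be written with $let instead; this is a sketch based on the answer above, not part of the original:
db.netGraphMetadataDTO.updateMany({}, [
  {
    $set: {
      times: {
        $let: {
          // Bind the first snapshot whose dayIndex is 2, then read its dayTotal.
          vars: {
            matched: {
              $first: {
                $filter: { input: "$snapshots", cond: { $eq: ["$$this.dayIndex", 2] } }
              }
            }
          },
          in: "$$matched.dayTotal"
        }
      }
    }
  }
])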

How to update the value inside an object together with an increment in MongoDB

I have a document
"watched": {
"1": true,
"2": true, //should add 3 with true value
},
"fans": 4 //should increment by 5 it should be 9
Whenever I do the update, it should add the new key and value to watched and increment fans by 5.
db.getCollection('movies').updateOne(
  { _id: ObjectId("60e80c96b9c55e7a01898f5c") },
  { $set: { "watched.3": true } },
  { $inc: { fans: 5 } }
)
Either it does the update or the increment; how do I manage both?
db.collection.update(
  {
    _id: ObjectId("60e80c96b9c55e7a01898f5c"),
    // Only match while the key is still missing, so repeated
    // calls don't keep incrementing fans.
    "watched.3": { $exists: false }
  },
  {
    // $set and $inc can live in the same update document,
    // so both changes happen atomically in one call.
    $set: { "watched.3": true },
    $inc: { fans: 5 }
  }
)
mongoplayground
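If the key and the increment amount need to vary per call, the same shape can be parameterised with a computed key. This is a hypothetical helper, not from the original answer; markWatched, nextKey and fanBump are made-up names:
// Marks entry `nextKey` as watched and bumps `fans`, but only while that key
// does not exist yet (same $exists guard as above), so repeated calls are no-ops.
async function markWatched(movies, movieId, nextKey, fanBump) {
  return movies.updateOne(
    { _id: movieId, [`watched.${nextKey}`]: { $exists: false } },
    { $set: { [`watched.${nextKey}`]: true }, $inc: { fans: fanBump } }
  );
}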

Mongo: Upsert to copy all static fields to new document atomically

I'm using the Bucket Pattern to limit each document's array to maxBucketSize elements. Once a document's array is full (bucketSize = maxBucketSize), the next update should create a new document with a new array to hold more elements, using upsert.
How would you copy the static fields (recordType and recordDesc below) from the last full bucket in a single call?
Sample document:
// records collection
{
  recordId: 12345,   // non-unique index
  recordType: "someType",
  recordDesc: "Some record description.",
  elements: [ { a: 1, b: 2 }, { a: 3, b: 4 } ],
  bucketSize: 2
}
Bucket implementation (copies only queried fields):
const maxBucketSize = 2;
db.collection('records').updateOne(
  {
    recordId: 12345,
    bucketSize: { $lt: maxBucketSize } // false
  },
  {
    $push: { elements: { a: 5, b: 6 } },
    $inc: { bucketSize: 1 },
    $setOnInsert: {
      // Should be executed because the bucketSize condition is false,
      // but the fields `recordType` and `recordDesc` are inaccessible here,
      // since no documents matched and they were not included in the query.
    }
  },
  { upsert: true }
)
Possible solution
To make this work, I can always make two calls: findOne() to get the static values and then updateOne() where I set the fields with $setOnInsert, but that is inefficient (see the sketch below).
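For reference, that two-call fallback might look roughly like this. It is a sketch using the question's own schema, not a fix for the atomicity problem, since another writer can still create the new bucket between the two calls:
// 1) Read the static fields from the latest bucket for this recordId
//    (assumes at least one bucket already exists).
const last = await db.collection('records').findOne(
  { recordId: 12345 },
  { sort: { $natural: -1 }, projection: { recordType: 1, recordDesc: 1 } }
);

// 2) Push into a non-full bucket, or upsert a fresh one carrying the static fields.
await db.collection('records').updateOne(
  { recordId: 12345, bucketSize: { $lt: maxBucketSize } },
  {
    $push: { elements: { a: 5, b: 6 } },
    $inc: { bucketSize: 1 },
    $setOnInsert: { recordType: last.recordType, recordDesc: last.recordDesc }
  },
  { upsert: true }
);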
How can I modify this to be one call with an aggregation pipeline? Examine the last added document matching recordId (indexed), evaluate whether its array is full, and if so add a new document.
Attempt:
// Evaluate last document added
db.collection('records').findOneAndUpdate(
  { recordId: 12345 },
  {
    $push: { elements: {
      $cond: {
        if: { $lt: [ '$bucketSize', maxBucketSize ] },
        then: { a: 5, b: 6 }, else: null
      }
    }},
    $inc: { bucketSize: {
      $cond: {
        if: { $lt: [ '$bucketSize', maxBucketSize ] },
        then: 1, else: 0
      }
    }},
    $setOnInsert: {
      recordType: '$recordType',
      recordDesc: '$recordDesc'
    }
  },
  {
    sort: { $natural: -1 }, // last document
    upsert: true            // Bucket overflow
  }
)
This comes back with:
MongoError: Cannot increment with non-numeric argument: { bucketSize: { $cond: { if: { $lt: [ "$bucketSize", 2 ] }, then: 1, else: 0 } }}

Unable to add an object as a sub-field to another existing JSON object

I am unable to add the following object:
[
  { 'option1': 'opt1', 'option2': 'opt2', 'option3': 'opt3' }
]
as a sub-field to each element of:
const question_list = await Questions.find(
  { $and: [{ categoryid: categoryId }, { isDeleted: false }, { status: 0 }] },
  { name: 1 }
);
question_list = [
  { "_id": "5eb167fb222a6e11fc6fe579", "name": "q1" },
  { "_id": "5eb1680abb913f2810774c2a", "name": "q2" },
  { "_id": "5eb16b5686068831f07c65c3", "name": "q5" }
]
I want the final object to be:
[{"_id":"5eb167fb222a6e11fc6fe579","name":"q1","options":[
{ 'option1':'opt1','option2':'opt2','option3':'opt3'},
]},{"_id":"5eb1680abb913f2810774c2a","name":"q2","options":[
{ 'option1':'opt1','option2':'opt2','option3':'opt3'},
]},{"_id":"5eb16b5686068831f07c65c3","name":"q5","options":[
{ 'option1':'opt1','option2':'opt2','option3':'opt3'},
]}]
What is the best possible solution?
You need to use an aggregation pipeline for this instead of .find(), because projection in .find() can only accept $elemMatch, $slice, and $ on existing fields (see project-fields-from-query-results). So to add a new field with new data to the returned documents, use $project in the aggregation framework.
const question_list = await Questions.aggregate([
  {
    $match: {
      $and: [{ categoryid: categoryId }, { isDeleted: false }, { status: 0 }]
    }
  },
  {
    $project: {
      _id: 1, // keep _id, as in the desired output
      name: 1,
      options: [{ option1: "opt1", option2: "opt2", option3: "opt3" }]
    }
  }
]);
Test : mongoplayground
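If you also want to keep every other field of the question documents, $addFields is a drop-in alternative to the $project stage above. This is a sketch assuming the same match conditions, not part of the original answer:
const question_list = await Questions.aggregate([
  { $match: { categoryid: categoryId, isDeleted: false, status: 0 } },
  // $addFields keeps _id, name and any other existing fields, and only appends `options`.
  { $addFields: { options: [{ option1: "opt1", option2: "opt2", option3: "opt3" }] } }
]);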

Mongodb $expr query is very slow

I have a Mongo 4.2.0 instance here on my development environment with a simple collection of only 300 entries.
I've built some basic queue handling, juggling a few date fields.
To get a document that should be updated, I have the following $expr query, which runs very slowly in my opinion.
db.collection("myupdates").findOneAndUpdate({
$expr: {
$and: [
{ $gt: ["$shouldUpdate", "$updatedAt"] },
{ $gt: ["$shouldUpdate", "$isUpdatingAt"] },
{ $gt: ["$shouldUpdate", "$updateErroredAt"] },
]
},
}, {
$set: {
isUpdatingAt: new Date(),
},
});
This query takes around ~120 ms after warm-up on my standard 2019 laptop, whereas my other simple queries only take ~3 ms.
Although indexes hardly matter with 300 documents, I have of course tried to set them all, from single-field to compound indexes. That does not do the trick.
It's also not the findOneAndUpdate itself; with countDocuments I get the same slow speed.
Is this the normal speed of $expr / aggregation syntax? What did I do wrong? Is there a better way to achieve this? Do I have to use Redis for this use case?
Possible solution
As @Neil Lunn pointed out in the answers, calculated conditions cannot use an index and should be a last resort.
So I got rid of the calculated condition by splitting the work into 2 queries. The first query fetches an actual value I can match against.
These 2 queries boil down to ~10 ms total, which is much better than 120 ms.
const shouldUpdateDateResult = await mongo.db.collection("myupdates").findOne(
  { shouldUpdate: { $exists: true } },
  { projection: { shouldUpdate: 1 } }
);
const shouldUpdateDate = shouldUpdateDateResult && shouldUpdateDateResult.shouldUpdate;

const result = await mongo.db.collection("myupdates").findOneAndUpdate({
  $and: [
    { shouldUpdate: shouldUpdateDate },
    { $or: [
      { updatedAt: { $eq: null } },
      { updatedAt: { $exists: false } },
      { updatedAt: { $lte: shouldUpdateDate } }
    ] },
    { $or: [
      { isUpdatingAt: { $eq: null } },
      { isUpdatingAt: { $exists: false } },
      { isUpdatingAt: { $lte: shouldUpdateDate } }
    ] },
    { $or: [
      { updateErroredAt: { $eq: null } },
      { updateErroredAt: { $exists: false } },
      { updateErroredAt: { $lte: shouldUpdateDate } }
    ] },
  ],
}, {
  $set: {
    isUpdatingAt: new Date(),
  },
});
The whole idea behind this is a processing queue usable by multiple workers.
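Since the split only pays off once the remaining plain conditions can use an index, a plausible follow-up (my assumption, not part of the original post; field names taken from the queries above) is to back them with indexes:
// Serves the first lookup (shouldUpdate existence / equality match).
await mongo.db.collection("myupdates").createIndex({ shouldUpdate: 1 });
// Helps the combined filter in the findOneAndUpdate.
await mongo.db.collection("myupdates").createIndex(
  { shouldUpdate: 1, updatedAt: 1, isUpdatingAt: 1, updateErroredAt: 1 }
);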