MongoDB update using limit and skip - mongodb

In Mongoose, I need to do the following in the quickest possible way:
find documents by a query
use limit() and skip()
update a field in all the found documents with the same value
What do you suggest?

You can fetch the records, collect the _id of each one into an array, and then update them all at once using $in:
async function someAsyncFunction() {
  // pass the offset and page size you actually need to skip() and limit()
  const foundData = await Collection.find(query).skip(offset).limit(pageLimit);

  const IDs = [];
  foundData.forEach(element => {
    IDs.push(element._id);
  });

  // one update call covering every matched document
  return Collection.update(
    { _id: { $in: IDs } },
    { $set: { fieldToUpdate: "value" } },
    { multi: true }
  );
}
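Side note: in newer Mongoose versions, Model.update() with { multi: true } is deprecated in favour of updateMany(). A minimal sketch of the same approach with updateMany() (the function name and the skipCount/limitCount parameters are placeholders, not from the original question):
async function updateMatchedSlice(query, skipCount, limitCount) {
  // fetch only the _id field of the slice we want to touch
  const docs = await Collection.find(query, { _id: 1 })
    .skip(skipCount)
    .limit(limitCount);

  const ids = docs.map(doc => doc._id);

  // one bulk update for every document in the slice
  return Collection.updateMany(
    { _id: { $in: ids } },
    { $set: { fieldToUpdate: "value" } }
  );
}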

Related

Aggregate and update with mongoose

I want to use the aggregate method to make my queries and modify my database by a value. I tried with $set, but my database is not modified.
Here is how I do my query:
var filter = req.body.filter
var search = [ { $match: filter }, { $set: {item: "2"} }, { $sample: { size: 1 } }]
const result = await dataModel.aggregate(search)
I know there is also findOneAndUpdate but I would like to keep aggregate because I also want to use $project in my pipelines
Thanks in advance!
You can use findOneAndUpdate to change your db.
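A minimal sketch of that suggestion, reusing the filter and value from the question and assuming MongoDB 4.2+ (so the update itself can be an aggregation pipeline) and a Mongoose version that passes pipeline updates through:
// The array form of the update is an aggregation pipeline, so $set keeps working here.
const result = await dataModel.findOneAndUpdate(
  filter,                    // the criteria previously used in $match
  [{ $set: { item: "2" } }], // pipeline-style update (MongoDB 4.2+)
  { new: true }              // return the updated document
);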

Aggregation query with $set in findOneAndUpdate doesn't update document

I am using Node 11.6, MongoDB 4.2.5, and Mongoose 4.3.17. I am trying to update a field by appending a string to the end of it. I first upgraded to MongoDB 4.2, which is apparently required to use aggregation pipelines in updates.
I tried following this post like this:
var update = [{$set: {slug: {$concat: ['$slug', '-rejected']}}}];
Content.findOneAndUpdate({_id: id}, update, {new: true}, (err, doc) => {
//
});
but when I ran it I got no error, no document returned, and it was not updated.
So I removed the outer [], and passed just an object like this:
var update = {$set: {slug: {$concat: ['$slug', '-rejected']}}}
Content.findOneAndUpdate({_id: id}, update, {new: true}, (err, doc) => {
//
});
And I receive this error message:
`Cast to string failed for value "{ '$concat': [ '$slug', '-rejected' ] }" at path "slug"`,
What does this mean? How can I accomplish this? (without two separate calls)
Running the same function but replacing update with:
var update = {slug: 'test-slug'}
successfully updates the slug to 'test-slug'.
But trying to do the same simple thing with an aggregation acts much like my previous attempt, no error or change:
var update = [{$set: {slug: 'test-sluggy'}}]
Using updateOne() instead of findOneAndUpdate() doesn't change anything either.
The only thing I can think of that could cause this is the Mongoose version, but there are a lot of changes between 4 and 5 and I don't want to upgrade unless I have to, and I can't find anything that says it would make a difference.
The pipeline form requires that the update be an array of pipeline stages.
Try wrapping your existing update in [], like this:
var update = [{$set: {slug: {$concat: ['$slug', '-rejected']}}}]
We can use something like this:
var update = [{ $set: { slug: { $concat: ['$slug', '-rejected'] } } }]
Starting from MongoDB 4.2, update operations can accept an aggregation pipeline, so we are able to update a field based on its current value.
The whole query may look like this:
Content.findOneAndUpdate(
  { _id: id }, // the find criteria
  [{ $set: { slug: { $concat: ['$slug', '-rejected'] } } }], // the update as an aggregation pipeline, note the square brackets
  { multi: true }, // multi matters only if you use update() rather than findOneAndUpdate():
                   // when true it updates every match, otherwise only the first matching document;
                   // here we update only one document, so it can be ignored
  (err, doc) => {
    //
  });
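If you actually need to touch several documents, the same pipeline update can be passed to updateMany(), which always updates every match. A sketch; the { status: 'rejected' } filter is only a hypothetical example of a condition matching several documents:
Content.updateMany(
  { status: 'rejected' },                                    // hypothetical multi-document filter
  [{ $set: { slug: { $concat: ['$slug', '-rejected'] } } }]  // pipeline update, MongoDB 4.2+
);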

MongoDB Update from Aggregate

I am extremely new to Mongo and need some help creating an update statement. I have two different collections. I need to update one collection's values with the results from my aggregate query where the ids match.
Here is my aggregate query that gives me the id for the other collection and the value I need to set it to:
db.ResultsCollection.aggregate([
  { $group: { _id: "$SystemId", "maxValue": { $max: "$LastModified" } } }
]);
How do I loop through the other collection with this data and update where the _id matches the SystemId from my aggregate?
UPDATED CODE:
db.ResultsCollection.aggregate([
  { $group: { _id: "$SystemId", "maxValue": { $max: "$LastModified" } } }
]).forEach(function (doc) {
  db.CollectionToUpdate.updateOne(
    { _id: doc._id },
    { $set: { UpdateDate: doc.maxValue } },
    { upsert: false }
  );
});
My updated code does not generate a syntax error, but does not update the results when I refresh.

MongoDB update collection's data

I am trying to update MongoDB data using this code:
db.medicines.update({"_id":"586a048e34e5c12614a7424a"}, {$set: {amount:'3'}})
but unfortunately the query does not match the selector "_id":"586a048e34e5c12614a7424a", even though it exists.
It succeeds when I change the key to another one, like name, rate, etc.
Is there a special way to use update with the _id parameter?
Thanks ahead.
_id is the unique ObjectId that MongoDB generates for every document before inserting it. The query didn't work because _id is an ObjectId and "586a048e34e5c12614a7424a" is a String. You need to wrap the value with ObjectId().
If you're using a MongoDB shell query:
db.medicines.update({
  "_id": ObjectId("586a048e34e5c12614a7424a")
}, {
  $set: {
    amount: '3'
  }
});
If you are using Mongoose, you can use findByIdAndUpdate, which takes the id itself rather than a filter object (here Medicine stands for the Mongoose model backed by the medicines collection):
Medicine.findByIdAndUpdate("586a048e34e5c12614a7424a", {
  $set: {
    amount: '3'
  }
});

How to limit number of updating documents in mongodb

How can I implement something similar to db.collection.find().limit(10), but while updating documents?
Right now I'm using something really crappy: getting documents with db.collection.find().limit() and then updating them.
In general, I want to return a given number of records and change one field in each of them.
Thanks.
You can use:
db.collection.find().limit(NUMBER_OF_ITEMS_YOU_WANT_TO_UPDATE).forEach(
  function (e) {
    e.fieldToChange = "blah";
    // ... any other changes to e
    db.collection.save(e);
  }
);
(Credits for forEach code: MongoDB: Updating documents using data from the same document)
What this will do is only change the number of entries you specify. So if you want to add a field called "newField" with value 1 to only half of your entries inside "collection", for example, you can put in
db.collection.find().limit(db.collection.count() / 2).forEach(
  function (e) {
    e.newField = 1;
    db.collection.save(e);
  }
);
If you then want to make the other half also have "newField" but with value 2, you can do an update with the condition that newField doesn't exist:
db.collection.update( { newField : { $exists : false } }, { $set : { newField : 2 } }, {multi : true} );
Using forEach to individually update each document is slow. You can update the documents in bulk using
ids = db.collection.find(<condition>).limit(<limit>).map(
  function (doc) {
    return doc._id;
  }
);
db.collection.updateMany({_id: {$in: ids}}, <update>)
The solutions that iterate over all objects then update them individually are very slow.
Retrieving them all then updating simultaneously using $in is more efficient.
ids = People.where(firstname: 'Pablo').limit(10000).only(:_id).to_a.map(&:id)
People.in(_id: ids).update_all(lastname: 'Cantero')
The query is written using Mongoid, but can be easily rewritten in Mongo Shell as well.
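For reference, a rough Mongo Shell sketch of the same idea; the people collection and the firstname/lastname fields are assumed from the Mongoid example above:
// Collect the _ids of the first 10000 matches, then update them all in one call.
var ids = db.people.find({ firstname: "Pablo" }, { _id: 1 })
                   .limit(10000)
                   .map(function (doc) { return doc._id; });

db.people.updateMany({ _id: { $in: ids } }, { $set: { lastname: "Cantero" } });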
Unfortunately the workaround you have is the only way to do it AFAIK. There is a boolean flag multi which will either update all the matches (when true) or update the 1st match (when false).
As the answer above states, there is still no way to limit the number of documents to update (or delete) to a value > 1. A workaround is to use something like:
db.collection.find(<condition>).limit(<limit>).forEach(function (doc) {
  db.collection.update({_id: doc._id}, {<your update>});
});
If your id is a sequence number and not an ObjectId you can do this in a for loop:
let batchSize = 10;
for (let i = 0; i <= 1000000; i += batchSize) {
  db.collection.update(
    {$and: [{"_id": {$lte: i + batchSize}}, {"_id": {$gt: i}}]},
    {<your update>}
  );
}
Another option is to fetch the distinct values of a key, slice that array down to the number you want, and then update the documents matching those values with $in:
let fetchStandby = await db.model.distinct("key", {});
fetchStandby = fetchStandby.slice(0, no_of_docs_to_be_updated);
let fetch = await db.model.updateMany(
  { key: { $in: fetchStandby } },
  { $set: { "qc.status": "pending" } }
);
I also recently wanted something like this. I think querying for a long list of _ids just to update them with $in is perhaps slow too, so I tried an aggregation + $merge instead:
while (true) {
  const record = db.records.findOne({ isArchived: false }, { _id: 1 });
  if (!record) {
    print("No more records");
    break;
  }
  db.records.aggregate([
    { $match: { isArchived: false } },
    { $limit: 100 },
    {
      $project: {
        _id: 1,
        isArchived: { $literal: true },
        updatedAt: { $literal: new Date() }
      }
    },
    {
      $merge: {
        into: "records",
        on: "_id",
        whenMatched: "merge"
      }
    }
  ]);
  print("Done update");
}
But feel free to comment on whether this is better or worse than a bulk update with $in.