In MongoDB, how do you remove an array element by its index? - mongodb

In the following example, assume the document is in the db.people collection.
How do you remove the 3rd element of the interests array by its index?
{
    "_id" : ObjectId("4d1cb5de451600000000497a"),
    "name" : "dannie",
    "interests" : [
        "guitar",
        "programming",
        "gadgets",
        "reading"
    ]
}
This is my current solution:
var interests = db.people.findOne({"name":"dannie"}).interests;
interests.splice(2,1)
db.people.update({"name":"dannie"}, {"$set" : {"interests" : interests}});
Is there a more direct way?

There is no direct way of pulling/removing by array index. In fact, this is an open issue, http://jira.mongodb.org/browse/SERVER-1014 , and you may vote for it.
The workaround is using $unset and then $pull:
db.people.update({"name":"dannie"}, {$unset : {"interests.2" : 1 }})  // null out the 3rd element (index 2)
db.people.update({"name":"dannie"}, {$pull : {"interests" : null}})   // remove the null placeholder
Update: as mentioned in some of the comments, this approach is not atomic and can cause race conditions if other clients read and/or write between the two operations. If we need the operation to be atomic, we could:
1. Read the document from the database
2. Update the document and remove the item in the array
3. Replace the document in the database. To ensure the document has not changed since we read it, we can use the "update if current" pattern described in the mongo docs, as sketched below.
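A minimal sketch of that read-modify-replace flow, using the collection and field names from the question (retry handling is left out):
var doc = db.people.findOne({ "name": "dannie" });
var updated = doc.interests.slice();   // copy the array client-side
updated.splice(2, 1);                  // drop the 3rd element (index 2)
// Only write back if the array is still exactly what we read:
var res = db.people.update(
    { "_id": doc._id, "interests": doc.interests },
    { "$set": { "interests": updated } }
);
// If res.nModified is 0, another client changed the document in the meantime; re-read and retry.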

You can use the $pull modifier of the update operation for removing a particular element in an array. For the example you provided, the query would look like this:
db.people.update({"name":"dannie"}, {'$pull': {"interests": "guitar"}})
Also, you may consider using $pullAll for removing all occurrences. More about this on the official documentation page: http://www.mongodb.org/display/DOCS/Updating#Updating-%24pull
This doesn't use the index as a criterion for removing an element, but it still might help in cases similar to yours. IMO, using indexes for addressing elements inside an array is not very reliable, since MongoDB isn't consistent about element order as far as I know.
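For completeness, a small sketch of $pullAll on the question's document (the values to remove are illustrative):
db.people.update(
    { "name": "dannie" },
    { "$pullAll": { "interests": ["guitar", "reading"] } }
);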

In MongoDB 4.2 you can do this:
db.example.update({}, [
    { $set: { field: {
        $concatArrays: [
            { $slice: ["$field", P] },
            { $slice: ["$field", { $add: [1, P] }, { $size: "$field" }] }
        ]
    } } }
]);
P is the index of the element you want to remove from the array.
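For example, applied to the question's document with P = 2 (the 3rd element), this could look roughly like:
db.people.update({ "name": "dannie" }, [
    { $set: { interests: {
        $concatArrays: [
            { $slice: ["$interests", 2] },
            { $slice: ["$interests", 3, { $size: "$interests" }] }
        ]
    } } }
]);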
If you want to remove everything from index P to the end:
db.example.update({}, [
    { $set: { field: { $slice: ["$field", P] } } }
]);

Starting in Mongo 4.4, the $function aggregation operator allows applying a custom javascript function to implement behaviour not supported by the MongoDB Query Language.
For instance, in order to update an array by removing an element at a given index:
// { "name": "dannie", "interests": ["guitar", "programming", "gadgets", "reading"] }
db.collection.update(
{ "name": "dannie" },
[{ $set:
{ "interests":
{ $function: {
body: function(interests) { interests.splice(2, 1); return interests; },
args: ["$interests"],
lang: "js"
}}
}
}]
)
// { "name": "dannie", "interests": ["guitar", "programming", "reading"] }
$function takes 3 parameters:
body, which is the function to apply, whose parameter is the array to modify. The function here simply uses splice to remove 1 element at index 2.
args, which contains the fields from the record that the body function takes as parameters. In our case, "$interests".
lang, which is the language in which the body function is written. Only js is currently available.
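If you would rather not hard-code the index inside the function body, it can also be passed as an extra argument; a small sketch (idx is a client-side variable introduced for illustration):
var idx = 2;
db.collection.update(
    { "name": "dannie" },
    [{ $set: { "interests": { $function: {
        body: function(interests, i) { interests.splice(i, 1); return interests; },
        args: ["$interests", idx],   // idx is sent as a literal expression
        lang: "js"
    }}}}]
)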

Rather than using $unset (as in the accepted answer), I solve this by setting the field to a unique value (i.e. not NULL) and then immediately pulling that value. A little safer from an async perspective. Here is the code:
var update = {};
var key = "ToBePulled_" + new Date().toString();    // a sentinel value unlikely to collide with real data
update['feedback.' + index] = key;                  // overwrite the element at the target index
Venues.update(venueId, {$set: update});
return Venues.update(venueId, {$pull: {feedback: key}});   // then pull that sentinel value
Hopefully mongo will address this, perhaps by extending the $position modifier to support $pull as well as $push.
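The same idea in the plain mongo shell, against the question's collection (the index value is illustrative):
var index = 2;
var key = "ToBePulled_" + new Date().toString();
var setter = {};
setter["interests." + index] = key;
db.people.update({ "name": "dannie" }, { $set: setter });
db.people.update({ "name": "dannie" }, { $pull: { interests: key } });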

I would recommend using a GUID (I tend to use ObjectID) field, or an auto-incrementing field for each sub-document in the array.
With this GUID it is easy to issue a $pull and be sure that the correct one will be pulled. Same goes for other array operations.
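A short sketch of that approach; note it changes the array elements from plain strings to sub-documents, so the field names here are only illustrative:
var subId = new ObjectId();   // generated client-side so the id can be kept around
db.people.update({ "name": "dannie" }, { $push: { interests: { _id: subId, value: "guitar" } } });
// Later, remove exactly that element regardless of its position:
db.people.update({ "name": "dannie" }, { $pull: { interests: { _id: subId } } });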

For people searching for an answer using Mongoose with Node.js, this is how I do it.
exports.deletePregunta = function (req, res) {
    let codTest = req.params.tCodigo;
    let indexPregunta = req.body.pregunta; // the index that comes from the frontend
    let inPregunta = `tPreguntas.0.pregunta.${indexPregunta}`; // my field in my db
    let inOpciones = `tPreguntas.0.opciones.${indexPregunta}`; // my other field in my db
    let inTipo = `tPreguntas.0.tipo.${indexPregunta}`; // my other field in my db
    Test.findOneAndUpdate({ tCodigo: codTest },
        {
            '$unset': {
                [inPregunta]: 1, // computed property names, hence the []
                [inOpciones]: 1,
                [inTipo]: 1
            }
        }).then(() => {
            Test.findOneAndUpdate({ tCodigo: codTest }, {
                '$pull': {
                    'tPreguntas.0.pregunta': null,
                    'tPreguntas.0.opciones': null,
                    'tPreguntas.0.tipo': null
                }
            }).then(testModificado => {
                if (!testModificado) {
                    res.status(404).send({ accion: 'deletePregunta', message: 'The question could not be deleted' });
                } else {
                    res.status(200).send({ accion: 'deletePregunta', message: 'Question deleted successfully' });
                }
            })
        }).catch(err => { res.status(500).send({ accion: 'deletePregunta', message: 'database error: ' + err }); });
}
I can rewrite this answer if it is not clear enough, but I think it is okay.
Hope this helps you; I lost a lot of time facing this issue.

It is a little bit late, but some who are using Robo 3T may find this useful:
db.getCollection('people').update(
    { "name": "dannie" },
    { $pull:
        {
            interests: "guitar" // the value you want to remove
        }
    },
    { multi: true }
);
If you have values something like this:
property: [
    {
        "key" : "key1",
        "value" : "value 1"
    },
    {
        "key" : "key2",
        "value" : "value 2"
    },
    {
        "key" : "key3",
        "value" : "value 3"
    }
]
and you want to delete the element where the key is "key3", then you can use something like this:
db.getCollection('people').update(
    { "name": "dannie" },
    { $pull:
        {
            property: { key: "key3" } // match the sub-document to remove
        }
    },
    { multi: true }
);
The same approach works for nested properties.

This can be done using the $pop operator; note that it only removes the first (pass -1) or last (pass 1) element of the array, not an arbitrary index:
db.getCollection('collection_name').updateOne( {}, {$pop: {"path_to_array_object":1}})
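For instance, on the question's collection (illustrative only, since $pop cannot target an arbitrary index):
db.getCollection('people').updateOne({ "name": "dannie" }, { $pop: { "interests": 1 } });   // removes "reading" (last element)
db.getCollection('people').updateOne({ "name": "dannie" }, { $pop: { "interests": -1 } });  // removes "guitar" (first element)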

Related

In mongodb know index of array element matched with $in operator?

I am using aggregation with MongoDB and I am facing a problem: I am trying to match documents whose _id is present in my input array using the $in operator. Now I want to know the index of the matched element within the input array. Can anyone please tell me how I can do that?
My code
var coupons_ids = ["58455a5c1f65d363bd5d2600", "58455a5c1f65d363bd5d2601", "58455a5c1f65d363bd5d2602"]
couponmodel.aggregate(
    { $match : { '_id': { $in : coupons_ids } } },
    /* Here I want to know the index of the coupons_ids element that matched, because I want to perform some operation on it in the code below */
    function(err, docs) {
        if (err) {
        } else {
        }
    });
Couponmodel Schema
var CouponSchema = new Schema({
category: {type: String},
coupon_name: {type: String}, // this is a string
});
UPDATE -
As suggested by user3124885, aggregation is not better in performance. Can anyone please tell me the performance difference between aggregation and a normal query in MongoDB, and which one is better?
Update -
I read this question on SO: mongodb-aggregation-match-vs-find-speed. There the user himself commented that both take the same time, but after seeing vlad-z's answer I think aggregation is better. If any of you have worked with MongoDB, please tell me your opinion about this.
UPDATE -
I used sample JSON data containing 30,000 rows and tried a match with aggregation vs. a find query: the aggregation executed in 180 ms where the find query took 220 ms. I also ran $lookup and it took no more than 500 ms, so I think aggregation is a bit faster than a normal query. Please correct me if any of you have tried aggregation, and if not, then why?
UPDATE -
I read a post where a user used the code below as a replacement for $zip (SERVER-20163), but I am not getting how I can solve my problem using it. Can anybody please tell me how I can use the code below to solve my issue?
{$map: {
    input: {
        elt1: "$array1",
        elt2: "$array2"
    },
    in: ["$elt1", "$elt2"]
}}
Can anyone please help me? It would be a great favor.
So say we have the following in the database collection:
> db.couponmodel.find()
{ "_id" : "a" }
{ "_id" : "b" }
{ "_id" : "c" }
{ "_id" : "d" }
and we wish to search for the following ids in the collections
var coupons_ids = ["c", "a" ,"z"];
We'll then have to build up a dynamic projection stage so that we can project the correct indexes, so we'll have to map each id to its corresponding index:
var conditions = coupons_ids.map(function(value, index){
    return { $cond: { if: { $eq: ['$_id', value] }, then: index, else: -1 } };
});
We can then inject this into our aggregation pipeline:
db.couponmodel.aggregate([
    { $match : { '_id' : { $in : coupons_ids } } },
    { $project: { indexes : conditions } },
    { $project: {
        index : {
            $filter: {
                input: "$indexes", as: "indexes", cond: { $ne: [ "$$indexes", -1 ] }
            }
        }
    }},
    { $unwind: '$index' }
]);
Running the above will now output each _id and its corresponding index within the coupons_ids array:
{ "_id" : "a", "index" : 1 }
{ "_id" : "c", "index" : 0 }
However, we can also add more stages to the end of the pipeline and reference $index to get the current matched index.
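For instance, a hedged sketch of using the computed index in a later stage (MongoDB 3.4+ for $addFields), here to look the original value back up in the client-side coupons_ids array; the field name matchedCoupon is made up for illustration:
db.couponmodel.aggregate([
    { $match: { '_id': { $in: coupons_ids } } },
    { $project: { indexes: conditions } },
    { $project: { index: { $filter: {
        input: "$indexes", as: "idx", cond: { $ne: ["$$idx", -1] }
    } } } },
    { $unwind: '$index' },
    { $addFields: { matchedCoupon: { $arrayElemAt: [coupons_ids, "$index"] } } }
]);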
I think you could do it faster by simply retrieving the array and searching it manually. Remember that aggregation doesn't automatically give you better performance.
// $match with $in and $and
db.collection.aggregate([{
    $match: {
        $and: [
            { "uniqueID": { $in: ["CONV0001"] } },
            { "parentID": { $in: ["null"] } }
        ]
    }
}])

Rename a sub-document field within an Array

Considering the document below, how can I rename 'techId1' to 'techId'? I've tried different ways and can't get it to work.
{
    "_id" : ObjectId("55840f49e0b"),
    "__v" : 0,
    "accessCard" : "123456789",
    "checkouts" : [
        {
            "user" : ObjectId("5571e7619f"),
            "_id" : ObjectId("55840f49e0bf"),
            "date" : ISODate("2015-06-19T12:45:52.339Z"),
            "techId1" : ObjectId("553d9cbcaf")
        },
        {
            "user" : ObjectId("5571e7619f15"),
            "_id" : ObjectId("55880e8ee0bf"),
            "date" : ISODate("2015-06-22T13:01:51.672Z"),
            "techId1" : ObjectId("55b7db39989")
        }
    ],
    "created" : ISODate("2015-06-19T12:47:05.422Z"),
    "date" : ISODate("2015-06-19T12:45:52.339Z"),
    "location" : ObjectId("55743c8ddbda"),
    "model" : "model1",
    "order" : ObjectId("55840f49e0bf"),
    "rid" : "987654321",
    "serialNumber" : "AHSJSHSKSK",
    "user" : ObjectId("5571e7619f1"),
    "techId" : ObjectId("55b7db399")
}
In the mongo console I tried the following, which returns ok but nothing is actually updated.
collection.update({"checkouts._id":ObjectId("55840f49e0b")},{ $rename: { "techId1": "techId" } });
I also tried this, which gives me an error: "cannot use the part (checkouts of checkouts.techId1) to traverse the element"
collection.update({"checkouts._id":ObjectId("55856609e0b")},{ $rename: { "checkouts.techId1": "checkouts.techId" } })
In mongoose I have tried the following.
collection.findByIdAndUpdate(id, { $rename: { "checkouts.techId1": "checkouts.techId" } }, function (err, data) {});
and
collection.update({'checkouts._id': n1._id}, { $rename: { "checkouts.$.techId1": "checkouts.$.techId" } }, function (err, data) {});
Thanks in advance.
You were close at the end, but there are a few things missing. You cannot $rename when using the positional operator; instead you need to $set the new name and $unset the old one. But there is another restriction here: since both paths share "checkouts" as a parent path, you cannot do both in the same update.
The other core line in your question is "traverse the element" and that is the one thing you cannot do in updating "all" of the array elements at once. Well, not safely and without possibly overwriting new data coming in anyway.
What you need to do is "iterate" each document and similarly iterate each array member in order to "safely" update. You cannot really iterate just the document and "save" the whole array back with alterations. Certainly not in the case where anything else is actively using the data.
I personally would run this sort of operation in the MongoDB shell if you can, as it is a "one off" ( hopefully ) thing and this saves the overhead of writing other API code. Also we're using the Bulk Operations API here to make this as efficient as possible. With mongoose it takes a bit more digging to implement, but still can be done. But here is the shell listing:
var bulk = db.collection.initializeOrderedBulkOp(),
    count = 0;

db.collection.find({ "checkouts.techId1": { "$exists": true } }).forEach(function(doc) {
    doc.checkouts.forEach(function(checkout) {
        if ( checkout.hasOwnProperty("techId1") ) {
            bulk.find({ "_id": doc._id, "checkouts._id": checkout._id }).updateOne({
                "$set": { "checkouts.$.techId": checkout.techId1 }
            });
            bulk.find({ "_id": doc._id, "checkouts._id": checkout._id }).updateOne({
                "$unset": { "checkouts.$.techId1": 1 }
            });
            count += 2;

            if ( count % 500 == 0 ) {
                bulk.execute();
                bulk = db.collection.initializeOrderedBulkOp();
            }
        }
    });
});

if ( count % 500 !== 0 )
    bulk.execute();
Since the $set and $unset operations happen in pairs, we keep the batch size to 500 operations per .execute() call just to keep memory usage on the client down.
The loop simply looks for documents where the field to be renamed "exists" and then iterates each array element of each document and commits the two changes. As Bulk Operations, these are not sent to the server until the .execute() is called, where also a single response is returned for each call. This saves a lot of traffic.
If you insist on coding with mongoose, be aware that a .collection accessor is required to get to the Bulk API methods from the core driver, like this:
var bulk = Model.collection.initializeOrderedBulkOp();
And the only thing that actually sends anything to the server is the .execute() method, so this is your only execution callback:
bulk.execute(function(err,response) {
    // code body and async iterator callback here
});
And use async flow control such as async.each instead of .forEach().
If you do that, be aware that as a raw driver method not governed by mongoose, you do not get the same database connection awareness as you do with mongoose methods. Unless you know for sure the database connection is already established, it is safer to put this code within an event callback for the server connection:
mongoose.connection.on("open", function(err) {
// body of code
});
But otherwise those are the only real alterations (apart from call syntax) you really need.
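To make that concrete, here is a rough, hedged sketch of the mongoose variant described above; the model name "Asset" and the use of the async library are assumptions for illustration only:
var async = require("async"),
    mongoose = require("mongoose"),
    Asset = require("./models/asset");   // hypothetical model owning the checkouts array

mongoose.connection.on("open", function() {
    var bulk = Asset.collection.initializeOrderedBulkOp();

    Asset.find({ "checkouts.techId1": { "$exists": true } }).lean().exec(function(err, docs) {
        if (err) throw err;

        async.each(docs, function(doc, callback) {
            doc.checkouts.forEach(function(checkout) {
                if (checkout.techId1 !== undefined) {
                    // queue the rename as a $set/$unset pair, same as the shell listing above
                    bulk.find({ "_id": doc._id, "checkouts._id": checkout._id })
                        .updateOne({ "$set": { "checkouts.$.techId": checkout.techId1 } });
                    bulk.find({ "_id": doc._id, "checkouts._id": checkout._id })
                        .updateOne({ "$unset": { "checkouts.$.techId1": 1 } });
                }
            });
            callback();
        }, function(err) {
            if (err) throw err;
            bulk.execute(function(err, result) {   // the only call that actually hits the server
                if (err) throw err;
                console.log("rename complete");
            });
        });
    });
});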
This worked for me. I created this query to perform the procedure and I am sharing it (although I know it is not the most optimized way):
First, make an aggregation that (1) $matches the documents that have the checkouts array field with techId1 as one of the keys of each sub-document, (2) $unwinds the checkouts field (deconstructing the array field into one output document per element), (3) adds the techId field (with $addFields), (4) removes the old techId1 field (with a $project exclusion), (5) $groups the documents by _id so the checkout sub-documents are grouped again, and (6) writes the result of the aggregation to a temporary collection (with $out).
const collection = 'yourCollection'
db[collection].aggregate([
{
$match: {
'checkouts.techId1': { '$exists': true }
}
},
{
$unwind: {
path: '$checkouts'
}
},
{
$addFields: {
'checkouts.techId': '$checkouts.techId1'
}
},
{
$project: {
'checkouts.techId1': 0
}
},
{
$group: {
'_id': '$_id',
'checkouts': { $push: { 'techId': '$checkouts.techId' } }
}
},
{
$out: 'temporal'
}
])
Then, you can run another aggregation on this temporary collection to $merge the documents with the modified checkouts field back into your original collection.
db.temporal.aggregate([
{
$merge: {
into: collection,
on: "_id",
whenMatched:"merge",
whenNotMatched: "insert"
}
}
])

MongoDB conditionally $addToSet sub-document in array by specific field

Is there a way to conditionally $addToSet based on a specific key field in a subdocument on an array?
Here's an example of what I mean - given the collection produced by the following sample bootstrap;
cls
db.so.remove();
db.so.insert({
"Name": "fruitBowl",
"pfms" : [
{
"n" : "apples"
}
]
});
n defines a unique document key. I only want one entry with the same n value in the array at any one time. So I want to be able to update the pfms array using n so that I end up with just this;
{
"Name": "fruitBowl",
"pfms" : [
{
"n" : "apples",
"mState": 1111234
}
]
}
Here's where I am at the moment;
db.so.update({
"Name": "fruitBowl",
},{
// not allowed to do this of course
// "$pull": {
// "pfms": { n: "apples" },
// },
"$addToSet": {
"pfms": {
"$each": [
{
"n": "apples",
"mState": 1111234
}
]
}
}
}
)
Unfortunately, this adds another array element;
db.so.find().toArray();
[
{
"Name" : "fruitBowl",
"_id" : ObjectId("53ecfef5baca2b1079b0f97c"),
"pfms" : [
{
"n" : "apples"
},
{
"n" : "apples",
"mState" : 1111234
}
]
}
]
I need to effectively upsert the apples document, matching on n as the unique identifier, and just set mState whether or not an entry already exists. It's a shame I can't do a $pull and $addToSet in the same update (I tried).
What I really need here is dictionary semantics, but that's not an option right now, nor is breaking out the document - can anyone come up with another way?
FWIW - the existing format is a result of language/driver serialization, I didn't choose it exactly.
further
I've gotten a little further: in the case where I know the array element already exists, I can do this;
db.so.update({
"Name": "fruitBowl",
"pfms.n": "apples",
},{
$set: {
"pfms.$.mState": 1111234,
},
}
)
But of course that only works;
for a single array element
as long as I know it exists
The first limitation isn't a disaster, but if I can't effectively upsert or combine $addToSet with the previous $set (which of course I can't), then the only workarounds I can think of for now mean two DB round-trips.
The $addToSet operator of course requires that the "whole" document being "added to the set" is in fact unique, so you cannot change "part" of the document or otherwise consider it to be a "partial match".
You stumbled on to your best approach using $pull to remove any element with the "key" field that would result in "duplicates", but of course you cannot modify the same path in different update operators like that.
So the closest thing you will get is issuing separate operations but also doing that with the "Bulk Operations API" which is introduced with MongoDB 2.6. This allows both to be sent to the server at the same time for the closest thing to a "contiguous" operations list you will get:
var bulk = db.so.initializeOrderedBulkOp();
bulk.find({ "Name": "fruitBowl", "pfms.n": "apples" }).updateOne({
    "$pull": { "pfms": { "n": "apples" } }
});
bulk.find({ "Name": "fruitBowl" }).updateOne({
    "$push": { "pfms": { "n": "apples", "mState": 1111234 } }
});
bulk.execute();
That pretty much is your best approach if it is not possible or practical to move the elements to another collection and rely on "upserts" and $set in order to have the same functionality but on a collection rather than array.
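For reference, a tiny sketch of that last alternative, with the array elements moved to their own collection; the collection name "pfms" and the owner field are made up for illustration:
db.pfms.update(
    { "owner": "fruitBowl", "n": "apples" },
    { "$set": { "mState": 1111234 } },
    { "upsert": true }
);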
I have faced the exact same scenario: I was inserting and removing likes from a post.
What I did is use the mongoose findOneAndUpdate function (which is similar to update or findAndModify in mongodb).
The key concept is
Insert when the field is not present
Delete when the field is present
The insert is
findOneAndUpdate(
    { _id: theId, 'likes.userId': { $ne: theUserId } },
    { $push: { likes: { userId: theUserId, createdAt: new Date() } } },
    { 'new': true },
    function(err, post) { /* do the needful */ }
);
The delete is
findOneAndUpdate(
    { _id: theId, 'likes.userId': theUserId },
    { $pull: { likes: { userId: theUserId } } },
    { 'new': true },
    function(err, post) { /* do the needful */ }
);
This makes the whole operation atomic, and there are no duplicates with respect to the userId field.
I hope this helps. If you have any query, feel free to ask.
As far as I know, MongoDB now allows update operations to use arrayFilters (since v3.6) and aggregation pipelines (since v4.2).
A more or less elegant way to make it work (according to the question), using arrayFilters, looks like the following:
db.runCommand({
update: "your-collection-name",
updates: [
{
q: {},
u: {
$set: {
"pfms.$[elem]": {
"n":"apples",
"mState": NumberInt(1111234)
}
}
},
arrayFilters: [
{
"elem.n": {
$eq: "apples"
}
}
],
multi: true
}
]
})
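The same update expressed through the collection helper, which may be easier to read (a sketch under the same assumptions as above):
db.so.update(
    { },
    { $set: { "pfms.$[elem]": { "n": "apples", "mState": NumberInt(1111234) } } },
    { arrayFilters: [ { "elem.n": "apples" } ], multi: true }
);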
In my scenario, the data needs to be initialized when it does not exist, updated when it does, and is never deleted. If your data has these states, you might want to try the following method.
// Mongoose, but mostly the same as plain mongodb
// Update the tag on the user, if one already exists
const user = await UserModel.findOneAndUpdate(
{
user: userId,
'tags.name': tag_name,
},
{
$set: {
'tags.$.description': tag_description,
},
}
)
.lean()
.exec();
// Add a default tag to user
if (user == null) {
await UserModel.findOneAndUpdate(
{
user: userId,
},
{
$push: {
tags: new Tag({
name: tag_name,
description: tag_description,
}),
},
}
);
}
This is the cleanest and fastest method for this scenario.
As a business analyst, I had the same problem and hopefully I have a solution to this after hours of investigation.
// The customer document:
{
"id" : "1212",
"customerCodes" : [
{
"code" : "I"
},
{
"code" : "YK"
}
]
}
// The problem: I want to insert dateField "01.01.2016" into customer documents where the customerCodes subdocument list has a document with code "YK" but no dateField. The final document must be as follows:
{
"id" : "1212",
"customerCodes" : [
{
"code" : "I"
},
{
"code" : "YK" ,
"dateField" : "01.01.2016"
}
]
}
// The solution code is in three steps:
// PART 1 - Find the customers with customerCodes "YK" but without dateField
// PART 2 - Find the index of the subdocument with "YK" in the customerCodes list
// PART 3 - Insert the value into the document
// Here is the code
// PART 1
var myCursor = db.customers.find({ customerCodes:{$elemMatch:{code:"YK", dateField:{ $exists:false} }}});
// PART 2
myCursor.forEach(function(customer){
if(customer.customerCodes != null )
{
var size = customer.customerCodes.length;
if( size > 0 )
{
var iFoundTheIndexOfSubDocument= -1;
var index = 0;
customer.customerCodes.forEach( function(clazz)
{
if( clazz.code == "YK" && clazz.dateField == null )
{
iFoundTheIndexOfSubDocument = index;
}
index++;
})
// PART 3
// What happens here is: if I found the index of the
// "YK" subdocument, I create an "updates" document which
// corresponds to the new data to be inserted
//
if( iFoundTheIndexOfSubDocument != -1 )
{
var toSet = "customerCodes."+ iFoundTheIndexOfSubDocument +".dateField";
var updates = {};
updates[toSet] = "01.01.2016";
db.customers.update({ "id" : customer.id } , { $set: updates });
// This statement is actually interpreted like this :
// db.customers.update({ "id" : "1212" } ,{ $set: customerCodes.0.dateField : "01.01.2016" });
}
}
}
});
Have a nice day!

Pull and addtoset at the same time with mongo

I have a collection which elements can be simplified to this:
{tags : [1, 5, 8]}
where there would be at least one element in the array and all of them should be different. I want to substitute one tag for another, and I thought that would not be a problem. So I came up with the following query:
db.colll.update({
tags : 1
},{
$pull: { tags: 1 },
$addToSet: { tags: 2 }
}, {
multi: true
})
Cool, so it will find all elements which have a tag that I do not need (1), remove it, and add another (2) if it is not there already. The problem is that I get an error:
"Cannot update 'tags' and 'tags' at the same time"
which basically means that I cannot do $pull and $addToSet at the same time. Is there any other way I can do this?
Of course I can remember all the IDs of the matching elements and then remove the tag and add the new one in separate queries, but this does not sound nice.
The error means pretty much what it says: you cannot act on two things on the same "path" in the same update operation. The two operators you are using do not process sequentially as you might think they do.
You can do this about as "sequentially" as you can possibly get with the "bulk" operations API or another form of "bulk" update, though. Within reason of course, and also in reverse:
var bulk = db.coll.initializeOrderedBulkOp();
bulk.find({ "tags": 1 }).updateOne({ "$addToSet": { "tags": 2 } });
bulk.find({ "tags": 1 }).updateOne({ "$pull": { "tags": 1 } });
bulk.execute();
There is no guarantee that nothing else will try to modify the document in between, but it is as close as you will currently get.
Also see the raw "update" command with multiple documents.
If you're removing and adding at the same time, you may be modeling a 'map', instead of a 'set'. If so, an object may be less work than an array.
Instead of data as an array:
{ _id: 'myobjectwithdata',
data: [{ id: 'data1', important: 'stuff'},
{ id: 'data2', important: 'more'}]
}
Use data as an object:
{ _id: 'myobjectwithdata',
data: { data1: { important: 'stuff'},
data2: { important: 'more'} }
}
The one-command update is then:
db.coll.update(
    { _id: 'myobjectwithdata' },
    { $set: { 'data.data1': { important: 'treasure' } } }
);
The hard brain work for this answer was done here and here.
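To tie it back to the question: with the "map" shape, removing one entry and adding another can be done in a single update, because the two operators touch different paths (data.data3 is a made-up key for illustration):
db.coll.update(
    { _id: 'myobjectwithdata' },
    {
        $unset: { 'data.data1': "" },
        $set:   { 'data.data3': { important: 'more stuff' } }
    }
);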
Starting in Mongo 4.4, the $function aggregation operator allows applying a custom javascript function to implement behaviour not supported by the MongoDB Query Language.
Coupled with the improvement made to db.collection.update() in Mongo 4.2 that lets it accept an aggregation pipeline (allowing the update of a field based on its own value), we can manipulate and update an array in ways the language doesn't easily permit:
// { "tags" : [ 1, 5, 8 ] }
db.collection.updateMany(
{ tags: 1 },
[{ $set:
{ "tags":
{ $function: {
body: function(tags) { tags.push(2); return tags.filter(x => x != 1); },
args: ["$tags"],
lang: "js"
}}
}
}]
)
// { "tags" : [ 5, 8, 2 ] }
$function takes 3 parameters:
body, which is the function to apply, whose parameter is the array to modify. The function here simply pushes 2 to the array and filters out 1.
args, which contains the fields from the record that the body function takes as parameters. In our case, "$tags".
lang, which is the language in which the body function is written. Only js is currently available.
In case you need to replace one value in an array with another, check this answer:
Replace array value using arrayFilters
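For reference, a minimal sketch of that arrayFilters approach for this question's tags example (MongoDB 3.6+); note that unlike $addToSet it does not deduplicate if 2 is already present:
db.collection.updateMany(
    { tags: 1 },
    { $set: { "tags.$[t]": 2 } },
    { arrayFilters: [ { "t": 1 } ] }
);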

MongoDB: Too many positional (i.e. '$') elements found in path

I just upgraded to Mongo 2.6.1 and one update statement that was working before is now returning an error. The update statement is:
db.post.update( { 'answers.comments.name': 'jeff' },
{ '$set': {
'answers.$.comments.$.name': 'joe'
}},
{ multi: true }
)
The error I get is:
WriteResult({
"nMatched" : 0,
"nUpserted" : 0,
"nModified" : 0,
"writeError" : {
"code" : 2,
"errmsg" : "Too many positional (i.e. '$') elements found in path 'answers.$.comments.$.createUsername'"
}
})
When I update an element just one level deep instead of two (i.e. answers.$.name instead of answers.$.comments.$.name), it works fine. If I downgrade my mongo instance below 2.6, it also works fine.
You CAN do this, you just need Mongo 3.6! Instead of redesigning your database, you could use the Array Filters feature introduced in Mongo 3.6, described here:
https://thecodebarbarian.com/a-nodejs-perspective-on-mongodb-36-array-filters
The beauty of this is that you can bind all matches in an array to a variable and then reference that variable later. For this question's documents, a sketch of such an update could look like the following (the filter identifier c is arbitrary):
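db.post.update(
    { "answers.comments.name": "jeff" },
    { "$set": { "answers.$[].comments.$[c].name": "joe" } },
    { "arrayFilters": [ { "c.name": "jeff" } ], "multi": true }
)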
Use arrayFilters.
MongoDB 3.5.12 extends all update modifiers to apply to all array elements or all array elements that match a predicate, specified in a new update option arrayFilters. This syntax also supports nested array elements.
Let us assume a scenario-
"access": {
"projects": [{
"projectId": ObjectId(...),
"milestones": [{
"milestoneId": ObjectId(...),
"pulses": [{
"pulseId": ObjectId(...)
}]
}]
}]
}
Now if you want to add a pulse to a milestone which exists inside a project
db.users.updateOne({
"_id": ObjectId(userId)
}, {
"$push": {
"access.projects.$[i].milestones.$[j].pulses": ObjectId(pulseId)
}
}, {
arrayFilters: [{
"i.projectId": ObjectId(projectId)
}, {
"j.milestoneId": ObjectId(milestoneId)
}]
})
For PyMongo, use array_filters like this:
db.users.update_one({
"_id": ObjectId(userId)
}, {
"$push": {
"access.projects.$[i].milestones.$[j].pulses": ObjectId(pulseId)
}
}, array_filters = [{
"i.projectId": ObjectId(projectId)
}, {
"j.milestoneId": ObjectId(milestoneId)
}])
Also, note the constraints on array filters:
Each array filter must be a predicate over a document with a single field name. Each array filter must be used in the update expression, and each array filter identifier $[<identifier>] must have a corresponding array filter. <identifier> must begin with a lowercase letter and not contain any special characters. There must not be two array filters with the same field name.
The positional operator can be used only once in a query. This is a limitation, there is an open ticket for improvement: https://jira.mongodb.org/browse/SERVER-831
As mentioned, more than one positional element is not supported for now. You can update with the MongoDB cursor.forEach() method instead:
db.post
    .find({ "answers.comments.name": "jeff" })
    .forEach(function(post) {
        if (post.answers) {
            post.answers.forEach(function(answer) {
                if (answer.comments) {
                    answer.comments.forEach(function(comment) {
                        if (comment.name === "jeff") {
                            comment.name = "joe";
                        }
                    });
                }
            });
            db.post.save(post);
        }
    });
db.post.update(
{ 'answers.comments.name': 'jeff' },
{ '$set': {
'answers.$[i].comments.$.name': 'joe'
}},
{arrayFilters: [ { "i.comments.name": { $eq: 'jeff' } } ]}
)
Check the path after answers to get the key path right.
I have faced the same issue. Updating an array nested inside another array has a big performance impact, so MongoDB does not support it with the positional operator. Consider redesigning your database as shown in the link below.
https://pythonolyk.wordpress.com/2016/01/17/mongodb-update-nested-array-using-positional-operator/
db.post.update( { 'answers.comments.name': 'jeff' },
{ '$set': {
'answers.$.comments.$.name': 'joe'
}},
{ multi: true }
)
The answer is to use explicit array indices:
db.post.update( { 'answers.comments.name': 'jeff' },
{ '$set': {
'answers.0.comments.1.name': 'joe'
}},
{ multi: true }
)