I have the following document
{
  store : {
    id : 'STORE1',
    name : 'Electronics Store',
    locations : [
      { id : 'LOC1', quantity : 4 },
      { id : 'LOC2', quantity : 10 },
      { id : 'LOC3', quantity : 5 }
    ]
  }
}
I want to update the quantity of multiple elements of the locations array, based on their id field, using $inc.
For example, I want to add +5 to the quantity field of id : 'LOC1' and +2 to the quantity field of id : 'LOC3'.
Is it possible to do this with one query in MongoDB, instead of using a separate query for each location?
You can make use of the filtered positional operator $[&lt;identifier&gt;]. The code below should help:
db.collection.updateOne(
    {},
    { $inc: { 'store.locations.$[elem1].quantity': 5, 'store.locations.$[elem2].quantity': 2 } },
    { arrayFilters: [ { 'elem1.id': 'LOC1' }, { 'elem2.id': 'LOC3' } ] }
)
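To make the effect of the filtered positional update concrete, here is a plain-JavaScript sketch (not MongoDB code) of what the update does to the sample document, with the increments mirroring the two arrayFilters conditions:

```javascript
// Sketch: simulate the arrayFilters $inc update on the sample document.
const doc = {
  store: {
    id: 'STORE1',
    name: 'Electronics Store',
    locations: [
      { id: 'LOC1', quantity: 4 },
      { id: 'LOC2', quantity: 10 },
      { id: 'LOC3', quantity: 5 },
    ],
  },
};

// One increment per matched identifier, as in the arrayFilters above.
const increments = { LOC1: 5, LOC3: 2 };
for (const loc of doc.store.locations) {
  if (loc.id in increments) {
    loc.quantity += increments[loc.id];
  }
}
// LOC1 becomes 9, LOC2 stays 10, LOC3 becomes 7.
```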
Related
I have a mongo document like this:
{
  "pw" : [
    { "id" : 123, "qty" : 10 },
    { "id" : 456, "qty" : 15 }
  ]
}
Both id and qty are of Number type in Atlas Search.
I want to search for id = 123 and qty > 5.
I used the equals and range operators for this, but it is not working. How can I set criteria at the document level (not at the array level)?
Can we make a query that lets us use skip and limit within the first N documents only?
Example:
Suppose there are about 500 documents in a collection called teachers.
I want a query that restricts me to reading the first 300 documents only.
If I use skip(300) in that query, it should return nothing.
db.teachers.find().pretty()
{ id : 1, name : "teach001" },
{ id : 2, name : "teach002" },
{ id : 3, name : "teach003" },
{ id : 4, name : "teach004" },
{ id : 5, name : "teach005" },
{ id : 6, name : "teach006" },
{ id : 7, name : "teach007" },
{ id : 8, name : "teach008" },
{ id : 9, name : "teach009" },
{ id : 10, name : "teach0010" }
db.teachers.find({ some query here to restrict access to the first 5 documents only }).skip(5).limit(5).pretty()
Aggregation
I don't think there is a way to do exactly what you are requesting. If you are open to using the aggregation framework, it can be accomplished easily.
db.teachers.aggregate([{$limit: 5}, {$skip: 5}])
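The stage order matters here: $limit before $skip enforces a hard cap, so skipping past the limit yields nothing. A plain-JavaScript sketch (using array slices on hypothetical data) illustrates the difference:

```javascript
// Sketch: $limit-then-$skip vs $skip-then-$limit, simulated with slices.
const docs = Array.from({ length: 10 }, (_, i) => ({ id: i + 1 }));

// limit 5, then skip 5: nothing remains past the cap.
const limitThenSkip = docs.slice(0, 5).slice(5);

// skip 5, then limit 5: returns documents 6..10 instead.
const skipThenLimit = docs.slice(5).slice(0, 5);
```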
View
If you are open to creating a view, you could even create a view that enforces the limit
db.createView('limitedTeachers', 'teachers', [{$limit: 5}])
Then you can use find on the view:
db.limitedTeachers.find({}).skip(5)
Two finds
If you cannot use aggregation, you have the option of using two finds in your query. First, find the _id values of the first n documents. Then limit your second query to only those _id values.
var ids = [];
db.teachers.find({}, {_id: 1}).limit(5).forEach(function(doc){
ids.push(doc._id);
});
db.teachers.find({ _id: {$in: ids} }).skip(5)
Or, as the same type of query, but closer to the format you had in your question:
db.teachers.find({$or: db.teachers.find({}, {_id: 1}).limit(5).toArray()}).skip(5)
I'm wondering if it is possible, with just one operation (or just one command), to update a document in MongoDB only if the value used in the update doesn't already exist in an array.
example mongodb document:
{
  regs : {
    someid : 12345,
    dataArray : [ { id : 1 }, { id : 43 }, { id : 11 } ]
  }
}
Now I only want to update if the id inside dataArray is not already in use, something like:
db.regs.update({ someid : 12345 }, { $push : { dataArray : { id : INT }}})
Using the above line, is it possible to check whether { id : INT } is already in my array and update only if it isn't?
There are a couple of ways. For example, you can use a query that matches only the document of interest:
db.regs.update(
    { someid : 12345, 'dataArray.id' : { $ne: INT } },
    { $push : { dataArray : { id : INT } } }
)
or perform the update using $addToSet:
db.regs.update(
    { someid : 12345 },
    { $addToSet : { dataArray : { id : INT } } }
)
As #zero323 has already pointed out, there is a specific update operator designed with this exact use case in mind. From the MongoDB documentation:
$addToSet
The $addToSet operator adds a value to an array only if the value is
not in the array already. If the value is in the array, $addToSet
returns without modifying the array.
Consider the following example:
db.collection.update( { field: value }, { $addToSet: { field: value1 } } );
Here, $addToSet appends value1 to the array stored in field,
only if value1 is not already a member of this array.
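One detail worth noting: $addToSet compares the candidate as a whole value, so two subdocuments with the same id but different extra fields count as distinct. A plain-JavaScript sketch of this semantics (using JSON string comparison as a simplification; real BSON equality is not field-order-sensitive):

```javascript
// Sketch: $addToSet semantics on an array of subdocuments.
function addToSet(arr, value) {
  // Simplified whole-document equality; a stand-in for BSON equality.
  const same = (a, b) => JSON.stringify(a) === JSON.stringify(b);
  if (!arr.some((el) => same(el, value))) {
    arr.push(value);
  }
  return arr;
}

const dataArray = [{ id: 1 }, { id: 43 }, { id: 11 }];
addToSet(dataArray, { id: 43 }); // already present: no change
addToSet(dataArray, { id: 7 });  // not present: appended
```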
I have a document of the sorts:
{ _id:ObjectID, list:Array }
And the list contains elements of the form (which I will refer to as listElement):
{ _id:ObjectID, time:Number }
I want to update the time subfield of 2 specific listElements each with its own distinct value. I have access to both _id of the listElements.
A related question about this issue: would it be better to transform list from an Array into an Object whose keys are the _id values? Then I would do db.update({ _id : "Document id" }, { $set : { "list.423rfasf2q3.time" : 200, "list.fjsdhfksjdh2432.time" : 100 } })?
I am unsure how one could use an ObjectID as a key, but I guess I can just convert it to a string and have both the _id value and the key containing that listElement be the same string.
You can't update two array elements with two different values in a single update; you'll need to run it as two separate updates. (Since MongoDB 3.6, the filtered positional operator with arrayFilters, shown earlier, makes this possible in one update.)
You do not want to put data into your key names; it's better to leave the data as values. There's no good way to match key names outside of $exists, and even that does not support wildcards. So you'd have to get the ObjectId, convert it to a string, concatenate it into the key, and then query for documents where that key exists. It's much easier to just put the value in a document in an array. Additionally, there's no way to index key names in MongoDB, so if you want to index your data you'll need to keep it in the array as a data element.
Edit to include positional match example
First, insert two documents
> db.test.insert( {list: [ {_id: new ObjectId(), time: 1234}, {_id: new ObjectId(), time: 4556} ] } )
>
Demonstrate that they are both there
> db.test.find()
{ "_id" : ObjectId("5215036749177daf439a2ffe"), "list" : [{ "_id" : ObjectId("5215036749177daf439a2ffc"), "time" : 1234 }, { "_id" :
ObjectId("5215036749177daf439a2ffd"), "time" : 4556 } ] }
>
Show that if we do a normal array filter we get all elements of the array
> db.test.find({"list._id":ObjectId("5215036749177daf439a2ffc")})
{ "_id" : ObjectId("5215036749177daf439a2ffe"), "list" : [ { "_id" : ObjectId("5215036749177daf439a2ffc"), "time" : 1234 }, { "_id" :
ObjectId("5215036749177daf439a2ffd"), "time" : 4556 } ] }
>
Demonstrate that we can use the $ operator to only return the first array element that was matched
> db.test.find({"list._id":ObjectId("5215036749177daf439a2ffc")}, {_id: 0, "list.$": 1})
{ "list" : [ { "_id" : ObjectId("5215036749177daf439a2ffc"), "time" : 1234 } ] }
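The projection behavior above can be sketched in plain JavaScript: given a predicate for the array match, keep only the first element that satisfies it (a simplification; the real $ projection uses the position recorded while matching the query):

```javascript
// Sketch: return only the first array element matching a condition,
// as the `list.$` positional projection does.
function projectFirstMatch(doc, pred) {
  const match = doc.list.find(pred);
  return { list: match === undefined ? [] : [match] };
}

const doc = {
  list: [
    { _id: 'a', time: 1234 },
    { _id: 'b', time: 4556 },
  ],
};
const projected = projectFirstMatch(doc, (el) => el._id === 'a');
// projected.list holds only the element with _id 'a'.
```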
Using the aggregation framework, what is the best way to get the document with the maximum value of a field per group? Using the collection below, I would like to return one document for each group_id, having the latest date. The second listing shows the desired result.
group_id date
1 11/1/12
1 11/2/12
1 11/3/12
2 11/1/12
3 11/2/12
3 11/3/12
DESIRED RESULT
group_id date
1 11/3/12
2 11/1/12
3 11/3/12
You can use the $max grouping accumulator in the Aggregation Framework to find the latest date for each group_id. You will then need additional queries to retrieve the full documents matching the grouped criteria.
var results = [];
db.groups.aggregate([
    // Find the latest date for each group_id
    { $group: {
        _id: '$group_id',
        date: { $max: '$date' }
    }},
    // Rename _id to group_id, so it can be used as find criteria
    { $project: {
        _id: 0,
        group_id: '$_id',
        date: '$date'
    }}
]).forEach(function(match) {
    // Find the matching full document per group and push it onto the results array
    results.push(db.groups.findOne(match));
});
Example results:
{
"_id" : ObjectId("5096cfb8c24a6fd1a8b68551"),
"group_id" : 1,
"date" : ISODate("2012-11-03T00:00:00Z"),
"foo" : "bar"
}
{
"_id" : ObjectId("5096cfccc24a6fd1a8b68552"),
"group_id" : 2,
"date" : ISODate("2012-11-01T00:00:00Z"),
"foo" : "baz"
}
{
"_id" : ObjectId("5096cfddc24a6fd1a8b68553"),
"group_id" : 3,
"date" : ISODate("2012-11-03T00:00:00Z"),
"foo" : "bat"
}
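The $group/$max stage above can be sketched in plain JavaScript: keep, per group_id, the greatest date seen so far (using hypothetical ISO date strings, which compare chronologically when compared lexicographically):

```javascript
// Sketch: latest date per group_id, as computed by $group with $max.
const docs = [
  { group_id: 1, date: '2012-11-01' },
  { group_id: 1, date: '2012-11-02' },
  { group_id: 1, date: '2012-11-03' },
  { group_id: 2, date: '2012-11-01' },
  { group_id: 3, date: '2012-11-02' },
  { group_id: 3, date: '2012-11-03' },
];

const latest = {};
for (const d of docs) {
  if (!(d.group_id in latest) || d.date > latest[d.group_id]) {
    latest[d.group_id] = d.date;
  }
}
// latest maps each group_id to its maximum date.
```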