Update fields of array subdocuments with one request - mongodb

I have the following document in a collection:
{
  _id: 'test',
  values: [
    { foo: 1, bar: [<very big array>] },
    { foo: 4, bar: [<very big array>] },
    { foo: 3, bar: [<very big array>] }
  ]
}
I want to update all values[].foo values at once with a pre-calculated array. For performance reasons, I don't want to read the values[].bar arrays, and since values can contain many elements, I'm looking for a way to do it with a single request (if possible).
For example, I want to write something like this:
db.collection.updateOne({ _id: 'test' }, { $set: { 'values[].foo': [2, 3, 4] }});
And the result would be the following:
{
  _id: 'test',
  values: [
    { foo: 2, bar: [<very big array>] },
    { foo: 3, bar: [<very big array>] },
    { foo: 4, bar: [<very big array>] }
  ]
}
But I don't know how I must write my update request.
I'm using MongoDB 4.0 and I don't have access to 4.2 features.

Starting from v4.2 you can benefit from updates with an aggregation pipeline.
It gives you the ability to calculate the new array using $zip and $map:
db.collection.updateOne(
  { _id: "test" },
  [{
    $set: {
      values: {
        $map: {
          input: {
            $zip: { inputs: ["$values.bar", [2, 3, 4]] }
          },
          as: "item",
          in: {
            foo: { $arrayElemAt: ["$$item", 1] },
            bar: { $arrayElemAt: ["$$item", 0] }
          }
        }
      }
    }
  }]
)
Make sure the size of values is the same as the size of the update array.
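To see what the pipeline computes, here is a plain-JavaScript sketch of the same $zip/$map pairing (a simulation only, not a driver call; the sample bar contents are made up):

```javascript
// $zip pairs each existing `bar` with the new `foo` value by position;
// $map then rebuilds each subdocument, leaving `bar` untouched.
const values = [
  { foo: 1, bar: ['big1'] },
  { foo: 4, bar: ['big2'] },
  { foo: 3, bar: ['big3'] },
];
const newFoos = [2, 3, 4];

// Positional pairing (as $zip does), then rebuild (as $map does).
const updated = values.map((v, i) => ({ foo: newFoos[i], bar: v.bar }));

console.log(updated); // foo values become 2, 3, 4; bar arrays untouched
```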

how to use $addToSet and $inc value of the set item in MongoDB

I have a collection of documents with an embedded array:
{
  name: 'doc1',
  events: []
},
{
  name: 'doc1',
  events: [{ eventName: 'e1', times: 10 }, { eventName: 'e2', times: 1 }]
}
How can I add an event to the embedded array and increment the 'times' value? I don't want to have a duplicate event name in the events array.
Update:
I changed my model to make queries simpler. Instead of
events: [{eventName: 'e1', times: 10}, {eventName: 'e2', times: 1}]
the model is:
events: { e1: 10, e2: 1}
Now it's much easier to update documents with a simple query. However still curious if anyone comes up with a simple and readable query without changing the data model.
db.collection.updateOne({ name: 'doc1' }, { $inc: { 'events.e1': 1 } });
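With the flattened model, $inc also creates the counter when it does not exist yet. A small plain-JS sketch of that behaviour (a simulation, not a driver call):

```javascript
// Mirrors $inc on the flattened model: a missing counter starts from 0.
const doc = { name: 'doc1', events: { e1: 10, e2: 1 } };

function incEvent(d, eventName) {
  d.events[eventName] = (d.events[eventName] ?? 0) + 1;
  return d;
}

incEvent(doc, 'e1'); // bumps an existing counter
incEvent(doc, 'e3'); // creates a new counter, like $inc does
console.log(doc.events); // { e1: 11, e2: 1, e3: 1 }
```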
Try this one:
var newEvent = "e2"
db.collection.aggregate([
  {
    $set: {
      events: {
        $cond: {
          if: { $in: [newEvent, "$events.eventName"] },
          then: {
            $map: {
              input: "$events",
              as: "event",
              in: {
                $cond: {
                  // increment only the matching event, leave the rest as-is
                  if: { $eq: ["$$event.eventName", newEvent] },
                  then: { $mergeObjects: ["$$event", { times: { $add: ["$$event.times", 1] } }] },
                  else: "$$event"
                }
              }
            }
          },
          // new events start at 1 (first occurrence)
          else: { $concatArrays: ["$events", [{ eventName: newEvent, times: 1 }]] }
        }
      }
    }
  }
])
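In plain JavaScript, the add-or-increment branch logic amounts to the following (a simulation of the pipeline, assuming a new event starts with a count of 1):

```javascript
const newEvent = 'e3';
const events = [
  { eventName: 'e1', times: 10 },
  { eventName: 'e2', times: 1 },
];

// $cond + $in: does the event already exist?
const updated = events.some((e) => e.eventName === newEvent)
  // $map branch: bump only the matching event's counter
  ? events.map((e) =>
      e.eventName === newEvent ? { ...e, times: e.times + 1 } : e)
  // $concatArrays branch: append the new event
  : [...events, { eventName: newEvent, times: 1 }];

console.log(updated);
// [ { eventName: 'e1', times: 10 }, { eventName: 'e2', times: 1 }, { eventName: 'e3', times: 1 } ]
```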

Combine $facet result with subdocument and then conditionally excluding document

Suppose after the $facet stage I have a result with two arrays, roomInfo and hotelInfo, which looks like this:
{
  "roomInfo": [
    { _id: 'ab1', booked: 3 },
    { _id: 'ab2', booked: 1 }
  ],
  "hotelInfo": [
    {
      name: 'Radison Blue',
      roomDetails: [
        { _id: 'ab1', roomCount: 5 },
        { _id: 'xy1', roomCount: 5 }
      ]
    },
    {
      name: 'Intercontinental',
      roomDetails: [
        { _id: 'ab2', roomCount: 5 }
      ]
    }
  ]
};
Expected Result
I want an output like this:
[
  {
    name: 'Radison Blue',
    roomDetails: [
      { _id: 'ab1', roomCount: 5, booked: 3 },
      { _id: 'xy1', roomCount: 5, booked: 0 }
    ]
  },
  {
    name: 'Intercontinental',
    roomDetails: [
      { _id: 'ab2', roomCount: 5, booked: 1 }
    ]
  }
];
Basically, adding the booked property from roomInfo into the hotelInfo's roomDetails field after matching their ids.
Additionally, after getting the above output, I want to exclude those hotels for which all rooms (not just a single room) have equal values for roomCount and booked. I want to do this inside the aggregation pipeline, as I will have to deal with $skip and $limit later on.
How to achieve these use cases?
Thanks!
Basically the approach is to iterate over the hotels and match each room accordingly; here is a quick working code sample:
db.collection.aggregate([
  { $unwind: "$hotelInfo" },
  {
    $project: {
      name: "$hotelInfo.name",
      roomDetails: {
        $filter: {
          input: {
            $map: {
              input: "$hotelInfo.roomDetails",
              as: "info",
              in: {
                $mergeObjects: [
                  "$$info",
                  {
                    $arrayElemAt: [
                      {
                        $filter: {
                          input: "$roomInfo",
                          as: "room",
                          cond: { $eq: ["$$room._id", "$$info._id"] }
                        }
                      },
                      0
                    ]
                  }
                ]
              }
            }
          },
          as: "processedInfo",
          cond: { $ne: ["$$processedInfo.roomCount", "$$processedInfo.booked"] }
        }
      }
    }
  }
])
With that said, you mention you'd like to paginate calls in the future. The current approach does not seem scalable. Because these are "real data points" (hotels), it's fine if your scale is somewhat capped (no more than several thousand hotels), but if it's not, I recommend asking another question with the entire pipeline you have so we can adjust it to work better.
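As a sanity check, this plain-JS sketch mirrors the $map/$filter/$mergeObjects logic: merge booked into each room by _id and drop fully booked rooms. Note the assumed default of booked: 0 when no roomInfo entry matches; $mergeObjects alone would leave the field absent.

```javascript
const roomInfo = [
  { _id: 'ab1', booked: 3 },
  { _id: 'ab2', booked: 1 },
];
const hotelInfo = [
  { name: 'Radison Blue', roomDetails: [{ _id: 'ab1', roomCount: 5 }, { _id: 'xy1', roomCount: 5 }] },
  { name: 'Intercontinental', roomDetails: [{ _id: 'ab2', roomCount: 5 }] },
];

const result = hotelInfo.map((h) => ({
  name: h.name,
  roomDetails: h.roomDetails
    // like $mergeObjects + $arrayElemAt: overlay the matching roomInfo entry,
    // falling back to booked: 0 when there is no match (assumed default)
    .map((r) => ({ booked: 0, ...r, ...roomInfo.find((x) => x._id === r._id) }))
    // like the outer $filter: keep only rooms that are not fully booked
    .filter((r) => r.roomCount !== r.booked),
}));

console.log(result);
```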

Mongo: Upsert to copy all static fields to new document atomically

I'm using the Bucket Pattern to limit documents' array size to maxBucketSize elements. Once a document's array is full (bucketSize = maxBucketSize), the next update creates a new document with a new array to hold more elements, using upsert.
How would you copy the static fields (see below recordType and recordDesc) from the last full bucket with a single call?
Sample document:
// records collection
{
  recordId: 12345, // non-unique index
  recordType: "someType",
  recordDesc: "Some record description.",
  elements: [ { a: 1, b: 2 }, { a: 3, b: 4 } ],
  bucketSize: 2
}
Bucket implementation (copies only queried fields):
const maxBucketSize = 2;
db.collection('records').updateOne(
  {
    recordId: 12345,
    bucketSize: { $lt: maxBucketSize } // false
  },
  {
    $push: { elements: { a: 5, b: 6 } },
    $inc: { bucketSize: 1 },
    $setOnInsert: {
      // Should be executed because the bucketSize condition is false,
      // but the fields `recordType` and `recordDesc` are inaccessible:
      // no documents matched and they were not included in the query
    }
  },
  { upsert: true }
)
Possible solution
To make this work, I can always make two calls: findOne() to get the static values, then updateOne() where I set the fields with $setOnInsert. But that's inefficient.
How can I modify this into one call with an aggregation? Examine the last document matching recordId (indexed), evaluate whether its array is full, and add a new document if so.
Attempt:
// Evaluate last document added
db.collection('records').findOneAndUpdate(
  { recordId: 12345 },
  {
    $push: { elements: {
      $cond: {
        if: { $lt: [ '$bucketSize', maxBucketSize ] },
        then: { a: 5, b: 6 }, else: null
      }
    }},
    $inc: { bucketSize: {
      $cond: {
        if: { $lt: [ '$bucketSize', maxBucketSize ] },
        then: 1, else: 0
      }
    }},
    $setOnInsert: {
      recordType: '$recordType',
      recordDesc: '$recordDesc'
    }
  },
  {
    sort: { $natural: -1 }, // last document
    upsert: true // Bucket overflow
  }
)
This comes back with:
MongoError: Cannot increment with non-numeric argument: { bucketSize: { $cond: { if: { $lt: [ "$bucketSize", 2 ] }, then: 1, else: 0 } }}
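The error occurs because classic update operators such as $inc expect literal values; the $cond document is passed through verbatim rather than evaluated (aggregation expressions are only evaluated in pipeline-form updates, MongoDB 4.2+). For reference, here is a plain-JS sketch of the intended bucket behaviour, with the static fields copied when a bucket overflows (a client-side simulation only, using the field names from the question):

```javascript
const maxBucketSize = 2;

// Append an element to the newest bucket, or open a new bucket that
// copies the static fields from the full one.
function addElement(buckets, el) {
  const last = buckets[buckets.length - 1];
  if (last.bucketSize < maxBucketSize) {
    last.elements.push(el);
    last.bucketSize += 1;
  } else {
    buckets.push({
      recordId: last.recordId,
      recordType: last.recordType, // static fields carried over
      recordDesc: last.recordDesc,
      elements: [el],
      bucketSize: 1,
    });
  }
  return buckets;
}

const buckets = [{
  recordId: 12345,
  recordType: 'someType',
  recordDesc: 'Some record description.',
  elements: [{ a: 1, b: 2 }, { a: 3, b: 4 }],
  bucketSize: 2,
}];
addElement(buckets, { a: 5, b: 6 });
console.log(buckets.length); // 2
```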

How to use '$let' in MongoDB Aggregation Query in Scala?

I am trying to write a MongoDB aggregation query in Scala.
How do I write Scala code to use "$let" in the '$project' stage?
I am wondering if Variable should be used; I'm not sure how.
'$project': {
  'myprojitem': {
    '$let': {
      'vars': { 'myVariable1': { '$or': [...] } },
      'in': {
        '$cond': [
          '$$myVariable1',
          { ... },
          { ... }
        ]
      }
    }
  }
}
I figured out the answer. Hopefully it helps someone.
val doc: Document = Document("""{
  '$let': {
    'vars': { 'myVariable1': { '$or': [...] } },
    'in': { '$cond': ['$$myVariable1', { ... }, { ... }] }
  }
}""")
var pipeline = mutable.Buffer[Bson]()
pipeline += Aggregates.project(Projections.fields(
  Projections.computed("myprojitem", doc)
))
Basically, every { name : expression } can be written as :
Document("name" -> expression)
Or
Document( "{name : expression}")
$let binds variables for use in a result expression. The syntax follows the rule:
{
  $let: {
    vars: { <var1>: <expression> },
    in: <expression>
  }
}
For more details, take a look at the $let (aggregation) definition in the MongoDB manual.
Here is a textbook example to make this clearer:
Consider the following data:
{ _id: 1, price: 10, tax: 0.50, applyDiscount: true }
{ _id: 2, price: 10, tax: 0.25, applyDiscount: false }
And imagine that we want to compute finalTotal = (price + tax) * (1 - Disc),
where Disc = 10% if applyDiscount is true and 0 otherwise.
Now we need to build an aggregation over the data that implements this equation, so we get results like:
{ _id: 1, finalTotal: 9.45 }
{ _id: 2, finalTotal: 10.25 }
We can do this by doing:
$project: {
  finalTotal: {
    $let: {
      vars: {
        total: { $add: [ '$price', '$tax' ] },
        discounted: { $cond: { if: '$applyDiscount', then: 0.9, else: 1 } }
      },
      in: { $multiply: [ "$$total", "$$discounted" ] }
    }
  }
}
We can break this down:
Step 1. Add price and tax together into a variable called total:
total: { $add: [ '$price', '$tax' ] },
Step 2. Turn the condition into a number (variable discounted):
discounted: { $cond: { if: '$applyDiscount', then: 0.9, else: 1 } }
Step 3. Multiply the constructed $$total and $$discounted with $multiply:
in: { $multiply: [ "$$total", "$$discounted" ] }
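Translating the example into plain JavaScript makes the variable binding easy to follow (a client-side simulation of the $project stage, not a driver call):

```javascript
const docs = [
  { _id: 1, price: 10, tax: 0.50, applyDiscount: true },
  { _id: 2, price: 10, tax: 0.25, applyDiscount: false },
];

const results = docs.map((d) => {
  const total = d.price + d.tax;                // vars: total
  const discounted = d.applyDiscount ? 0.9 : 1; // vars: discounted
  return { _id: d._id, finalTotal: total * discounted }; // in: $multiply
});

console.log(results);
```

The finalTotal values come out within floating-point rounding of 9.45 and 10.25, matching the expected results above.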

MongoDB: match non-empty doc in array

I have a collection structured thusly:
{
  _id: 1,
  score: [
    {
      foo: 'a',
      bar: 0,
      user: { user1: 0, user2: 7 }
    }
  ]
}
I need to find all documents that have at least one 'score' (element in score array) that has a certain value of 'bar' and a non-empty 'user' sub-document.
This is what I came up with (and it seemed like it should work):
db.col.find({score: {"$elemMatch": {bar:0, user: {"$not":{}} }}})
But, I get this error:
error: { "$err" : "$not cannot be empty", "code" : 13030 }
Any other way to do this?
Figured it out: { 'score.user': { "$gt": {} } } will match non-empty docs.
I'm not sure I quite understand your schema, but perhaps the most straightforward way would be to not have an "empty" value for score.user?
Instead, purposely omit that field from the document when it has no content.
Then your query could be something like ...
> db.test.find({ "score" : { "$elemMatch" : { bar : 0, "user" : {"$exists": true }}}})
i.e. looking for the value of score.bar that you want (0 in this case) and checking for the mere existence ($exists, see docs) of score.user (if it has a value, then it exists).
Edited: oops, I missed the $elemMatch you had ...
You probably want to add an auxiliary array that keeps track of the users in the user document:
{
  _id: 1,
  score: [
    {
      foo: 'a',
      bar: 0,
      users: ["user1", "user2"],
      user: { user1: 0, user2: 7 }
    }
  ]
}
Then you can add new users atomically:
> db.test.update({_id: 1, score: { $elemMatch: {bar: 0}}},
... {$set: {'score.$.user.user3': 10}, $addToSet: {'score.$.users': "user3"}})
Remove users:
> db.test.update({_id: 1, score: { $elemMatch: {bar: 0}}},
... {$unset: {'score.$.user.user3': 1}, $pull: {'score.$.users': "user3"}})
Query scores:
> db.test.find({_id: 1, score: {$elemMatch: {bar: 0, users: {$not: {$size: 0}}}}})
If you know you'll only be adding non-existent users and removing existent users from the user document, you can simplify users to a counter instead of an array, but the above is more resilient.
Look at the $size operator for checking array sizes.
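A quick plain-JS sketch of what the auxiliary array buys you (a simulation of the updates above, not driver calls):

```javascript
const score = {
  foo: 'a',
  bar: 0,
  users: ['user1', 'user2'],
  user: { user1: 0, user2: 7 },
};

// like $set + $addToSet: add a user and track it in the auxiliary array
score.user.user3 = 10;
if (!score.users.includes('user3')) score.users.push('user3');

// like $unset + $pull: remove the user from both places
delete score.user.user3;
score.users = score.users.filter((u) => u !== 'user3');

// the non-empty check, like { users: { $not: { $size: 0 } } }
console.log(score.users.length > 0); // true
```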
$group: {
  _id: '$_id',
  tasks: {
    $addToSet: {
      $cond: {
        if: { $eq: [{ $ifNull: ['$tasks.id', ''] }, ''] },
        then: '$$REMOVE',
        else: {
          id: '$tasks.id',
          description: '$tasks.description',
          assignee: {
            $cond: {
              if: { $eq: [{ $ifNull: ['$tasks.assignee._id', ''] }, ''] },
              then: '$$REMOVE',
              else: {
                id: '$tasks.assignee._id',
                name: '$tasks.assignee.name'
              }
            }
          }
        }
      }
    }
  }
}