MongoDB - aggregation of subdocument and update with the result

I have the following problem. I have summed up the values in a subdocument, which gives the following result:
[ { _id: 551fb140e4b04589d8997213, sumOfpeople: 342 } ]
I want to take sumOfpeople and write it back to the same House (the one matching req.params.house_id).
House.aggregate([
    { $match: {
        id: req.params.house_id
    }},
    { $unwind: '$people' }, // unwind creates a doc for every array element
    { $group: {
        _id: '$_id',
        sumOfpeople: { $sum: '$people.nr' }
    }}
], function (err, result) {
    if (err) {
        console.log(err);
        return;
    }
    console.log(result);
});
This is the model that I want to insert the result into after the aggregation.
module.exports = mongoose.model('House', {
    id: String,
    people: [{
        id: String,
        nr: Number
    }],
    sumOfpeople: Number // this is the field that I want to update after the aggregation
});
I have tried to use $set : {sumOfpeople: { $sum: '$people.nr'}}.
Is it possible to use $set inside an aggregation, or how can it be solved otherwise?

There's no way in MongoDB to write results directly into an existing document while doing an aggregation.
You've got 2 options:
retrieve the results in your application code, and then update the document in a second query (a sketch of this follows below).
use the $out operator, which writes the results of the aggregation into a new collection. This operation deletes all documents in the results collection and inserts the new ones. ( http://docs.mongodb.org/manual/reference/operator/aggregation/out/ )
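For the first option, here is a minimal sketch that reuses the aggregation and the House model from the question (callback-style Mongoose, matching the original code); the second query simply writes the computed sum back onto the matched document:
// Option 1 sketch: run the aggregation, then write the sum back in a second query.
House.aggregate([
    { $match: { id: req.params.house_id } },
    { $unwind: '$people' },
    { $group: { _id: '$_id', sumOfpeople: { $sum: '$people.nr' } } }
], function (err, result) {
    if (err || !result.length) {
        console.log(err);
        return;
    }
    // result[0]._id is the _id of the matched House document.
    House.update(
        { _id: result[0]._id },
        { $set: { sumOfpeople: result[0].sumOfpeople } },
        function (err) {
            if (err) console.log(err);
        }
    );
});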

Related

Mongoose updateMany based on result of previous query

I have two collections, energyOffers and energyOfferLogs. When a user deactivates their account, I'm looking for all the remaining active energyOffers where the entity of the user is in the assignees array, is not in the declinedEntities array, and the offerValidTill date is greater than the current timestamp.
const energyOffers = await EnergyOffer.find({
    'assignees.id': entityID,
    declinedEntities: { $ne: leadID },
    offerValidTill: { $gt: Date.now() }
}, null, { session });
Based on these energyOffers I need to update the corresponding energyOfferLogs. I can find these with { entityID: entityID, 'offer.offerID': offer._id }, but how can I look for all these offers in the same query?
If I loop through the energyOffers I will have to perform multiple updates, while my guess is that this can be done in one updateMany. I was looking into the $lookup aggregation operator (https://www.mongodb.com/docs/v6.0/reference/operator/aggregation/lookup/), but it seems that the EnergyOffer find query is too complex to express in it.
await EnergyOfferLog.updateMany({ ??? }, {
    $set: {
        'offer.action': 'declined',
        'offer.action_date': Math.floor(Date.now()),
        'offer.action_user': user.first_name,
        'offer.action_user_id': userID
    }
});
Get all offer ids from the first query, e.g.
let ids = energyOffers.map(o => o._id)
Use $in to match logs for all matching offers:
await EnergyOfferLog.updateMany({ entityID: entityID, 'offer.offerID': { $in: ids } }, {
    $set: {
        'offer.action': 'declined',
        'offer.action_date': Math.floor(Date.now()),
        'offer.action_user': user.first_name,
        'offer.action_user_id': userID
    }
});
If you want to do it with one query only, it is not complex. You can use $lookup with a pipeline for this:
Start with your $match query on the energyOffers collection
Use $lookup to get the matching energyOfferLogs
Clean the pipeline to contain only the energyOfferLogs docs
Perform the $set
Use $merge to save it back to energyOfferLogs collection
db.energyOffers.aggregate([
    {$match: {
        "assignees.id": entityID,
        declinedEntities: {$ne: leadID},
        offerValidTill: {$gt: Date.now()}
    }},
    {$lookup: {
        from: "energyOfferLogs",
        let: {offerId: "$_id"},
        pipeline: [
            {$match: {
                $and: [
                    {entityID: entityID},
                    {$expr: {$eq: ["$offer.offerID", "$$offerId"]}}
                ]
            }}
        ],
        as: "energyOfferLogs"
    }},
    {$unwind: "$energyOfferLogs"},
    {$replaceRoot: {newRoot: "$energyOfferLogs"}},
    {$set: {
        "offer.action": "declined",
        "offer.action_date": Math.floor(Date.now()),
        "offer.action_user": user.first_name,
        "offer.action_user_id": userID
    }},
    {$merge: {into: "energyOfferLogs"}}
])
See how it works on the playground example
Answer was updated according to a remark by #Alex_Blex

How to set one date field to another date field in the same object of the collection in MongoDB

I am trying to edit the fields of entries in a collection. I am checking if the lastUpdated date is less than the published date. If it is, then the entry is probably faulty and I need to make the lastUpdated date the same as the published date. I have created the following mongo query for it:
db.runCommand({ aggregate: "collectionNameHere", pipeline: [
    {
        $project: {
            isFaulty: { $lt: ["$lastUpdated", "$published"] }
        }
    },
    {
        $match: {
            isFaulty: true
        }
    },
    {
        $addFields: {
            lastUpdated: "$published"
        }
    }
]})
I am able to get the list of documents which have this fault, but I am not able to update the field. The last $addFields does not seem to be working, and there is no error either. Can someone help me with this, or provide a better query for my use case?
Thanks a lot.
You're making a mistake by trying to update inside an aggregation, which is not possible. You have to use an update command to achieve your goal.
I cannot test it right now, but this should do the job (referencing another field in the update requires the aggregation-pipeline form of update, available since MongoDB 4.2):
db.collection.update(
    {
        $expr: {
            $lt: ["$lastUpdated", "$published"]
        }
    },
    [ { $set: { lastUpdated: "$published" } } ]
)
It is not possible to update the document from one of its own fields in a plain update. You could append $out to an aggregation such as
db.collection.aggregate([
    { "$match": { "$expr": { "$lt": ["$lastUpdated", "$published"] } } },
    { "$addFields": { "lastUpdated": "$published" } }
])
but $out always creates a new collection as the "output", which is also not a solution here.
So in the end you have to iterate: first find the documents with a find query, and then update them. With async/await it is now quite easy to handle this kind of nested asynchronous work.
const data = await db.collection
    .find({ "$expr": { "$lt": ["$lastUpdated", "$published"] } })
    .project({ published: 1 })
    .toArray()
await Promise.all(data.map(async (d) => {
    await db.collection.updateOne({ _id: d._id }, { $set: { lastUpdated: d.published } })
}))
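As a variation on the same approach (not part of the original answer, just a sketch using the standard Node.js driver API), the individual updates can be batched into a single bulkWrite call so that only one write round trip is needed:
// Sketch: same find as above, but all updates sent in one bulkWrite round trip.
const faulty = await db.collection
    .find({ "$expr": { "$lt": ["$lastUpdated", "$published"] } })
    .project({ published: 1 })
    .toArray()
if (faulty.length) {
    await db.collection.bulkWrite(faulty.map(d => ({
        updateOne: {
            filter: { _id: d._id },
            update: { $set: { lastUpdated: d.published } }
        }
    })))
}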

How to combine $inc and $sort operators in Mongoose

Is there a way to combine the $inc and $sort operators in Mongoose so that I can both increment a value in a nested array and sort that nested array in one operation?
I know it's possible to combine $push and $sort to push a value to a nested array and sort that nested array in one operation as such:
User.update({ _id: user },
    { $push:
        { friends:
            { $each: [...],
              $sort: { challengeCount: -1 }
            }
        }
    },
    { upsert: true }, callback);
Is there a way to do something similar when incrementing a nested value in an array? For example,
User.where({ _id: userId, "segments.id": segmentId })
    .update({
        $inc: { 'segments.$.count': 1 },
        $sort: { 'segments.$.count': '-1' }
    }, callback);
$sort is not being used correctly in the latter example; I'm just trying to demonstrate my intention.
Thanks!

How do I store the aggregation result into another database

I can store an aggregation result into another collection within the same database.
But how can I store the result into another database?
This is for copying a collection into another database:
use test1;
db["user_data"].find().forEach(
    function(d) { db.getSiblingDB("test2")['user_data'].insert(d); }
);
Aggregation function:
pipeline = [
    {
        '$group': {
            '_id': { '$year': '$birthday' },
            'count': { '$sum': 1 }
        }
    },
    {
        '$sort': { '_id': 1 }
    },
    {
        '$out': output_collection
    }
];
cur = db[source_collection].runCommand('aggregate', {
    pipeline: pipeline,
    allowDiskUse: true
});
After running the aggregation into the output collection, you need to run another command that clones that collection to another database using the cloneCollection command, as follows:
db.runCommand({
    cloneCollection: "test.output_collection",
    from: "mongodb.example.net:27017",
    query: { active: true }
})
The above copies the output_collection collection from the test database on the server at mongodb.example.net. The operation only copies documents that satisfy the query { active: true }, but the query argument is optional. cloneCollection always copies indexes.
Starting in Mongo 4.2, the new $merge aggregation operator can be used to write the result of an aggregation pipeline to a specified collection in another database:
db.collection.aggregate([
    // { $group: { "_id": { $year: "$birthday" }, "count": { $sum: 1 } } },
    // { $sort: { "_id": 1 } },
    { $merge: { into: { db: "to", coll: "collection" } } }
])
Note that if the targeted collection already contains records, the $merge operator comes with many options to specify how to handle inserted records that conflict with existing records.
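As an illustration of those options, here is a hedged sketch using the same placeholder database and collection names as above; whenMatched and whenNotMatched control what happens when an incoming record does or does not match an existing one:
db.collection.aggregate([
    { $group: { "_id": { $year: "$birthday" }, "count": { $sum: 1 } } },
    { $sort: { "_id": 1 } },
    { $merge: {
        into: { db: "to", coll: "collection" },
        on: "_id",                // field used to match incoming and existing documents
        whenMatched: "replace",   // alternatives: "keepExisting", "merge", "fail", or a custom pipeline
        whenNotMatched: "insert"  // alternatives: "discard", "fail"
    } }
])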

$elemMatch and update

I would like to update a subdocument that was fetched using $elemMatch. I've found some posts online but so far I am not able to get it to work. This is what I have:
Schema:
var user = {
    _id: ObjectId,
    addresses: [{
        _id: ObjectId,
        street: String
    }]
};
Code:
this.findOne({
    'addresses._id': address_id
}, { 'occurrences': { $elemMatch: {
    '_id': address_id
}}})
.exec(function(err, doc) {
    if (doc) {
        // Update the sub doc
        doc.addresses[0].street = 'Blah';
        doc.update({ 'addresses': { $elemMatch: { '_id': address_id }}}, { $set: { "addresses.$.street": doc.addresses[0].street }})
            .exec(function(err, count) {
                ...
The above results in the address subdocument being wiped out and a blank new one created in its place. How can I save the doc/subdoc?
My goal is to be able to fetch a document (user) by subdocument (addresses) ID, modify that one matching address then save it.
You can do this all with a single update call on the model instead of fetching it first with findOne:
User.update(
    { 'addresses._id': address_id },
    { $set: { 'addresses.$.street': 'Blah' } },
    function(err, count) { ... });
This uses the positional $ operator in the $set to target just the addresses element that was matched in the query.