I'm trying to perform a nested group. I have an array of documents with two keys (invoiceIndex and proceduresIndex), and I need the documents to be arranged like so:
invoices (parent) -> procedures (children)
invoices: [ // Array of invoices
{
.....
"procedures": [{}, ...] // Array of procedures
}
]
Here is a sample document
{
"charges": 226.09000000000003,
"currentBalance": 226.09000000000003,
"insPortion": "",
"currentInsPortion": "",
"claim": "notSent",
"status": "unpaid",
"procedures": {
"providerId": "9vfpjSraHzQFNTtN7",
"procedure": "21111",
"description": "One surface",
"category": "basicRestoration",
"surface": [
"m"
],
"providerName": "B Dentist",
"proceduresIndex": "0"
},
"patientId": "mE5vKveFArqFHhKmE",
"patientName": "Silvia Waterman",
"invoiceIndex": "0",
"proceduresIndex": "0"
}
Here is what I have tried
https://mongoplayground.net/p/AEBGmA32n8P
Can you try the following:
db.collection.aggregate([
{
$group: {
_id: "$invoiceIndex",
procedures: {
$push: "$procedures"
},
invoice: {
$first: "$$ROOT"
}
}
},
{
$addFields: {
"invoice.procedures": "$procedures"
}
},
{
"$replaceRoot": {
"newRoot": "$invoice"
}
}
])
I retain the invoice fields with invoice: { $first: "$$ROOT" } and keep the $push of procedures as a separate field. Then, with $addFields, I move that procedures array into the new invoice object, and finally $replaceRoot promotes it to the root.
You shouldn't use proceduresIndex as part of the _id in $group, because then you wouldn't get one set of procedures per invoiceIndex. With the $group logic above it works pretty well, as you can see.
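For the sample document above, the reshaped result would look roughly like this (a sketch; field order may vary, and with several documents sharing the same invoiceIndex the procedures array simply collects one entry per document):
{
  "charges": 226.09000000000003,
  "currentBalance": 226.09000000000003,
  "insPortion": "",
  "currentInsPortion": "",
  "claim": "notSent",
  "status": "unpaid",
  "procedures": [
    {
      "providerId": "9vfpjSraHzQFNTtN7",
      "procedure": "21111",
      "description": "One surface",
      "category": "basicRestoration",
      "surface": ["m"],
      "providerName": "B Dentist",
      "proceduresIndex": "0"
    }
  ],
  "patientId": "mE5vKveFArqFHhKmE",
  "patientName": "Silvia Waterman",
  "invoiceIndex": "0",
  "proceduresIndex": "0"
}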
Link to mongoplayground
I'm trying to build a complex, nested aggregation pipeline in MongoDB (4.4.9 Community Edition, using the pymongo driver for Python 3.10).
There are relevant data points in different collections which I want to aggregate into one new view (ideally) or, if that doesn't work, into a new collection.
The collections, and the relevant fields therein follow a hierarchy. There is members, which contains the top-level key on which other data is to be merged,
membershipNumber.
> members.find_one()
{'_id': ObjectId('61153299af6122XXXXXXXXXXXXX'), 'membershipNumber': 'N03XXXXXX'}
Then, there's a different collection, which contains membershipNumber, but also a different, linked field, an_user_id. an_user_id is used in other collections to denote records/fields in arrays that pertain to that particular user.
I 'join' members and an_users like so:
result = members.aggregate([
{
'$lookup': {
'from': 'an_users',
'localField': 'membershipNumber',
'foreignField': 'memref',
'as': 'an_users'
}
},
{ '$unwind' : '$an_users' },
{
'$project' : {
'_id' : 1,
'membershipNumber' : 1,
'an_user_id' : '$an_users.user_id'
}
}
]);
So far so good, this returns the desired, aggregated record:
{'_id': ObjectId('61153253aBBBBBBBBBBBB'),
'membershipNumber': 'N0XXXXXXXX',
'an_user_id': '48XXXXXX'}
Now, I have a third collection, which contains the an_user_id as a string in arrays, denoting whether that user acted on a given email; a record is an email, and the an_user_ids in the click array are users that clicked a link in that email:
{'_id': ObjectId('blah'),
'email_id': '407XXX',
'actions_count': 17,
'administrative_title': 'test',
'bounce': ['3440XXXX'],
'click': ['38294CCC',
'418FFFF',
'48XXXXXX',
'38eGGGG']}
I want to count the number of occurrences of a given an_user_id (which I've obtained from the aggregation above) in arrays (e.g. clicks, bounces, opens) in the emails collection, and include it in the .aggregate call, to retrieve something like this:
{'_id': ObjectId('61153253aBBBBBBBBBBBB'),
'membershipNumber': 'N0XXXXXXXX',
'an_user_id': '48XXXXXX',
'n_email_clicks' : 412,
'n_email_bounces' : 12
}
Further, I might want to also attach counts of an_user_id in other collections in my DB.
Consider, e.g., this collection called events:
{
"_id": "617ffa96ee11844e143a63dd",
"id": "12345",
"administrative_title": "my_event",
"created_at": {
"$date": "2020-01-15T16:28:50.000Z"
},
"event_creator_id": "123456",
"event_title": "my_event",
"group_id": "123456",
"permalink": "event_id",
"rsvp_count": 54,
"rsvps": [{
"rsvp_id": "56789",
"display_name": "John Doe",
"rsvp_user_id": "48XXXXXX",
"rsvp_created_at": {
"$date": "2020-01-28T15:38:50.000Z"
},
"rsvp_updated_at": {
"$date": "2020-01-28T15:38:50.000Z"
},
"first_name": "John",
"last_name": "Doe",
}, {
"rsvp_id": "543895",
"display_name": "James Appleslice",
"rsvp_user_id": "N03XXXXXX",
"rsvp_created_at": {
"$date": "2020-02-05T13:15:14.000Z"
},
"rsvp_updated_at": {
"$date": "2020-02-05T13:15:14.000Z"
},
"first_name": "James",
"last_name": "Appleslice"}
]
}
So, the end-product would look something like this:
{'_id': ObjectId('61153253aBBBBBBBBBBBB'),
'membershipNumber': 'N0XXXXXXXX',
'an_user_id': '48XXXXXX',
'n_email_clicks' : 412,
'n_email_bounces' : 12,
'n_rsvps' : 12
}
My idea was to use the $lookup stage; however, I only know how to use it for matching on fields that exist in the parent collection I'm performing the aggregation on, not on fields that have been generated over the course of the aggregation.
Any help would be hugely appreciated!
You could use a $lookup with a pipeline. First you would $lookup the user id, followed by another $lookup to check whether the user id exists in the email arrays. Lastly, a few more stages collect the results and format them per your need. Furthermore, you can add a $out stage if you would like to write the results into another collection.
db.members.aggregate([{
$lookup: {
from: "an_users",
let: {
membershipNumber: "$membershipNumber"
},
pipeline: [
{
$match: {
$expr: {
$eq: [
"$memref",
"$$membershipNumber"
]
},
}
},
{
"$lookup": {
"from": "emails",
"localField": "user_id",
"foreignField": "click",
"as": "clicks"
}
},
{
"$project": {
"_id": 1,
"membershipNumber": 1,
"an_user_id": "$user_id",
"n_email_clicks": {
$size: "$clicks"
}
}
}
],
as: "details"
}
},
{
$replaceRoot: {
newRoot: {
$mergeObjects: [
{
$arrayElemAt: [
"$details",
0
]
},
"$$ROOT"
]
}
}
},
{
$project: {
details: 0
}
}])
Working example - https://mongoplayground.net/p/yrFsNp44hpi
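If you do want to write the results into another collection, as mentioned above, the pipeline could simply end with an extra stage (a sketch; "members_enriched" is a hypothetical collection name):
db.members.aggregate([
  /* ... all of the stages shown above ... */
  {
    $out: "members_enriched"
  }
])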
I have the following MongoDB schema, and I am trying to build an aggregate query that searches under the repo key inside github_open_issues and returns a match for all the values that have repoA as the value. I have tried the following as my query, however it is not returning any result. I'm a bit confused why this is not working, as I have another db with a schema similar to this one where this type of query works, but here something seems to be different. I have also put together this interactive example mongoplayground
query
db.collection.aggregate([
{
"$unwind": "$github_open_issues"
},
{
"$match": {
"github_open_issues.repo": {
"$in": [
"repoA"
]
}
}
},
])
schema
[
{
"github_open_issues": {
"0": {
"git_url": "https://github.com/",
"git_assignees": "None",
"git_open_date": "2019-09-26",
"git_id": 253113,
"repo": "repoA",
"git_user": "userA",
"state": "open"
},
"1": {
"git_url": "https://github.com/",
"git_assignees": "None",
"git_open_date": "2019-11-15",
"git_id": 294398,
"repo": "repoB",
"git_user": "userB",
"state": "open"
},
"2": {
"git_url": "https://github.com/",
"git_assignees": "None",
"git_open_date": "2021-04-12",
"git_id": 661208,
"repo": "repoA",
"state": "open"
}
},
"unique_label_seen": {
"568": {
"label_name": "some label",
"times_seen": 12,
"535": {
"label_name": "another label",
"times_seen": 1
}
}
}
}
]
$objectToArray converts the github_open_issues object to an array in key-value format
$filter iterates over the converted array and applies your search condition
$match filters out documents whose github_open_issues array is empty
$arrayToObject converts the github_open_issues array back to an object
db.collection.aggregate([
{
$addFields: {
github_open_issues: {
$filter: {
input: { $objectToArray: "$github_open_issues" },
cond: { $in: ["$$this.v.repo", ["repoA"]] }
}
}
}
},
{ $match: { github_open_issues: { $ne: [] } } },
{ $addFields: { github_open_issues: { $arrayToObject: "$github_open_issues" } } }
])
Playground
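For the sample document, this returns roughly the following (a sketch; only the entries whose repo is repoA survive the $filter, and the remaining fields such as unique_label_seen are left untouched):
{
  "github_open_issues": {
    "0": {
      "git_url": "https://github.com/",
      "git_assignees": "None",
      "git_open_date": "2019-09-26",
      "git_id": 253113,
      "repo": "repoA",
      "git_user": "userA",
      "state": "open"
    },
    "2": {
      "git_url": "https://github.com/",
      "git_assignees": "None",
      "git_open_date": "2021-04-12",
      "git_id": 661208,
      "repo": "repoA",
      "state": "open"
    }
  }
}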
Your query is correct, but the data in your schema is placed wrong: inside github_open_issues your objects are keyed by numbers, like {"0": {values... }, "1": {values... }}, instead of being stored in an array, so the query cannot get your desired value. You can check the playground now: playground
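For illustration, this is roughly how the document would have to be stored for the original $unwind/$match query to return the repoA entries (a hypothetical reshaping of the sample data, with github_open_issues as a real array):
{
  "github_open_issues": [
    {
      "git_url": "https://github.com/",
      "git_open_date": "2019-09-26",
      "git_id": 253113,
      "repo": "repoA",
      "state": "open"
    },
    {
      "git_url": "https://github.com/",
      "git_open_date": "2019-11-15",
      "git_id": 294398,
      "repo": "repoB",
      "state": "open"
    },
    {
      "git_url": "https://github.com/",
      "git_open_date": "2021-04-12",
      "git_id": 661208,
      "repo": "repoA",
      "state": "open"
    }
  ]
}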
Let's assume that this is how a sample document looks in MongoDB:
[
{
"_id": "1",
"attrib_1": "value_1",
"attrib_2": "value_2",
"months": {
"2": {
"month": "2",
"year": "2008",
"transactions": [
{
"field_1": "val_1",
"field_2": "val_2",
},
{
"field_1": "val_4",
"field_2": "val_5",
"field_3": "val_6"
},
]
},
"3": {
"month": "3",
"year": "2018",
"transactions": [
{
"field_1": "val_7",
"field_3": "val_9"
},
{
"field_1": "val_10",
"field_2": "val_11",
},
]
},
}
}
]
The desired output is something like this (I am just showing it for months 2 & 3):
| id | months | year | field_1 | field_2 | field_3 |
|----|--------|------|---------|---------|---------|
| 1 | 2 | 2008 | val_1 | val_2 | |
| 1 | 2 | 2008 | val_4 | val_5 | val_6 |
| 1 | 3 | 2018 | val_7 | | val_9 |
| 1 | 3 | 2018 | val_10 | val_11 | |
My attempt:
I tried something like this in PyMongo:
pipeline = [
{
# some filter logic here to filter data basically first
},
{
"$addFields": {
"latest": {
"$map": {
"input": {
"$objectToArray": "$months",
},
"as": "obj",
"in": {
"all_field_1" : {"$ifNull" : ["$$obj.v.transactions.field_1", [""]]},
"all_field_2": {"$ifNull" : ["$$obj.v.transactions.field_2", [""]]},
"all_field_3": {"$ifNull" : ["$$obj.v.transactions.field_3", [""]]},
"all_months" : {"$ifNull" : ["$$obj.v.month", ""]},
"all_years" : {"$ifNull" : ["$$obj.v.year", ""]},
}
}
}
}
},
{
"$project": {
"_id": 1,
"months": "$latest.all_months",
"year": "$latest.all_years",
"field_1": "$latest.all_field_1",
"field_2": "$latest.all_field_2",
"field_3": "$latest.all_field_3",
}
}
]
# and I executed it as
my_db.collection.aggregate(pipeline, allowDiskUse=True)
The above actually brings the data, but it brings it in lists. Is there a way to easily bring it back one row each in mongo itself?
The above brings the data in this way:
| id | months | year | field_1 | field_2 | field_3 |
|----|--------|------|---------|---------|---------|
| 1 | ["2", "3"] | ["2008", "2018"] | [["val_1", "val_4"], ["val_7", "val_10"]] | [["val_2", "val_5"], ["", "val_11"]] | [["", "val_6"], ["val_9", ""]] |
I would highly appreciate your valuable inputs on this, and suggestions for a better way to do it as well!
Thanks for your time.
My Mongo version is 3.4.6 and I am using PyMongo as my driver. You can see the query in action at mongo-db-playground
It might be a bad idea to do the whole process in an aggregation query; you could do this on your client side instead.
I have created a query, which is lengthy and may cause performance issues on huge data:
$objectToArray converts the months object to an array in key-value format
$unwind deconstructs the months array
$unwind deconstructs the transactions array and provides the index field index
$group by _id, year, month and index, and gets the first object from transactions into fields
$project lets you shape the response if you want; this stage is optional, I have added it in the playground link (see the sketch after the query below)
my_db.collection.aggregate([
    # some filter logic here to filter data basically first
    { "$project": { "months": { "$objectToArray": "$months" } } },
    { "$unwind": "$months" },
    {
        "$unwind": {
            "path": "$months.v.transactions",
            "includeArrayIndex": "index"
        }
    },
    {
        "$group": {
            "_id": {
                "_id": "$_id",
                "year": "$months.v.year",
                "month": "$months.v.month",
                "index": "$index"
            },
            "fields": { "$first": "$months.v.transactions" }
        }
    }
], allowDiskUse=True)
Playground
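The optional $project mentioned above could look roughly like this (a sketch; it flattens the grouped _id and the first transaction back into top-level fields, using $ifNull for the fields that are missing in a given transaction):
{
    "$project": {
        "_id": "$_id._id",
        "months": "$_id.month",
        "year": "$_id.year",
        "field_1": { "$ifNull": ["$fields.field_1", ""] },
        "field_2": { "$ifNull": ["$fields.field_2", ""] },
        "field_3": { "$ifNull": ["$fields.field_3", ""] }
    }
}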
When I do a find all in a MongoDB collection, I want to get maintenanceList sorted from the oldest maintenanceDate to the newest.
The sort of maintenanceDate should not affect the parents' order in a find all query.
{
"_id":"507f191e810c19729de860ea",
"color":"black",
"brand":"brandy",
"prixVenteUnitaire":200.5,
"maintenanceList":[
{
"cost":100.40,
"maintenanceDate":"2017-02-07T00:00:00.000+0000"
},
{
"cost":4000.40,
"maintenanceDate":"2019-08-07T00:00:00.000+0000"
},
{
"cost":300.80,
"maintenanceDate":"2018-08-07T00:00:00.000+0000"
}
]
}
Any guess how to do that ?
Thank you
Whatever order the fields are in from the previous pipeline stage is kept, as operations like $project and $group effectively "copy" them in the same position. So it will not change the order of the fields in your aggregated result.
And sorting maintenanceDate through the aggregation will not affect the parents' order in a find all query.
So, simply doing this should work.
Assuming my collection name is example.
db.example.aggregate([
{
"$unwind": "$maintenanceList"
},
{
"$sort": {
"_id": 1,
"maintenanceList.maintenanceDate": 1
}
},
{
"$group": {
"_id": "$_id",
"color": {
$first: "$color"
},
"brand": {
$first: "$brand"
},
"prixVenteUnitaire": {
$first: "$prixVenteUnitaire"
},
"maintenanceList": {
"$push": "$maintenanceList"
}
}
}
])
Output:
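For the sample document, the result would look roughly like this (a sketch; maintenanceList now comes back ordered 2017, 2018, 2019):
{
  "_id": "507f191e810c19729de860ea",
  "color": "black",
  "brand": "brandy",
  "prixVenteUnitaire": 200.5,
  "maintenanceList": [
    { "cost": 100.40, "maintenanceDate": "2017-02-07T00:00:00.000+0000" },
    { "cost": 300.80, "maintenanceDate": "2018-08-07T00:00:00.000+0000" },
    { "cost": 4000.40, "maintenanceDate": "2019-08-07T00:00:00.000+0000" }
  ]
}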
I'm looking to move an array of subdocuments into a collection of its own, keyed by the owner id. Currently, my collection is formed like this:
"_id": ObjectId("123"),
"username": "Bob Dole",
"logins": [{
"_id": ObjectId("abc123"),
"date": ISODate("2016")
}, {
"_id": ObjectId("def456"),
"date": ISODate("2016")
}]
I'm looking for the best way to write a script that would loop over each user and move each item in the logins array to its own "logins" collection, as follows:
{
"_id": ObjectId("abc123"),
"_ownerId": ObjectId("123"),
"date": ISODate("2016")
}
{
"_id": ObjectId("def567"),
"_ownerId": ObjectId("123"),
"date": ISODate("2016")
}
When the script ends, I'd like the login array to be removed entirely from all users.
This query will create the new collection using the aggregation framework.
To see how it looks, just remove the $out pipeline stage.
db.thinking.aggregate([
{
$unwind:"$logins"
},{
$project:{
_id:"$logins._id",
_ownerId:"$_id",
date:"$logins.date"
}
},
{
$out: "newCollection"
}
])
To delete the array records, as suggested in the comment:
db.thinking.update({},{ "$unset": { "logins": "" } },{ "multi": true })
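On MongoDB 3.2 or newer, the same unset can also be written with updateMany (a minor variation of the update above):
db.thinking.updateMany({}, { "$unset": { "logins": "" } })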