I have some users and orders made by them:
db={
orders: [
{
"_id": "wJNEiSYwBd5ozGtLX",
"orderId": 52713,
"retailerId": 1320,
"createdAt": ISODate("2020-01-31T04:34:13.790Z"),
"status": "closed"
},
{
"_id": "wJNEiSYwBd5ozGtLX2",
"orderId": 52714,
"retailerId": 1320,
"createdAt": ISODate("2021-03-31T04:34:13.790Z"),
"status": "closed"
}
],
users: [
{
"_id": "2gSznevqwkGTxLRvL",
"createdAt": ISODate("2018-04-10T08:33:13.455Z"),
"username": "retailer#gmail.com",
"info": {
"retailerId": 1320,
},
"settings": {},
"status": "active",
}
]
}
If I try to aggregate orders into users:
db.users.aggregate([
{
"$lookup": {
"from": "orders",
"localField": "info.retailerId",
"foreignField": "retailerId",
"as": "orders"
}
},
])
I can get all orders merged into users like this:
[
{
"_id": "2gSznevqwkGTxLRvL",
"createdAt": ISODate("2018-04-10T08:33:13.455Z"),
"info": {
"retailerId": 1320
},
"orders": [
{
"_id": "wJNEiSYwBd5ozGtLX",
"createdAt": ISODate("2020-01-31T04:34:13.79Z"),
"orderId": 52713,
"retailerId": 1320,
"status": "closed"
},
{
"_id": "wJNEiSYwBd5ozGtLX2",
"createdAt": ISODate("2021-03-31T04:34:13.79Z"),
"orderId": 52714,
"retailerId": 1320,
"status": "closed"
}
],
"settings": {},
"status": "active",
"username": "retailer#gmail.com"
}
]
But I want to merge only the orders from last month into users, not all of them.
How can I specify the date range?
https://mongoplayground.net/p/mZlDHQf2thN
Demo - https://mongoplayground.net/p/-hjGipqQPJq
https://docs.mongodb.com/manual/reference/operator/aggregation/lookup/#join-conditions-and-uncorrelated-sub-queries
From the docs: "To perform uncorrelated subqueries between two collections, as well as allow other join conditions besides a single equality match, the $lookup stage has the following syntax."
Use $lookup in its pipeline form:
db.users.aggregate([
{
"$lookup": {
"from": "orders",
"let": { "rId": "$info.retailerId" },
"pipeline": [
{
$match: {
$expr: { $eq: [ "$retailerId","$$rId" ] },
createdAt: { $gte: ISODate("2021-03-01T00:00:00.000Z"), $lt: ISODate("2021-04-01T00:00:00.000Z") } // the date-range filter; this example uses March 2021
}
}
],
"as": "orders"
}
}
])
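If "last month" needs to be computed at query time instead of being hardcoded, a minimal sketch (mongosh or the Node.js driver; the variable names are just illustrative, and the boundaries are built in local time) could be:
const now = new Date();
const startOfThisMonth = new Date(now.getFullYear(), now.getMonth(), 1);
const startOfLastMonth = new Date(now.getFullYear(), now.getMonth() - 1, 1); // rolls over the year for January
db.users.aggregate([
  {
    "$lookup": {
      "from": "orders",
      "let": { "rId": "$info.retailerId" },
      "pipeline": [
        {
          "$match": {
            "$expr": { "$eq": [ "$retailerId", "$$rId" ] },
            // previous calendar month: [startOfLastMonth, startOfThisMonth)
            "createdAt": { "$gte": startOfLastMonth, "$lt": startOfThisMonth }
          }
        }
      ],
      "as": "orders"
    }
  }
])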
Greetings amigo, I have a question about joining multiple collections in MongoDB.
I have collection schemas something like the ones below.
Posts Collection
{
"type": "POST_TYPE",
"_id": "63241dffb0f6770c23663230",
"user_id": "63241dffb0f6770c23663230",
"post_id": "63241dffb0f6770c23663230",
"likes": 50
}
Post type 1: Event
{
"date": "2022-09-16T07:07:18.242+00:00",
"_id": "63241dffb0f6770c23663230",
"user_id": "63241dffb0f6770c23663230",
"venue": "Some Place",
"lat": "null",
"long": "null",
}
Post type 2: Poll
{
"created_date": "2022-09-16T07:07:18.242+00:00",
"_id": "63241dffb0f6770c23663230",
"user_id": "63241dffb0f6770c23663230",
"question": "Question??????",
"poll_opt1": "Yes",
"poll_opt2": "No",
"poll_opt1_count": "5",
"poll_opt2_count": "2"
}
Now I have to join the Posts collection with the respective collection, e.g. "post_id" to Event::_id or Poll::_id, depending on Post::type.
I have tried aggregation but it did not give the expected output.
I am trying to get output something like the following:
[
{
"type": "event",
"_id": "63241dffb0f6770c23663230",
"user_id": "63241dffb0f6770c23663230",
"post_id": {
"date": "2022-09-16T07:07:18.242+00:00",
"_id": "63241dffb0f6770c23663230",
"user_id": "63241dffb0f6770c23663230",
"venue": "Some Place",
"lat": "null",
"long": "null"
},
"likes": 50
},
{
"type": "poll",
"_id": "63241dffb0f6770c23663230",
"user_id": "63241dffb0f6770c23663230",
"post_id": {
"created_date": "2022-09-16T07:07:18.242+00:00",
"_id": "63241dffb0f6770c23663230",
"user_id": "63241dffb0f6770c23663230",
"question": "Question??????",
"poll_opt1": "Yes",
"poll_opt2": "No",
"poll_opt1_count": "5",
"poll_opt2_count": "2"
},
"likes": 50
}
]
Is there an efficient way to achieve this, or a better MongoDB schema to manage these types of records?
You can try something like this, using $facet:
db.posts.aggregate([
{
"$facet": {
"eventPosts": [
{
"$match": {
type: "event"
},
},
{
"$lookup": {
"from": "events",
"localField": "post_id",
"foreignField": "_id",
"as": "post_id"
}
}
],
"pollPosts": [
{
"$match": {
type: "poll"
},
},
{
"$lookup": {
"from": "poll",
"localField": "post_id",
"foreignField": "_id",
"as": "post_id"
}
}
]
}
},
{
"$addFields": {
"doc": {
"$concatArrays": [
"$pollPosts",
"$eventPosts"
]
}
}
},
{
"$unwind": "$doc"
},
{
"$replaceRoot": {
"newRoot": "$doc"
}
},
{
"$addFields": {
"post_id": {
"$cond": {
"if": {
"$eq": [
{
"$size": "$post_id"
},
0
]
},
"then": {},
"else": {
"$arrayElemAt": [
"$post_id",
0
]
}
}
}
}
}
])
We do the following in the query:
Perform two $lookups, one per post type, inside $facet. Unfortunately, the number of $facet branches grows with the number of distinct post_type values.
Then we combine the arrays obtained from $facet using $concatArrays.
Then we unwind the concatenated array and bring the nested document to the root using $replaceRoot.
Finally, for post_id we pick the first array element if it exists, to match the desired output.
Playground link.
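If you would rather avoid adding one $facet branch per post type, a possible alternative (just a sketch, reusing the events and poll collection names from above) is to run both $lookups unconditionally and then pick the matching one with $cond:
db.posts.aggregate([
  // join against both candidate collections; at most one will match per post
  { "$lookup": { "from": "events", "localField": "post_id", "foreignField": "_id", "as": "eventDoc" } },
  { "$lookup": { "from": "poll", "localField": "post_id", "foreignField": "_id", "as": "pollDoc" } },
  {
    "$addFields": {
      "post_id": {
        "$arrayElemAt": [
          { "$cond": [ { "$eq": [ "$type", "event" ] }, "$eventDoc", "$pollDoc" ] },
          0
        ]
      }
    }
  },
  // drop the helper arrays; an unmatched post_id ends up missing here rather than {} as in the $facet version
  { "$project": { "eventDoc": 0, "pollDoc": 0 } }
])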
I have the following collections which I am using a $lookup to bring together:
{
"organizations": [
{
"_id": 1,
"name": "foo",
"users": [1,2]
},
{
"_id": 2,
"name": "bar",
"users": [1]
}
],
"users": [
{
"_id": 1,
"name": "john1 smith"
},
{
"_id": 2,
"name": "bob johnson"
}
]
}
The query works fine:
[{
"$lookup": {
"from": "users",
"localField": "users",
"foreignField": "_id",
"as": "members"
}
},
{
"$unwind": "$members"
},
{
"$group": {
"_id": "$_id",
"original": { "$first": "$$ROOT" },
"members": {
"$push": "$members"
}
}
}
]
However, the resulting organization records don't come back with all of their properties unless I add the original prop, which gives me back the original organization as a nested document:
{
"_id": 1,
"original": {
"_id": 1,
"name": "foo",
"users": [
1,
2
],
"members": {
"_id": 1,
"name": "john1 smith"
}
},
"members": [
{
"_id": 1,
"name": "john1 smith"
},
{
"_id": 2,
"name": "bob johnson"
}
]
}
I'm trying to get everything in the original prop back into the root along with the new members array.
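One way to do that (a sketch, assuming the collection is named organizations) is to merge the original document back into the root with $replaceRoot and $mergeObjects, letting the new members array overwrite the single member that $$ROOT carried along:
db.organizations.aggregate([
  { "$lookup": { "from": "users", "localField": "users", "foreignField": "_id", "as": "members" } },
  { "$unwind": "$members" },
  { "$group": { "_id": "$_id", "original": { "$first": "$$ROOT" }, "members": { "$push": "$members" } } },
  // later objects win in $mergeObjects, so the pushed array replaces original.members
  { "$replaceRoot": { "newRoot": { "$mergeObjects": [ "$original", { "members": "$members" } ] } } }
])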
I want to get the names of people who work at bmw and use these names as a filter to find and return the documents containing the field "car". So the end result must be the last two documents in this example.
[
{"_id": "235", "name": "indu", "dob": "31/4/15", "company": "bmw"},
{"_id": "236", "name": "prith", "dob": "01/4/98", "company": "bmw"},
{"_id": "237", "name": "rames", "dob": "07/4/00", "company": "renault"},
{"_id": "238", "name": "indu", "salary": "10,000", "car": "yes", "married": "yes"},
{"_id": "239", "name": "prith", "salary": "80,000", "car": "yes", "children": "no"}
]
I appreciate your help, thanks in advance.
You want to use $exists:
names = db.collection.distinct("name", {"company": "bmw"})
db.collection.find({"car": {"$exists": true}, "name": {"$in": names}})
You can also do it in one aggregation call, although I would not recommend it, as it's less efficient.
db.collection.aggregate([
{
"$match": {
"company": "bmw"
}
},
{
$lookup: {
from: "this_collection",
localField: "name",
foreignField: "name",
as: "roots"
}
},
{
"$unwind": "$roots"
},
{
"$replaceRoot": {
"newRoot": "$roots"
}
},
{
"$match": {
"car": {"$exists": true}
}
}
])
Once I've unwound a sub-document array, how do I put it back together with all the original root fields?
Consider the following Tasks data set:
[
{
"_id": "5e95bb1cf36c0ab3247036bd",
"name": "Task A",
"org": "5e95b9894a0aa0b30dfcbc0b",
"creator": "5e117e5cd90de7187b000d87"
},
{
"_id": "5e95bb30f36c0ab3247036be",
"name": "Task B1",
"org": "5e95b9894a0aa0b30dfcbc0b",
"creator": "5e117e5cd90de7187b000d87",
"parent": "5e95bb1cf36c0ab3247036bd"
},
{
"_id": "5e95bb35f36c0ab3247036bf",
"name": "Task B2",
"org": "5e95b9894a0aa0b30dfcbc0b",
"creator": "5e117e5cd90de7187b000d87",
"parent": "5e95bb1cf36c0ab3247036bd"
}
]
So then I run $graphLookup to get the parent task and populate its children, then $unwind the result and populate the creator field:
[
{
"$match": {
"parent": {
"$exists": false
}
}
},
{
"$graphLookup": {
"from": "tasks",
"startWith": "$_id",
"connectFromField": "_id",
"connectToField": "parent",
"as": "children"
}
},
{
"$unwind": {
"path": "$children"
}
},
{
"$lookup": {
"from": "users",
"localField": "children.creator",
"foreignField": "_id",
"as": "children.creator"
}
},
{
"$unwind": {
"path": "$children.creator"
}
}
]
Which returns the following documents:
[
{
"_id": "5e95bb1cf36c0ab3247036bd",
"name": "Task A",
"org": "5e95b9894a0aa0b30dfcbc0b",
"creator": "5e117e5cd90de7187b000d87",
"children": [
{
"_id": "5e95bb30f36c0ab3247036be",
"name": "Task B1",
"org": "5e95b9894a0aa0b30dfcbc0b",
"creator": {
"name": "Jack Frost"
},
"parent": "5e95bb1cf36c0ab3247036bd"
}
]
},
{
"_id": "5e95bb1cf36c0ab3247036bd",
"name": "Task A",
"org": "5e95b9894a0aa0b30dfcbc0b",
"creator": "5e117e5cd90de7187b000d87",
"children": [
{
"_id": "5e95bb35f36c0ab3247036bf",
"name": "Task B2",
"org": "5e95b9894a0aa0b30dfcbc0b",
"creator": {
"name": "Bill Nye"
},
"parent": "5e95bb1cf36c0ab3247036bd"
}
]
}
]
Lastly, I need to merge all of these duplicate documents back together and join the $children. This is the part I can't figure out. Below is some junk I'm trying but it seems messy to have to specifically list every property.
Is there a better way to combine multiple (mostly) matching docs?
[
...
{
"$group": {
"_id": "$_id",
"name": {
"$mergeObjects": "$properties"
},
"watchers": {
"$addToSet": "$watchers"
},
"assignees": {
"$addToSet": "$assignees"
},
"org": {
"$addToSet": "$$ROOT.org"
},
"children": {
"$push": "$children"
}
}
}
]
Answering my own question here: the best solution I can find is to specify each property but pass it through the $first operator. This ensures the original value is carried through.
{
$group: {
_id: '$_id',
name: {$first: '$name'},
org: {$first: '$org'},
creator: {$first: '$creator'},
children: {$push: '$children'}
}
}
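If you would rather not list every property, a sketch of an alternative is to keep the whole first document via $$ROOT and then merge the regrouped children array back over it:
{
  $group: {
    _id: '$_id',
    doc: { $first: '$$ROOT' },        // the full first copy of each root document
    children: { $push: '$children' }  // re-collect the unwound children
  }
},
{
  $replaceRoot: {
    // later objects win, so the pushed array replaces the single child inside doc
    newRoot: { $mergeObjects: [ '$doc', { children: '$children' } ] }
  }
}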
I wanted to fetch data from 2 independent collections and sort the results by date in a single query. Is that even possible in MongoDB? I have these collections:
OrderType1
{
"id": "1",
"name": "Hello1",
"date": "2016-09-23T15:07:38.000Z"
},
{
"id": "2",
"name": "Hello1",
"date": "2015-09-23T15:07:38.000Z"
}
OrderType2
{
"id": "3",
"name": "Hello3",
"date": "2012-09-23T15:07:38.000Z"
},
{
"id": "4",
"name": "Hello4",
"date": "2018-09-23T15:07:38.000Z"
}
Expected Result
[
{
"id": "3",
"name": "Hello3",
"date": "2012-09-23T15:07:38.000Z"
},
{
"id": "2",
"name": "Hello1",
"date": "2015-09-23T15:07:38.000Z"
},
{
"id": "1",
"name": "Hello1",
"date": "2016-09-23T15:07:38.000Z"
},
{
"id": "4",
"name": "Hello4",
"date": "2018-09-23T15:07:38.000Z"
}
]
Now, I want to fetch both types of orders in a single query sorted by date.
You can try the aggregation below with MongoDB 3.6 and above, but I think you should use two queries, because for a large data set the $lookup pipeline can breach the 16 MB BSON document limit. It also depends on your $match condition or $limit: if they are applied inside the $lookup pipeline, the aggregation will work fine.
db.OrderType1.aggregate([
{ "$limit": 1 },
{ "$facet": {
"collection1": [
{ "$limit": 1 },
{ "$lookup": {
"from": "OrderType1",
"pipeline": [{ "$match": { } }],
"as": "collection1"
}}
],
"collection2": [
{ "$limit": 1 },
{ "$lookup": {
"from": "OrderType2",
"pipeline": [{ "$match": { } }],
"as": "collection2"
}}
]
}},
{ "$project": {
"data": {
"$concatArrays": [
{ "$arrayElemAt": ["$collection1.collection1", 0] },
{ "$arrayElemAt": ["$collection2.collection2", 0] },
]
}
}},
{ "$unwind": "$data" },
{ "$replaceRoot": { "newRoot": "$data" } }
])
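The expected result is sorted by date, so you would still append { "$sort": { "date": 1 } } as a final stage after the $replaceRoot. On MongoDB 4.4+ a simpler sketch uses $unionWith, which was added for exactly this kind of merge:
db.OrderType1.aggregate([
  { "$unionWith": "OrderType2" },  // append every OrderType2 document to the OrderType1 stream
  { "$sort": { "date": 1 } }       // ascending by date, matching the expected result
])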