Merge document to an array field using mongodb aggregation - mongodb

I have a collection like:
{
    "_id": ObjectId("5f8069b848e54248f8302b18"),
    "topics": [
        {
            "_id": "123",
            "name": "ABC",
            "type": 0
        },
        {
            "_id": "455",
            "name": "DFR",
            "type": 0
        }
    ],
    "topic": {
        "_id": "777",
        "name": "FFG",
        "type": 123
    }
}
How can I write an aggregation query to merge topic into topics?

Is topic always unique (i.e. not already included in topics)? If so, you can just add it to the array.
https://mongoplayground.net/p/vKv_qjm3VNp
db.collection.aggregate([
    {
        "$project": {
            topics: {
                "$concatArrays": [
                    "$topics",
                    ["$topic"]
                ]
            }
        }
    }
])
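If topic might already be present in topics, one option (a sketch, assuming an element only counts as a duplicate when it matches an existing element field-for-field) is to use $setUnion instead of $concatArrays; it drops exact duplicates but does not guarantee the order of the resulting array:
db.collection.aggregate([
    {
        "$project": {
            topics: {
                "$setUnion": ["$topics", ["$topic"]]
            }
        }
    }
])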

Related

Join multiple collections in MongoDB

Greetings amigo, I have one question related to joining multiple collections in MongoDB.
I have a collection schema something like below:
Posts Collection
{
    "type": "POST_TYPE",
    "_id": "63241dffb0f6770c23663230",
    "user_id": "63241dffb0f6770c23663230",
    "post_id": "63241dffb0f6770c23663230",
    "likes": 50
}
Post Types: 1. Event
{
    "date": "2022-09-16T07:07:18.242+00:00",
    "_id": "63241dffb0f6770c23663230",
    "user_id": "63241dffb0f6770c23663230",
    "venue": "Some Place",
    "lat": "null",
    "long": "null"
}
Post Types: 2. Poll
{
    "created_date": "2022-09-16T07:07:18.242+00:00",
    "_id": "63241dffb0f6770c23663230",
    "user_id": "63241dffb0f6770c23663230",
    "question": "Question??????",
    "poll_opt1": "Yes",
    "poll_opt2": "No",
    "poll_opt1_count": "5",
    "poll_opt2_count": "2"
}
Now I have to join the Posts collection with the respective collection, e.g.
"post_id" to Event::_id or Poll::_id depending on Post::type.
I have tried aggregation but it did not give the expected output.
I am trying to get output something like below:
[
    {
        "type": "event",
        "_id": "63241dffb0f6770c23663230",
        "user_id": "63241dffb0f6770c23663230",
        "post_id": {
            "date": "2022-09-16T07:07:18.242+00:00",
            "_id": "63241dffb0f6770c23663230",
            "user_id": "63241dffb0f6770c23663230",
            "venue": "Some Place",
            "lat": "null",
            "long": "null"
        },
        "likes": 50
    },
    {
        "type": "poll",
        "_id": "63241dffb0f6770c23663230",
        "user_id": "63241dffb0f6770c23663230",
        "post_id": {
            "created_date": "2022-09-16T07:07:18.242+00:00",
            "_id": "63241dffb0f6770c23663230",
            "user_id": "63241dffb0f6770c23663230",
            "question": "Question??????",
            "poll_opt1": "Yes",
            "poll_opt2": "No",
            "poll_opt1_count": "5",
            "poll_opt2_count": "2"
        },
        "likes": 50
    }
]
Is there any efficient way to achieve this, or a better MongoDB schema to manage these types of records?
You can try something like this, using $facet:
db.posts.aggregate([
    {
        "$facet": {
            "eventPosts": [
                {
                    "$match": { type: "event" }
                },
                {
                    "$lookup": {
                        "from": "events",
                        "localField": "post_id",
                        "foreignField": "_id",
                        "as": "post_id"
                    }
                }
            ],
            "pollPosts": [
                {
                    "$match": { type: "poll" }
                },
                {
                    "$lookup": {
                        "from": "poll",
                        "localField": "post_id",
                        "foreignField": "_id",
                        "as": "post_id"
                    }
                }
            ]
        }
    },
    {
        "$addFields": {
            "doc": {
                "$concatArrays": ["$pollPosts", "$eventPosts"]
            }
        }
    },
    {
        "$unwind": "$doc"
    },
    {
        "$replaceRoot": { "newRoot": "$doc" }
    },
    {
        "$addFields": {
            "post_id": {
                "$cond": {
                    "if": { "$eq": [{ "$size": "$post_id" }, 0] },
                    "then": {},
                    "else": { "$arrayElemAt": ["$post_id", 0] }
                }
            }
        }
    }
])
We do the following in the query:
Perform two $lookups, one for each post type, within $facet. Unfortunately, the number of $lookup stages grows with the number of distinct post_type values.
Then we combine all the arrays obtained from $facet, using $concatArrays.
Then we unwind the concatenated array, and bring the nested document to the root using $replaceRoot.
Finally, for post_id we pick the first array element if it exists, to match the desired output.
Playground link.

How to get a result using a stable sort in MongoDB

I am working on a query that sorts the result after grouping by a key in MongoDB.
Following is the example data in the DB:
[
    {
        "_id": ObjectId("5a934e000102030405000000"),
        "code": "code",
        "groupId": "L0LV7ENT",
        "version": { "id": "1.0.0.0" },
        "status": "Done",
        "type": "main"
    },
    {
        "_id": ObjectId("5a934e000102030405000001"),
        "code": "code",
        "groupId": "L0LV7ENT",
        "version": { "id": "2.0.0.0" },
        "status": "Done",
        "type": "main"
    },
    {
        "_id": ObjectId("5a934e000102030405000002"),
        "code": "code",
        "groupId": "F6WJ9QP7",
        "version": { "id": "1.1.0.0" },
        "status": "Done",
        "type": "main"
    }
]
Here, I would like to sort the result in ascending order according to the version.id and to group the result according to the groupId.
Hence, I used the following query
db.collection.aggregate([
    {
        "$match": {
            "$and": [
                {
                    "type": "main",
                    "code": { "$in": ["code"] },
                    "status": { "$in": ["Done", "Completed"] },
                    "groupId": { "$in": ["L0LV7ENT", "F6WJ9QP7"] }
                }
            ]
        }
    },
    {
        "$sort": {
            "_id": 1,
            "version.id": 1
        }
    },
    {
        "$group": {
            "_id": { "groupId": "$groupId" },
            "services": { "$push": "$$ROOT" }
        }
    }
])
But the result I am getting is not stable. Sometimes I see the data with "_id": ObjectId("5a934e000102030405000002") coming first, then ObjectId("5a934e000102030405000000") and ObjectId("5a934e000102030405000001").
It seems intermittent. Is there any way to get a stable result?
EDIT
You can try it here
From the documentation:
$group does not order its output documents.
So you will need to sort after the group stage to have a deterministic output order.
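For example (a sketch, assuming you want the grouped documents ordered by their groupId), append a $sort on the grouped _id as the final stage:
db.collection.aggregate([
    // ... the $match, $sort and $group stages from the question ...
    {
        "$sort": { "_id.groupId": 1 }
    }
])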

mongodb distinct query values

I have the following mongodb documents:
{
    "_id": "",
    "name": "example1",
    "colors": [
        {
            "id": 1000000,
            "properties": [
                {
                    "id": "1000",
                    "name": "",
                    "value": "green"
                },
                {
                    "id": "2000",
                    "name": "",
                    "value": "circle"
                }
            ]
        }
    ]
}
{
    "_id": "",
    "name": "example2",
    "colors": [
        {
            "id": 1000000,
            "properties": [
                {
                    "id": "1000",
                    "name": "",
                    "value": "red"
                },
                {
                    "id": "4000",
                    "name": "",
                    "value": "box"
                }
            ]
        }
    ]
}
I would like to get the distinct values of the value field in the array where id=1000:
db.getCollection('product').distinct('colors.properties.value', {'colors.properties.id':{'$eq': 1000}})
but it returns all values in the array.
The expected Result would be:
["green", "red"]
There are a lot of ways to do this.
$match eliminates unwanted data
$unwind de-structures the array
$addToSet in $group gives the distinct data
The mongo script:
db.collection.aggregate([
    {
        $match: { "colors.properties.id": "1000" }
    },
    {
        "$unwind": "$colors"
    },
    {
        "$unwind": "$colors.properties"
    },
    {
        $match: { "colors.properties.id": "1000" }
    },
    {
        $group: {
            _id: null,
            distinctData: { $addToSet: "$colors.properties.value" }
        }
    }
])
Working Mongo playground

Combining unique elements of arrays without $unwind

I would like to get the unique elements of all arrays in a collection. Consider the following collection
[
    {
        "collection": "collection",
        "myArray": [
            {
                "name": "ABC",
                "code": "AB"
            },
            {
                "name": "DEF",
                "code": "DE"
            }
        ]
    },
    {
        "collection": "collection",
        "myArray": [
            {
                "name": "GHI",
                "code": "GH"
            },
            {
                "name": "DEF",
                "code": "DE"
            }
        ]
    }
]
I can achieve this by using $unwind and $group like this:
db.collection.aggregate([
    {
        $unwind: "$myArray"
    },
    {
        $group: {
            _id: null,
            data: { $addToSet: "$myArray" }
        }
    }
])
And get the output:
[
    {
        "_id": null,
        "data": [
            {
                "code": "GH",
                "name": "GHI"
            },
            {
                "code": "DE",
                "name": "DEF"
            },
            {
                "code": "AB",
                "name": "ABC"
            }
        ]
    }
]
However, the array "myArray" will have a lot of elements (about 6) and the number of documents passed into this stage of the pipeline will be about 600. So unwinding the array would give me a total of 3600 documents being processed. I would like to know if there's a way for me to achieve the same result without unwinding
You can use the below aggregation:
db.collection.aggregate([
    { "$group": {
        "_id": null,
        "data": { "$push": "$myArray" }
    }},
    { "$project": {
        "data": {
            "$reduce": {
                "input": "$data",
                "initialValue": [],
                "in": { "$setUnion": ["$$this", "$$value"] }
            }
        }
    }}
])
Output
[
    {
        "_id": null,
        "data": [
            {
                "code": "AB",
                "name": "ABC"
            },
            {
                "code": "DE",
                "name": "DEF"
            },
            {
                "code": "GH",
                "name": "GHI"
            }
        ]
    }
]

Fetch data from 2 collections in mongodb in single query

I wanted to fetch data from 2 independent collections and sort the results based on date through a single query. Is that even possible in MongoDB? I have the collections:
OrderType1
{
    "id": "1",
    "name": "Hello1",
    "date": "2016-09-23T15:07:38.000Z"
},
{
    "id": "2",
    "name": "Hello1",
    "date": "2015-09-23T15:07:38.000Z"
}
OrderType2
{
    "id": "3",
    "name": "Hello3",
    "date": "2012-09-23T15:07:38.000Z"
},
{
    "id": "4",
    "name": "Hello4",
    "date": "2018-09-23T15:07:38.000Z"
}
Expected Result
[
    {
        "id": "3",
        "name": "Hello3",
        "date": "2012-09-23T15:07:38.000Z"
    },
    {
        "id": "2",
        "name": "Hello1",
        "date": "2015-09-23T15:07:38.000Z"
    },
    {
        "id": "1",
        "name": "Hello1",
        "date": "2016-09-23T15:07:38.000Z"
    },
    {
        "id": "4",
        "name": "Hello4",
        "date": "2018-09-23T15:07:38.000Z"
    }
]
Now, I want to fetch both types of orders in a single query sorted by date.
You can try the below aggregation with MongoDB 3.6 and above, but I think you should use two queries, because for large data sets the $lookup pipeline will breach the 16 MB BSON document limit. It also depends upon your $match condition or $limit: if they are applied inside the $lookup pipeline, your aggregation should work fine.
db.OrderType1.aggregate([
    { "$limit": 1 },
    { "$facet": {
        "collection1": [
            { "$limit": 1 },
            { "$lookup": {
                "from": "OrderType1",
                "pipeline": [{ "$match": {} }],
                "as": "collection1"
            }}
        ],
        "collection2": [
            { "$limit": 1 },
            { "$lookup": {
                "from": "OrderType2",
                "pipeline": [{ "$match": {} }],
                "as": "collection2"
            }}
        ]
    }},
    { "$project": {
        "data": {
            "$concatArrays": [
                { "$arrayElemAt": ["$collection1.collection1", 0] },
                { "$arrayElemAt": ["$collection2.collection2", 0] }
            ]
        }
    }},
    { "$unwind": "$data" },
    { "$replaceRoot": { "newRoot": "$data" } }
])
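Note that the pipeline above does not guarantee any order for the combined documents. To get the date-sorted output shown in the question, you could append a $sort stage after $replaceRoot (a sketch, assuming date is stored in a format that sorts chronologically, such as an ISO string or a Date):
    { "$sort": { "date": 1 } }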