I'm trying to work out whether a certain condition is possible; I'd appreciate some answers on whether it can be done.
Let's say I have a collection named: Resturant
Resturant
id
name
food
and I have 4 rows:
1 , resturant1, burger
2 , resturant2, sandwich
3 , resturant2, burger
4 , resturant3, burger
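For reference, a minimal sketch of this sample data as documents (field names are taken from the listing above; MongoDB would also add its own _id):
db.Resturant.insertMany([
  { id: 1, name: "resturant1", food: "burger" },
  { id: 2, name: "resturant2", food: "sandwich" },
  { id: 3, name: "resturant2", food: "burger" },
  { id: 4, name: "resturant3", food: "burger" }
])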
So what I'm trying to achieve here is, from a single query, to fetch the resturant1 & resturant2 values, like:
{"$match": {"$expr": {"$in": ["$name", ["resturant1", "resturant2"]]}}}
but if the food already exists in resturant1, then don't fetch that value from resturant2. So if burger already exists in resturant1, do not fetch it from resturant2, and the result will be only 2 rows:
1 , resturant1, burger
2 , resturant2, sandwich
We can achieve this after fetching the result and overriding the already existing values, but I was wondering if we can use a condition in the MongoDB query itself.
One option is using $setWindowFields (available since MongoDB version 5.0):
$match the relevant documents by name
Use $setWindowFields to temporarily add a set of all food types up to this document
$match only documents whose food is not already in that foodSet.
Remove the added field
db.collection.aggregate([
{$match: {name: {$in: ["resturant1", "resturant2"]}}},
{$setWindowFields: {
sortBy: {name: 1},
output: {foodSet: {$addToSet: "$food", window: {documents: ["unbounded", -1]}}}
}},
{$match: {$expr: {$not: {$in: ["$food", "$foodSet"]}}}},
{$unset: "foodSet"}
])
See how it works on the playground example
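On the four sample rows above, this pipeline should return only rows 1 and 2, roughly (auto-generated _id omitted):
{ id: 1, name: "resturant1", food: "burger" }
{ id: 2, name: "resturant2", food: "sandwich" }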
I am working with 1,000 documents in a collection in MongoDB. Each document can be made of many topics, and a topic can be made of many keywords.
The mongo structure for each document is the following:
_id:ObjectId(6d5fc0922982bb550e08502d),
id_doc:"1234-678-436-42"
topic:Array
keywords:Array
The topic key is an array of objects of this type:
type:"topic"
label:"work"
Meanwhile, the keywords key is an array of objects, very similar to "topic":
type:"keyword"
value:"programmer"
label:"work"
In both cases, label represents the topic of the doc!
What I want is to list all the documents (id_doc) where a topic appears in the "topic" array, but never in the "keyword" array.
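For reference, a minimal sketch of one such document, assembled from the structure described above:
{
  _id: ObjectId("6d5fc0922982bb550e08502d"),
  id_doc: "1234-678-436-42",
  topic: [ { type: "topic", label: "work" } ],
  keywords: [ { type: "keyword", value: "programmer", label: "work" } ]
}
Given that requirement, this particular document should not be listed, since the label "work" appears in both the topic and keywords arrays.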
Query
Takes the intersection of topic.label with keywords.label.
If it is empty, there are no common members, so the document passes the filter.
*Not sure if this is what you want; if not, please provide 1-2 documents in JSON text and the expected output.
Playmongo
aggregate(
[{"$match":
{"$expr":
{"$eq":
[{"$setIntersection": ["$topic.label", "$keywords.label"]}, []]}}}])
One option is using $filter:
Count the number of items in topic whose label is not present as a value of any item in keywords. Save this count as condMatch.
$match only documents with condMatch greater than 0.
db.collection.aggregate([
{$set: {condMatch: {
$size: {
$filter: {
input: "$topic",
cond: {$not: {$in: ["$$this.label", "$keywords.value"]}}
}
}
}
}
},
{$match: {condMatch: {$gt: 0}}},
{$unset: "condMatch"}
])
See how it works on the playground example
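Since the question asks to list only the id_doc values, a $project stage could be appended to either pipeline above, along these lines (a sketch):
db.collection.aggregate([
  // ...the stages shown in either answer above...
  {$project: {_id: 0, id_doc: 1}}
])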
I have a collection in the format below:
{customerID:1,acctDetails:[{accType:"Saving",balance:100},{accType:"checking",balance:500}]}
{customerID:2,acctDetails:[{accType:"Saving",balance:500}]}
I want to find the total balance by account type. I tried the query below.
db.<collectionName>.aggregate([{$group:{_id:"$acctDetails.accType",totalBalance:{$sum:"$accDetails.balace"}}}])
But it is not giving the right result.
I think that this might solve your problem. You first need to use $unwind to turn each array element into its own document, then use $group to sum the total balance by account type.
db.collection.aggregate([
{"$unwind": "$acctDetails"},
{
"$group": {
"_id": "$acctDetails.accType",
"totalBalance": {"$sum": "$acctDetails.balance"}
}
}
])
Working Mongo playground
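With the two sample documents from the question, this should produce 100 + 500 = 600 for Saving and 500 for checking, i.e. results along these lines:
{ "_id": "Saving", "totalBalance": 600 }
{ "_id": "checking", "totalBalance": 500 }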
I'm trying to look up all records that match a certain condition, in this case _id being certain values, and then return only the top 2 results, sorted by the name field.
This is what I have
db.getCollection('col1').aggregate([
{$match: {fk: {$in: [1, 2]}}},
{$sort: {fk: 1, name: -1}},
{$group: {_id: "$fk", items: {$push: "$$ROOT"} }},
{$project: {items: {$slice: ["$items", 2]} }}
])
and it works, BUT it's not guaranteed. According to this Mongo thread, $group does not guarantee document order.
This would also mean that all of the suggested solutions here and elsewhere, which recommend using $unwind, followed by $sort, and then $group, would also not work, for the same reason.
What is the best way to accomplish this with Mongo (any version)? I've seen suggestions that this could be accomplished in the $project phase, but I'm not quite sure how.
You are correct in saying that the result of $group is never sorted.
$group does not order its output documents.
Hence doing a;
{$sort: {fk: 1}}
then grouping with
{$group: {_id: "$fk", ... }},
will be a wasted effort.
But there is a silver lining in sorting by name: -1 before the $group stage. Since you are using $push (not $addToSet), the pushed objects will retain, in the newly created items array of the $group result, the order in which they arrived. You can see this behaviour here (copy of your pipeline).
The items array will always have;
"items": [
{
..
"name": "Michael"
},
{
..
"name": "George"
}
]
in the same order, so your nested array sort is a non-issue! Though I am unable to find an exact quote in the documentation to confirm this behaviour, you can check;
this,
or this where it is confirmed.
Also, see the accumulator operator list for $group, where $addToSet has "Order of the array elements is undefined." in its description, whereas the similar operator $push does not, which might be indirect evidence? :)
Just a simple modification of your pipeline, moving the fk: 1 sort from the pre-$group stage to a post-$group stage;
db.getCollection('col1').aggregate([
{$match: {fk: {$in: [1, 2]}}},
{$sort: {name: -1}},
{$group: {_id: "$fk", items: {$push: "$$ROOT"} }},
{$sort: {_id: 1}},
{$project: {items: {$slice: ["$items", 2]} }}
])
should be sufficient to have the main result array order fixed as well. Check it on mongoplayground
$group doesn't guarantee the document order, but it does keep the grouped documents in sorted order within each bucket. So in your case, even though the documents after the $group stage are not sorted by fk, each group's items array would be sorted by name descending. If you would like to keep the documents sorted by fk, you could just add {$sort: {fk: 1}} after the $group stage.
You could also sort by the order of the values passed in your match query, should you need to, by adding an extra field to each document. Something like:
db.getCollection('col1').aggregate([
{$match: {fk: {$in: [1, 2]}}},
{$addFields: {ifk: {$indexOfArray: [[1, 2], "$fk"]}}}, // position of fk in the $in list
{$sort: {ifk: 1, name: -1}},
{$group: {_id: "$ifk", items: {$push: "$$ROOT"}}},
{$sort: {_id : 1}},
{$project: {items: {$slice: ["$items", 2]}}}
])
Update on sorting an array without the $group operator: I've found the JIRA ticket that is going to allow sorting an array.
You could try the $project stage below to sort the array. There may be various ways to do it. This should sort names descending. It works, but it is a slower solution.
{"$project":{"items":{"$reduce":{
"input":"$items",
"initialValue":[],
"in":{"$let":{
"vars":{"othis":"$$this","ovalue":"$$value"},
"in":{"$let":{
"vars":{
//return index 0 when comparing the first value with the initial (empty) value, or else return the index of the value in the accumulator array that is closest to and less than the current value.
"index":{"$cond":{
"if":{"$eq":["$$ovalue",[]]},
"then":0,
"else":{"$reduce":{
"input":"$$ovalue",
"initialValue":0,
"in":{"$cond":{
"if":{"$lt":["$$othis.name","$$this.name"]},
"then":{"$add":["$$value",1]},
"else":"$$value"}}}}
}}
},
//insert the current value at the found index
"in":{"$concatArrays":[
{"$slice":["$$ovalue","$$index"]},
["$$othis"],
{"$slice":["$$ovalue",{"$subtract":["$$index",{"$size":"$$ovalue"}]}]}]}
}}}}
}}}}
A simple example demonstrating how each iteration works:
db.b.insert({"items":[2,5,4,7,6,3]});
othis  ovalue        index  concat arrays (parts with counts)   return value
2      []            0      [],0  [2]  [],0                     [2]
5      [2]           0      [],0  [5]  [2],-1                   [5,2]
4      [5,2]         1      [5],1  [4]  [2],-1                  [5,4,2]
7      [5,4,2]       0      [],0  [7]  [5,4,2],-3               [7,5,4,2]
6      [7,5,4,2]     1      [7],1  [6]  [5,4,2],-3              [7,6,5,4,2]
3      [7,6,5,4,2]   4      [7,6,5,4],4  [3]  [2],-1            [7,6,5,4,3,2]
Reference - Sorting Array with JavaScript reduce function
There is a bit of a red herring in the question, as $group does guarantee that it will process incoming documents in order (and that's why you have to sort them before $group to get ordered arrays), but there is an issue with the way you propose doing it, since pushing all the documents into a single grouping is (a) inefficient and (b) could potentially exceed the maximum document size.
Since you only want the top two for each of the unique fk values, the most efficient way to accomplish it is via a "subquery" using $lookup, like this:
db.coll.aggregate([
{$match: {fk: {$in: [1, 2]}}},
{$group:{_id:"$fk"}},
{$sort: {_id: 1}},
{$lookup:{
from:"coll",
as:"items",
let:{fk:"$_id"},
pipeline:[
{$match:{$expr:{$eq:["$fk","$$fk"]}}},
{$sort:{name:-1}},
{$limit:2},
{$project:{_id:0, fk:1, name:1}}
]
}}
])
Assuming you have an index on {fk:1, name:-1}, as you must have to get an efficient sort in your proposed code, the first two stages here will use that index via a DISTINCT_SCAN plan, which is very efficient, and for each unique fk, $lookup will use that same index to filter by a single value of fk and return results already sorted and limited to the first two. This will be the most efficient way to do this, at least until https://jira.mongodb.org/browse/SERVER-9377 is implemented by the server.
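If that index is not already in place, it could be created with something like this (collection name taken from the pipeline above):
db.coll.createIndex({fk: 1, name: -1})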
I have a document with an array (which should be denormalised, but can't be because the reactive events will fire "add" too many times at client startup).
I need to be able to push a document to that array, and keep it in sorted (or roughly sorted) order. I've tried this query:
{ $push: {
'events': {
$each: [{'id': new Mongo.ObjectID, 'start':startDate,...}],
$sort: {'start': 1},
$slice: -1
}
}
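For context, a rough sketch of how this modifier sits inside a complete update call; docId is a placeholder for whatever selects the parent document, and Coll matches the collection name used in the aggregate query below:
// docId and startDate are assumed names, for illustration only
Coll.update(
  { _id: docId },
  {
    $push: {
      events: {
        $each: [{ id: new Mongo.ObjectID(), start: startDate }],
        $sort: { start: 1 },
        $slice: -1   // keeps only the last element, which is the part the question wants to avoid
      }
    }
  }
)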
But it requires the $slice operator to be present... I don't want to delete all my old data, I just want to be able to insert data into an array, and then have that array be sorted so that I can query the array later and say "slice greater than or equal to time X".
Is this possible?
Edit:
This Mongo aggregate query nearly works, except for one level of document nesting in the result array, but aggregation is not reactive (probably because the computations are expensive). Here is the aggregate query, if anyone can see how to translate it to a find, or why it can't be translated:
Coll.aggregate({$unwind: '$events'},
{$sort: {'events.start':1}},
{$match: {'events.start': {$gte: new Date()}}},
{$group: {_id: '$_id', 'events': {$push: '$events'} }})