mongo aggregate group and find one - mongodb

I have a collection in MongoDB that stores documents like this:
{"tag":"count1","value":100,"ts":1544423706} {"tag":"count2","value":1002,"ts":1544423706} {"tag":"count1","value":101,"ts":1544423806} {"tag":"count2","value":1003,"ts":1544423806} {"tag":"count1","value":102,"ts":1544423906} {"tag":"count2","value":1004,"ts":1544423906}
My problem is: for each tag (count1 and count2), how can I get the first document whose "ts" is larger than 1544423800? The result I want is:
{"tag": "count1", "value": 101, "ts": 1544423806}
{"tag": "count2", "value": 1003, "ts": 1544423806}
Do I need to use aggregate to group by tag and then take the first item whose "ts" is larger than the given value? I am new to the aggregation framework in MongoDB. This is my attempt:
db.index.aggregate([{"$match": {"tag": {"$in":["count1","count2"]},"ts": {"$gt":1544423800}}
},
{"$group": {"_id": "$tag",
"tags": {"$push": "$$ROOT"}}
},
])
But the result is not one item per tag, so I want to limit it to one item for each tag. What do I have to do?
Thank you, I have solved this with:
db.index.aggregate([
  { "$match": { "tag": { "$in": ["count1", "count2"] }, "ts": { "$gt": 1545730000 } } },
  { "$group": { "_id": "$tag", "value": { "$first": "$value" } } }
])
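A note on that solution: $first picks whichever document happens to reach $group first, so without a preceding $sort the result depends on the order in which documents are read. A minimal sketch (assuming MongoDB 3.4+ for $replaceRoot) that sorts by ts and keeps the whole first document per tag, so both value and ts come back as in the desired output:
db.index.aggregate([
  { "$match": { "tag": { "$in": ["count1", "count2"] }, "ts": { "$gt": 1544423800 } } },
  // earliest matching document first, so $first is deterministic
  { "$sort": { "ts": 1 } },
  // keep exactly one document per tag
  { "$group": { "_id": "$tag", "doc": { "$first": "$$ROOT" } } },
  // restore the original document shape
  { "$replaceRoot": { "newRoot": "$doc" } }
])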

Related

How to merge multiple documents in MongoDB and convert fields' values into fields

I have a MongoDB collection that I have managed to process using an aggregation pipeline to produce the following result:
[
  {
    _id: 'Complex Numbers',
    count: 2
  },
  {
    _id: 'Calculus',
    count: 1
  }
]
But the result that I am aiming for is something like the following:
{
  'Complex Numbers': 2,
  'Calculus': 1
}
Is there a way to achieve that?
Query
To convert to an object {} we need something like [[k1, v1], ...] or [{"k": ..., "v": ...}].
The first stage converts each document to [{"k": ..., "v": ...}], then $arrayToObject turns that array into an object and $replaceRoot makes it the new root, so each document looks like {"Complex Numbers": 2}.
The $group is then used to combine all those documents into one document, and the final $replaceRoot replaces the root with that single merged document.
aggregate([
  { "$replaceRoot": { "newRoot": { "$arrayToObject": [[{ "k": "$_id", "v": "$count" }]] } } },
  { "$group": { "_id": null, "data": { "$mergeObjects": "$$ROOT" } } },
  { "$replaceRoot": { "newRoot": "$data" } }
])

Nested grouping in Mongodb

I am working on grouping the MongoDB documents and have updated the sample in the link below:
https://mongoplayground.net/p/c2s8KmavPBp
I am expecting the output to look like
[
  {
    "group": "one",
    "details": [
      {
        "avgtime": 9.359833333333333,
        "title": "Scence 1"
      },
      {
        "avgtime": 9.359833333333333,
        "title": "Scence 2"
      },
      {
        "avgtime": 9.359833333333333,
        "title": "Scence 3"
      }
    ]
  },
  {
    "group": "two",
    "details": [
      {
        "avgtime": 9.359833333333333,
        "title": "Scence 1"
      }
    ]
  }
]
How do I rename the field _id to group, merge the elements that share the same title (e.g. "Scence 1"), and display their average time?
Note: Question and sample link updated
Query
Group by both fields to do the calculation.
Then group by the one field (group) and use the calculated avgtime.
In general: group by the more specific key (here the two fields) first, then by the more general one, using the values computed from the specific grouping.
aggregate([
  { "$group": {
      "_id": { "group": "$group", "title": "$details.title" },
      "avgtime": { "$avg": { "$divide": [{ "$subtract": ["$endTime", "$startTime"] }, 60000] } }
  }},
  { "$group": {
      "_id": "$_id.group",
      "group": { "$first": "$_id.group" },
      "details": { "$push": { "title": "$_id.title", "avgtime": "$avgtime" } }
  }},
  { "$unset": ["_id"] }
])

MongoDB Aggregation Lookup with Pipeline Doesn't Work

I have two collections. I am trying to add the documents of Collection 2 to Collection 1, if number 1 and number 2 in Collection 2 are within a certain range as specified in Collection 1. FYI, the ObjectId in Collection 1 and the ObjectId in Collection 2 refer to two different items/products, hence I cannot join the two collections on this id.
Example Document from Collection 1:
{'_id': ObjectId('4321'),
'number1_lb': 61.205672407820025,
'number1_ub': 61.24170844385606,
'number2_lb': -149.75074963516136,
'number2_ub': -149.71471359912533}
Example Document from Collection 2:
{'_id': ObjectId('1234'),
'number1': 1.282298,
'number2': 103.8475}
I want the output:
{'_id': ObjectId('4321'),
'number1_lb': 61.205672407820025,
'number1_ub': 61.24170844385606,
'number2_lb': -149.75074963516136,
'number2_ub': -149.71471359912533,
'recs': [ObjectId('3456'), ObjectId('4567'), ...]}
I thought that a lookup stage with pipeline would work. My code is currently as follows:
{"$lookup":{
"from": "Collection 2",
"let":{
"number1_lb":"$number1_lb",
"number1_ub":"$number1_ub",
"number2_lb":"$number2_lb",
"number2_ub":"$number2_ub"
},
"pipeline": [
{"$match":
{"$expr":
{"$and":[
{"$gte":["$number1","$$number1_lb"]},
{"$gte":["$number2","$$number2_lb"]},
{"$lte":["$number1","$$number1_ub"]},
{"$lte":["$number2","$$number2_ub"]}
]}}}
],
"as": "recs"
}}
But running the above gives me no output. Am I doing something wrong??
I ran it and it seems to work fine, but I had to tweak your input data in coll1 as it didn't meet the $match criteria.
from pymongo import MongoClient
from bson.json_util import dumps
db = MongoClient()["testdatabase"]
# Data Setup
db.coll1.replace_one({"_id": "4321"}, {"_id": "4321", "number1_lb": -61.205672407820025, "number1_ub": 61.24170844385606, "number2_lb": -149.75074963516136, "number2_ub": 149.71471359912533}, upsert=True)
db.coll2.replace_one({"_id": "1234"}, {"_id": "1234", "number1": 1.282298, "number2": 103.8475}, upsert=True)
# Run the aggregation
results = db.coll1.aggregate([
{"$lookup": {
"from": "coll2",
"let": {
"number1_lb": "$number1_lb",
"number1_ub": "$number1_ub",
"number2_lb": "$number2_lb",
"number2_ub": "$number2_ub"
},
"pipeline": [
{"$match":
{"$expr":
{"$and": [
{"$gte": ["$number1", "$$number1_lb"]},
{"$gte": ["$number2", "$$number2_lb"]},
{"$lte": ["$number1", "$$number1_ub"]},
{"$lte": ["$number2", "$$number2_ub"]}
]}}}
],
"as": "recs"
}}
])
# pretty up the results
print(dumps(list(results), indent=4))  # materialize the cursor so json_util can serialize it
gives:
[
{
"_id": "4321",
"number1_lb": -61.205672407820025,
"number1_ub": 61.24170844385606,
"number2_lb": -149.75074963516136,
"number2_ub": 149.71471359912533,
"recs": [
{
"_id": "1234",
"number1": 1.282298,
"number2": 103.8475
}
]
}
]
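The desired output has recs as a list of ObjectIds rather than full documents; one way to get that (a sketch, not part of the original answer) is to map the joined documents down to their _id after the $lookup, e.g. with an $addFields stage (MongoDB 3.4+):
// appended after the $lookup stage; keeps only each joined document's _id in recs
{ "$addFields": { "recs": { "$map": { "input": "$recs", "as": "r", "in": "$$r._id" } } } }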
You are looking to use a $lookup and a $project:
{
  $lookup: {
    from: "Collection2",
    localField: "<field of Collection 1>",
    foreignField: "<matching field of the foreign collection, here Collection 2>",
    as: "nameJoint"
  }
},
{
  $project: {
    "newFieldName": "$nameJoint"
  }
},
But to make a join between two documents there has to be a common field between those two documents. I am not sure there is one in this situation, or maybe I misunderstand it.
(A $lookup is basically a SQL join in NoSQL.)

Mongodb aggregation taking more than 15 seconds

I have more than 100k records in my collection, and a new record is added every 5 seconds. I have an aggregate query to get roughly 720 records from the last year of data.
The aggregate query:
db.collectionName.aggregate([
{"$match": {
"Id": "****-id-****",
"receivedDate": {
"$gte": ISODate("2016-06-26T18:30:00.463Z"),
"$lt": ISODate("2017-06-26T18:30:00.463Z")
}
}
},
{"$group": {
"_id": {
"$add": [
{"$subtract": [
{"$subtract": ["$receivedDate", ISODate("1970-01-01T00:00:00.000Z")]},
{"$mod": [
{"$subtract": ["$receivedDate", ISODate("1970-01-01T00:00:00.000Z")]},
43200000
]}
]},
ISODate("1970-01-01T00:00:00.000Z")
]
},
"_rid": {"$first": "$_id"},
"_data": {"$first": "$receivedData.data"},
"count": {"$sum": 1}
}
},
{"$sort": {"_id": -1}},
{"$project": {
"_id": "$_rid",
"receivedDate": "$_id",
"receivedData": {"data": "$_data"}
}
}
])
I am not sure why it's taking more than 15 seconds; when I try to get data for 1 month it works fine.
It's too late to answer this question, but this may be helpful for others.
A compound index can help in this situation; compound indexes can support queries that match on multiple fields.
You can create a compound index on the Id and receivedDate fields:
db.collectionName.createIndex({ Id: -1, receivedDate: -1 });
The order of the fields listed in a compound index is important. The index will contain references to documents sorted first by the values of the Id field and, within each value of the Id field, sorted by values of the receivedDate field.
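To confirm the index is actually being used, you can inspect the aggregation's plan; a quick sketch with just the $match stage (same field names as the question's pipeline):
db.collectionName.explain("executionStats").aggregate([
  { "$match": {
      "Id": "****-id-****",
      "receivedDate": {
        "$gte": ISODate("2016-06-26T18:30:00.463Z"),
        "$lt": ISODate("2017-06-26T18:30:00.463Z")
      }
  }}
])
// an IXSCAN (rather than a COLLSCAN) in the winning plan indicates the compound index is used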

MongoDB - Select all documents by the count of an array field

In my current project I have a structure like this:
"squad": {
"members": [
{
"name": "xyz",
"empty": true
},
{
"name": "xyz",
"empty": true
},
{
"name": "xyz",
"empty": true
}
]
}
Now I want to query every squad that has at least, let's say, 3 empty member slots. I've googled and only found aggregate and $size, which seem to count the whole array rather than only the elements matching a field.
Any idea how to do it?
You can try this query :
db.getCollection('collectionName').aggregate([
  { $unwind: "$squad.members" },
  { $group: { _id: "$_id", count: { $sum: { $cond: [{ $eq: ['$squad.members.empty', true] }, 1, 0] } } } },
  { $match: { count: { $gte: 3 } } }
])
This query applies a conditional sum and then checks that the count is greater than or equal to 3.
This will return all documents with more than 3 empty slots (use $gte: 3 instead of $gt: 3 if you want at least 3):
db.squad.aggregate([
  { $unwind: "$squad.members" },
  { $match: { "squad.members.empty": true } },
  { $group: { _id: "$_id", count: { $sum: 1 } } },
  { $match: { count: { $gt: 3 } } }
])