In a MongoDB collection I have documents of the following structure:
{
"_id" : "Suzuki",
"qty" : 10,
"plates" : [
{
"rego" : "1QX-WA-123",
"date" : 1516374000000.0
},
{
"rego" : "1QX-WA-456",
"date" : 1513369800000.0
}
],
"accounts" : [
{
"_id" : "23kpi9MD4KnTvnaW7",
"createdAt" : 1513810712802.0,
"date" : 1503446400000.0,
"type" : "Suzuki",
"rego" : "1QX-WA-123",
},
{
"_id" : "2Wqrd4yofvLmqLm5H",
"createdAt" : 1513810712802.0,
"date" : 1501632000000.0,
"type" : "Suzuki",
"rego" : "1QX-WA-111",
}
]
}
I am trying to filter the objects in the accounts array so that it contains only those objects whose rego exists in the plates array.
I tried the following query; however, it throws an error: all operands of $setIntersection must be arrays. One argument is of type object.
db.getCollection('dummy').aggregate([{
$project: {
plates: 1,
accounts: 1,
intersect: {
$setIntersection: [
{ $arrayElemAt: [ "$plates", 0 ] },
{ $arrayElemAt: [ "$accounts", 4 ] }
]
}
}
}])
The expected output I am looking for is:
{
"_id" : "Suzuki",
"qty" : 10,
"plates" : [
{
"rego" : "1QX-WA-123",
"date" : 1516374000000.0
},
{
"rego" : "1QX-WA-456",
"date" : 1513369800000.0
}
],
"accounts" : [
{
"_id" : "23kpi9MD4KnTvnaW7",
"createdAt" : 1513810712802.0,
"date" : 1503446400000.0,
"type" : "Suzuki",
"rego" : "1QX-WA-123",
}
]
}
There are a couple of ways, but what you are really after is simply $filter instead.
Using $in would likely be the first choice:
db.getCollection('dummy').aggregate([
{ "$addFields": {
"accounts": {
"$filter": {
"input": "$accounts",
"cond": {
"$in": [ "$$this.rego", "$plates.rego" ]
}
}
}
}}
])
Or, if you don't have at least MongoDB 3.4, then use $anyElementTrue:
db.getCollection('dummy').aggregate([
{ "$project": {
"qty": 1,
"plates": 1,
"accounts": {
"$filter": {
"input": "$accounts",
"as": "acc",
"cond": {
"$anyElementTrue": {
"$map": {
"input": "$plates.rego",
"as": "rego",
"in": { "$eq": [ "$$rego", "$$acc.rego" ] }
}
}
}
}
}
}}
])
Or even $setIsSubset:
db.getCollection('dummy').aggregate([
{ "$project": {
"qty": 1,
"plates": 1,
"accounts": {
"$filter": {
"input": "$accounts",
"as": "acc",
"cond": {
"$setIsSubset": [ ["$$acc.rego"], "$plates.rego" ]
}
}
}
}}
])
It's really not a $setIntersection type of operation, since that compares "just the field values" as a "set", and its output is really just that set and not the "objects".
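For illustration, running $setIntersection over just the "rego" values shows the difference; with the sample document, something like:
db.getCollection('dummy').aggregate([
  { "$project": {
    "common": { "$setIntersection": [ "$plates.rego", "$accounts.rego" ] }
  }}
])
would return only the common "set" of values, roughly:
{ "_id" : "Suzuki", "common" : [ "1QX-WA-123" ] }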
You could do something silly with matching array indexes to the produced "set" positions:
db.getCollection('dummy').aggregate([
{ "$addFields": {
"accounts": {
"$map": {
"input": { "$setIntersection": ["$plates.rego", "$accounts.rego"] },
"in": {
"$arrayElemAt": [
"$accounts",
{ "$indexOfArray": [ "$accounts.rego", "$$this" ] }
]
}
}
}
}}
])
But in reality you probably just want the $filter result, as it is far more practical. And if you want that output as a "set", then you can simply wrap the $filter output with $setDifference or a similar operator to make the entries "unique".
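A minimal sketch of that "unique" wrapping, reusing the $in form from above (so MongoDB 3.4+ is assumed), might look like:
db.getCollection('dummy').aggregate([
  { "$addFields": {
    "accounts": {
      "$setDifference": [
        { "$filter": {
          "input": "$accounts",
          "cond": { "$in": [ "$$this.rego", "$plates.rego" ] }
        }},
        []
      ]
    }
  }}
])
Here $setDifference against an empty array simply returns the distinct elements of the filtered result.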
In all variations these return:
{
"_id" : "Suzuki",
"qty" : 10.0,
"plates" : [
{
"rego" : "1QX-WA-123",
"date" : 1516374000000.0
},
{
"rego" : "1QX-WA-456",
"date" : 1513369800000.0
}
],
"accounts" : [
{
"_id" : "23kpi9MD4KnTvnaW7",
"createdAt" : 1513810712802.0,
"date" : 1503446400000.0,
"type" : "Suzuki",
"rego" : "1QX-WA-123"
}
]
}
Showing the items in the "accounts" array "filtered" to those matching the respective "rego" values from the "plates" array.
I have lots of sensors, and every sensor reports data every few seconds.
I need to find the sensors whose data are all zero.
Furthermore, I need to calculate the zero-data ratio for every sensor.
Can any query do this?
Any help will be highly appreciated.
The records are like:
{
"_id" : ObjectId("61353065746e5e18a1d7c4ca"),
"sensor" : "SN54",
"category" : "w",
"data" : "7065",
"time" : ISODate("2021-09-06T05:02:29.308+08:00")
},
{
"_id" : ObjectId("61353065746e5e18a1d7c4c9"),
"sensor" : "SN68",
"category" : "w",
"data" : "0",
"time" : ISODate("2021-09-06T05:02:29.308+08:00")
},
Query (if the data were in an array; we don't need this here after the question update):
Filter to keep only the zeros, divide by the full array size, and multiply by 100.
If you want only the sensors whose data are all zero, add a $match where percentage = 100 (shown after the query below).
Test code here
db.collection.aggregate([
{
"$set": {
"percentage": {
"$multiply": [
{
"$cond": [
{
"$eq": [
"$data",
[]
]
},
0,
{
"$divide": [
{
"$size": {
"$filter": {
"input": "$data",
"as": "d",
"cond": {
"$eq": [
"$$d",
0
]
}
}
}
},
{
"$size": "$data"
}
]
}
]
},
100
]
}
}
}
])
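As noted above, to keep only the sensors whose data are all zero (still assuming the array form of the data), you could append a $match stage after the $set, e.g.:
{
  "$match": { "percentage": 100 }
}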
Edit 1 (for data that is not inside an array)
Test code here
db.collection.aggregate(
[ {
"$group" : {
"_id" : "$sensor",
"nzero" : {
"$sum" : {
"$cond" : [ {
"$eq" : [ "$data", "0" ]
}, 1, 0 ]
}
},
"count" : {
"$sum" : 1
}
}
}, {
"$set" : {
"sensor" : "$_id"
}
}, {
"$project" : {
"_id" : 0
}
}, {
"$project" : {
"sensor" : 1,
"percentage" : {
"$multiply" : [ {
"$divide" : [ "$nzero", "$count" ]
}, 100 ]
}
}
} ]
)
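With the two sample records from the question (one "7065" reading for SN54 and one "0" reading for SN68), this should return roughly the following (document order not guaranteed):
[
  { "sensor" : "SN68", "percentage" : 100 },
  { "sensor" : "SN54", "percentage" : 0 }
]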
I have a MongoDB collection called "conference" with an array of participants, as below:
[
{
"_id" : 5b894357a0c84d5a5d221f25,
"conferenceName" : "myFirstConference",
"startDate" : 1535722327,
"endDate" : 1535722420,
"participants" : [
{
"name" : "user1",
"origin" : "internal",
"ip" : "192.168.0.2"
},
{
"name" : "user2",
"origin" : "external",
"ip" : "172.20.0.3"
},
]
},
...
]
I would like to get the following result :
[
{
"conferenceName" : "myFirstConference",
"startDate" : 1535722327,
"endDate" : 1535722420,
"internalUsersCount" : 1
"externalUsersCount" : 1,
},
...
]
I tried the request below, but it's not working:
db.getCollection("conference").aggregate([
{
$addFields: {
internalUsersCount : {
$size : { "$participants" : {$elemMatch : { origin : "internal" }}}
},
externalUsersCount : {
$size : { "$participants" : {$elemMatch : { origin : "external" }}}
}
}
}
])
How is it possible to count the "participants" array elements that match {"origin" : "internal"} and {"origin" : "external"}?
You need to use the $filter aggregation operator to separate the internal-origin and external-origin participants, along with the $size operator to calculate the length of each filtered array.
Something like this:
db.collection.aggregate([
{ "$addFields": {
"internalUsersCount": {
"$size": {
"$filter": {
"input": "$participants",
"as": "part",
"cond": { "$eq": ["$$part.origin", "internal"]}
}
}
},
"externalUsersCount": {
"$size": {
"$filter": {
"input": "$participants",
"as": "part",
"cond": { "$eq": ["$$part.origin", "external"] }
}
}
}
}}
])
Output
[
{
"conferenceName": "myFirstConference",
"endDate": 1535722420,
"externalUsersCount": 1,
"internalUsersCount": 1,
"startDate": 1535722327
}
]
Consider the following data:
{
"_id" : ObjectId("592ffb3d257acc76fc0eecd7"),
"primaryProcessName" : "BI",
"dateTimeStamp" : ISODate("2017-06-01T11:32:12.834+0000"),
"tag" : [
{
"key" : "processname",
"value" : "NEUpdateService",
"value_original" : "NEUpdateService"
},
{
"key" : "processstageid",
"value" : "inprocess",
"value_original" : "InProcess"
},
]
}
{
"_id" : ObjectId("592ffb3d257acc76fc0eecdd"),
"primaryProcessName" : "BI",
"dateTimeStamp" : ISODate("2017-06-01T11:32:13.345+0000"),
"tag" : [
{
"key" : "processname",
"value" : "CommissionPaymentSend",
"value_original" : "CommissionPaymentSend"
},
{
"key" : "processstageid",
"value" : "faulted",
"value_original" : "Faulted"
},
]
}
{
"_id" : ObjectId("592ffb3d257acc76fc0eece4"),
"primaryProcessName" : "BI",
"dateTimeStamp" : ISODate("2017-06-01T11:32:13.745+0000"),
"tag" : [
{
"key" : "processname",
"value" : "commonbusinessintegratorservice",
"value_original" : "CommonBusinessIntegratorService"
},
{
"key" : "processstageid",
"value" : "inprocess",
"value_original" : "InProcess"
},
]
}
{
"_id" : ObjectId("592ffb3d257acc76fc0eecea"),
"primaryProcessName" : "BI",
"dateTimeStamp" : ISODate("2017-06-01T11:32:13.876+0000"),
"tag" : [
{
"key" : "processname",
"value" : "commonbusinessintegratorservice",
"value_original" : "CommonBusinessIntegratorService"
},
{
"key" : "processstageid",
"value" : "inprocess",
"value_original" : "InProcess"
},
]
}
{
"_id" : ObjectId("592ffb3e257acc76fc0eecf1"),
"primaryProcessName" : "BI",
"dateTimeStamp" : ISODate("2017-06-01T11:32:14.193+0000"),
"tag" : [
{
"key" : "processname",
"value" : "SmartComplianceMessenger",
"value_original" : "SmartComplianceMessenger"
},
{
"key" : "processstageid",
"value" : "complete",
"value_original" : "Complete"
},
]
}
I am trying to write a query to aggregate this data and show it in the following format:
{
"Total" : 1982, "InProcess" : 991, "Complete" : 991, "Faulted" : 0,
"name" : "SmartComplianceMessenger",
"displayName" : "SmartComplianceMessenger",
"drillDownUrl" : "process/forprimary/name/SmartComplianceMessenger"
},
{
"Total" : 122333, "InProcess" : 56375, "Complete" : 54856, "Faulted" : 11102,
"name" : "NEUpdateService",
"displayName" : "NEUpdateService",
"drillDownUrl" : "process/forprimary/name/NEUpdateService"
},
....
This is what I have so far:
db.ActivityNotice.aggregate([
{$match: {
dateTimeStamp: {
$gte: ISODate("2017-06-01T11:00:00.000Z")
, $lt: ISODate("2017-06-01T11:45:00.000Z")
}
}},
{$group :
{
_id: {process: "$primaryProcessName"} //, status:"$processStageId"
, Total:{$sum:1}
, InProcess: {$sum:0}// { $sum: {$cond: [{$eq: ["$processStageId","InProcess"]},1,0]}}
, Complete: {$sum:0} // { $sum: {$cond: [{$eq: ["$processStageId","Complete"]},1,0]}}
, Faulted: {$sum:0} // { $sum: {$cond: [{$eq: ["$processStageId","Faulted"]},1,0]}}
, Test: { $sum: {$cond: [{$eq: ["tag.key","processstageid"]},1,0]}}
}},
{$project: {
_id: 0,
name: "$_id.process", displayName: "$_id.process",
drillDownUrl: { $concat: [ "process/forprimary/name/", "$_id.process" ] },
Total: 1, InProcess: 1 , Complete: 1, Faulted: 1, Test: 1
}}
])
The challenge I am facing is selecting the value for the "processname" key from the tag array into a new field, called processName, and the value for "processstageid" into another new field, so I can do the sum on those values.
Any help would be greatly appreciated.
You want $filter and $size for the most efficient way:
{ "$group": {
"_id": "$primaryProcessName",
"Total": { "$sum": 1 },
"InProcess": {
"$sum": {
"$size": {
"$filter": {
"input": "$tag",
"as": "t",
"cond": {
"$and": [
{ "$eq": [ "$$t.key", "processstageid" ] },
{ "$eq": [ "$$t.value","inprocess"] }
]
}
}
}
}
},
"Complete": {
"$sum": {
"$size": {
"$filter": {
"input": "$tag",
"as": "t",
"cond": {
"$and": [
{ "$eq": [ "$$t.key", "processstageid" ] },
{ "$eq": [ "$$t.value","complete"] }
]
}
}
}
}
},
"Faulted": {
"$sum": {
"$size": {
"$filter": {
"input": "$tag",
"as": "t",
"cond": {
"$and": [
{ "$eq": [ "$$t.key", "processstageid" ] },
{ "$eq": [ "$$t.value","faulted"] }
]
}
}
}
}
}
}}
$filter has its own condition, for which we can use $and to match multiple conditions on different properties of the array element. This reduces the array to only the entries that match, and you can then take the $size of that result.
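Plugged into your existing pipeline (reusing your $match and $project stages, and noting that _id here is just "$primaryProcessName" rather than an object), the whole thing would look roughly like:
db.ActivityNotice.aggregate([
  { "$match": {
    "dateTimeStamp": {
      "$gte": ISODate("2017-06-01T11:00:00.000Z"),
      "$lt": ISODate("2017-06-01T11:45:00.000Z")
    }
  }},
  // the $group stage shown above goes here
  { "$project": {
    "_id": 0,
    "name": "$_id", "displayName": "$_id",
    "drillDownUrl": { "$concat": [ "process/forprimary/name/", "$_id" ] },
    "Total": 1, "InProcess": 1, "Complete": 1, "Faulted": 1
  }}
])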
This is my Document.
{
"_id" : ObjectId("589b6132fafb5a09549b46cb"),
"name" : "foo",
"users" : [
{
"_id" : ObjectId("589b6132fafb5a09549b46cc"),
"name" : "Peter",
"emails" : [
{
"address" : "peter#email.com"
},
{
"address" : "test2#email.com"
}
]
},
{
"_id" : ObjectId("589b6132fafb5a09549b46cd"),
"name" : "Joe",
"emails" : []
}
]
}
I'm unwinding users and users.emails, and when I try to regroup, I get a duplicate of the user named Peter because he has 2 emails.
Query:
db.test.aggregate([
{ "$unwind": {
"path": "$users",
"preserveNullAndEmptyArrays": true
} },
{ "$unwind": {
"path": "$users.emails",
"preserveNullAndEmptyArrays": true
} },
{
"$group": {
"_id": "$_id",
"name": { "$first": "$name" },
"users": { "$addToSet": "$users"},
"allEmails": { "$push": "$users.emails.address" }
}
}
])
Result:
{
"_id" : ObjectId("589b6132fafb5a09549b46cb"),
"name" : "foo",
"users" : [
{
"_id" : ObjectId("589b6132fafb5a09549b46cd"),
"name" : "Joe"
},
{
"_id" : ObjectId("589b6132fafb5a09549b46cc"),
"name" : "Peter",
"emails" : {
"address" : "test2#email.com"
}
},
{
"_id" : ObjectId("589b6132fafb5a09549b46cc"),
"name" : "Peter",
"emails" : {
"address" : "peter#email.com"
}
}
],
"allEmails" : [
"peter#email.com",
"test2#email.com"
]
}
I need the users array to be exactly the same as before the unwind, with allEmails on the parent document, as shown in the following example.
{
"_id" : ObjectId("589b6132fafb5a09549b46cb"),
"name" : "foo",
"users" : [
{
"_id" : ObjectId("589b6132fafb5a09549b46cc"),
"name" : "Peter",
"emails" : [
{ "address" : "test2#email.com" },
{ "address" : "peter#email.com" }
]
},
{
"_id" : ObjectId("589b6132fafb5a09549b46cd"),
"name" : "Joe",
"emails" : []
}
],
"allEmails" : [
"peter#email.com",
"test2#email.com"
]
}
Running the following aggregate pipeline should give you the desired result:
db.test.aggregate([
{
"$addFields": {
"allEmails": {
"$reduce": {
"input": {
"$map": {
"input": "$users",
"as": "user",
"in": "$$user.emails"
}
},
"initialValue": [],
"in": { "$concatArrays": ["$$value", "$$this.address"] }
}
}
}
}
])
The above pipeline works by initially creating a two-dimensional array of email address objects using $map. To show an example of the result produced by applying the expression
{
"$map": {
"input": "$users",
"as": "user",
"in": "$$user.emails"
}
}
run a test pipeline with just a single field that holds the results:
db.test.aggregate([
{
"$project": {
"twoDarray": {
"$map": {
"input": "$users",
"as": "user",
"in": "$$user.emails"
}
}
}
}
])
which will produce the 2D array
{
"_id" : ObjectId("589b6132fafb5a09549b46cb"),
"twoDarray" : [
[
{ "address" : "peter#email.com" },
{ "address" : "test2#email.com" }
],
[]
]
}
Now, denormalise this 2-D array
[
[
{ "address" : "peter#email.com" },
{ "address" : "test2#email.com" }
],
[]
]
by using the $reduce operator, which applies an expression to each element in an array and combines them into a single value. With the help of the $concatArrays operator, you can concatenate each element within the $reduce expression to form the final desired array
[
"peter#email.com",
"test2#email.com"
]
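To see the $reduce mechanics in isolation, you could run it against the 2-D array as a literal (purely illustrative, not part of the final pipeline):
db.test.aggregate([
  { "$project": {
    "allEmails": {
      "$reduce": {
        "input": [
          [ { "address" : "peter#email.com" }, { "address" : "test2#email.com" } ],
          []
        ],
        "initialValue": [],
        "in": { "$concatArrays": [ "$$value", "$$this.address" ] }
      }
    }
  }}
])
which yields "allEmails" : [ "peter#email.com", "test2#email.com" ] for each document, exactly as in the main pipeline above.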
The structure of the MongoDB collection is like this:
collection User
{
"name":"sufaid",
"age":"22",
"address":"zzzz",
"product":[{"id":1,"name":"A"},
{"id":6,"name":"N"},
{"id":3,"name":"D"},
{"id":7,"name":"q"},
]
}
I need to find the users who have product id 3.
The output should be like this:
{
"name":"sufaid",
"age":"22",
"address":"zzzz",
"product":{"id":3,"name":"D"}
}
Note: without using $unwind or a projection like "product.$".
"product.$" throws an error when using pymongo.
Is there any other option?
Use $elemMatch: https://docs.mongodb.com/manual/reference/operator/projection/elemMatch/
For your query:
db.User.find({},{name:1,age:1,address:1,product:{$elemMatch:{id:3}}})
or
db.User.find({},{product:{$elemMatch:{id:3}}})
Output:
{
"name" : "sufaid",
"age" : "22",
"address" : "zzzz",
"product" : [
{
"id" : 3.0,
"name" : "D"
}
]
}
As you require it for aggregation:
db.User.aggregate([
{$unwind:'$product'},
{$match:{'product.id':3}},
{$project:{_id:0,name:1,age:1,address:1,product:1}}
])
Output:
{
"name" : "sufaid",
"age" : "22",
"address" : "zzzz",
"product" : {
"id" : 3.0,
"name" : "D"
}
}
This will give exactly what you indicated in the question.
You could use the aggregation framework, which has a plethora of operators; in particular, you'd need the $filter and $arrayElemAt operators in a $project pipeline.
For instance, you could return just the product field as an embedded document by running the following pipeline:
db.user.aggregate([
{ "$match": { "product.id": 3 } },
{
"$project": {
"name": 1,
"age": 1,
"address": 1,
"product": {
"$arrayElemAt": [
{
"$filter": {
"input": "$product",
"as": "item",
"cond": { "$eq": [ "$$item.id", 3 ] }
}
},
0
]
}
}
}
])
Sample Output
{
"_id" : ObjectId("5829ac89628123dcf8a64b7a"),
"name" : "sufaid",
"age" : "22",
"address" : "zzzz",
"product" : {
"id" : 3,
"name" : "D"
}
}
If you just need an output with the array filtered, skip the $arrayElemAt expression and use the $filter only:
db.user.aggregate([
{ "$match": { "product.id": 3 } },
{
"$project": {
"name": 1,
"age": 1,
"address": 1,
"product": {
"$filter": {
"input": "$product",
"as": "item",
"cond": { "$eq": [ "$$item.id", 3 ] }
}
}
}
}
])
Sample Output
{
"_id" : ObjectId("5829ac89628123dcf8a64b7a"),
"name" : "sufaid",
"age" : "22",
"address" : "zzzz",
"product" : [
{ "id" : 3, "name" : "D" }
]
}
db.User.find({},{product:{$elemMatch:{id:3}}})
It's enough.