Hi guys, I want to calculate the sum of rate for comments, grouped by type, like this:
Post::with(['comments' => function ($q) {
$q->selectRaw('type, SUM(rate) as total_rate')
->groupBy('type');
}])->get();
I'm expecting a result like this:
0 => array:4 [
"id" => 5
"start_date" => "2022-01-01"
"end_date" => "2022-01-31"
"comments" => array:2 [
0 => array:3 [
"type" => "personal"
"total_rate" => 44244.0
]
1 => array:3 [
"type" => "business"
"total_rate" => 22358.0
]
]
But the result is:
0 => array:4 [
"id" => 5
"start_date" => "2022-01-01"
"end_date" => "2022-01-31"
"comments" => []
]
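For reference, eager-loading constraints like this usually come back empty because the selected columns no longer include the foreign key, so Eloquent has nothing to match the aggregated rows against their posts. A minimal sketch, assuming the comments table references posts through a post_id column:

Post::with(['comments' => function ($q) {
    // select and group by the (assumed) post_id foreign key so Eloquent
    // can attach the aggregated rows to their parent posts
    $q->selectRaw('post_id, type, SUM(rate) as total_rate')
      ->groupBy('post_id', 'type');
}])->get();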
I access a looked-up field in $project after using $unwind, but this breaks access to the other nested fields from the main collection. Is there any way to access fields from both collections in $project? I thought of merging the arrays, but I'm still not sure if that's the right approach.
Users collection:
{
"_id" : ObjectId("5a54f739fe0a00373e7ef1e8"),
"team" : {
"name" : "test",
},
"updated_at" : ISODate("2018-05-22T04:28:00Z"),
"created_at" : ISODate("2018-01-09T17:09:13Z"),
"users" : [
{
"updated_at" : ISODate("2018-11-22T11:55:22Z"),
"created_at" : ISODate("2018-01-09T17:09:13Z"),
"_id" : ObjectId("5a54f739fe0a00373e7ef1e9"),
"name" : test,
"status" : "active",
"title" : "Engineer",
},
{
"updated_at" : ISODate("2018-11-22T11:55:22Z"),
"created_at" : ISODate("2018-01-09T17:09:13Z"),
"_id" : ObjectId("5a54f739fe0a00373e7ef1e9"),
"name" : test1,
"status" : "passive",
"title" : "Tester",
}
]
}
Comments collection:
{
"_id" : ObjectId("6062178fc73fe806e45c9b69"),
"userId" : "5a54f739fe0a00373e7ef1e9",
"text" : "this is a test",
"status" : "1",
"timestamp" : ISODate("2021-03-29T18:08:14.317Z")
}
Pipeline
$pipeline = [['$match' => [
'users' => [
'$elemMatch' => [
'field1' => $field1,
],
]
]
],
['$unwind' => '$users'],
['$match' => [
'users.field1' => $field1,
]
],
['$addFields' => ['userId' => ['$toString' => '$userId' ]]],
['$lookup' => [
'from' => 'comments',
'localField' => 'userId',
'foreignField' => 'userId',
'as' => 'userComments'
]
],
['$unwind' => '$userComments'],
['$project' => [
'comments' => [
'$switch' => [
'branches' => [
[ 'case' => [
'$eq' => ['$userComments.status','verified']
],
'then' => 1],
[ 'case' => [
'$lte' => ['$userComments.status', '']
],
'then' => 1],
],
'default' => 0
]
],
'status' => '$users.status',
'total' => [false],
]
],
['$group' => [
'_id' => $groupBy,
'text' => ['$sum' => '$comments'],
'total' => ['$sum' => '$total'],
'completed' => ['$sum' => '$status'],
]
],
];
result
{"_id" :"categories","text": 21,"total": 100,"completed":50}
I'm using MongoDB and Laravel. I'm getting data in periods of time, usually 30 days, and I want to group the data by day. I've tried $project with $dayOfMonth and grouping by day, but that groups them by day of the month, while I want them ordered by day within the period.
Is there a way?
[
'$match' => [
"created_at" => [
'$gte' => new \MongoDB\BSON\UTCDateTime($thisMonth),
]
],
],
[
'$project' => [
'day' => [
'$dayOfMonth' => [
'date' => '$created_at',
]
],
]
],
[
'$group' => [
'_id' => '$day',
'count' => ['$sum' => 1],
],
],
sample:
{#940
flag::STD_PROP_LIST: false
flag::ARRAY_AS_PROPS: true
iteratorClass: "ArrayIterator"
storage: array:16 [
"_id" => ObjectId {#933
+"oid": "5b2ff00e35826377be16ff82"
}
"orderNumber" => "10000"
"userName" => "dGFraHRlLTkwMDQ2NDcyOJRbugaZlHWdMR+nCNzaUfY="
"updated_at" => UTCDateTime {#938
+"milliseconds": "1529868539000"
}
"created_at" => UTCDateTime {#939
+"milliseconds": "1529868302000"
}
]
}
You can use $dateToString to convert a date object to a string in your desired format.
[ "$match" => [
"created_at" => [ '$gte' => new \MongoDB\BSON\UTCDateTime($thisMonth) ]
]],
[ "$group" => [
"_id" => [ "$dateToString" => [ "format" => "%Y-%m-%d", "date" => "$created_at" ] ],
"count" => [ "$sum" => 1 ]
]]
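Because the _id produced this way is a "%Y-%m-%d" string, sorting on it keeps the buckets in chronological order across the whole period instead of by day of month. A sketch of the extra stage:

[ "$sort" => [ "_id" => 1 ] ]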
Some MongoDB documents look like this:
{
"_id" : "class1",
"member" : [
[
"Tom",
"1057004"
],
[
"Dean",
"1274858"
],
[
"Bob",
"1276155"
],
[
"Alice",
"1046496"
],
[
"Max",
"1276063"
]
],
"teacher" : "Jack",
"year" : "2014"
}
I want a fuzzy search on the members' names; for example, searching for just "al" should return this document.
I can use a regex for a normal search, but how do I use one on these nested lists?
Thanks.
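One possible approach (a sketch only, using the MongoDB PHP library with an assumed database test and collection classes): since member is an array of arrays, a nested $elemMatch lets a case-insensitive regex run against the inner strings, so searching "al" matches the ["Alice", "1046496"] entry.

$filter = [
    'member' => [
        '$elemMatch' => [
            // the outer $elemMatch selects one member entry (itself an array),
            // the inner $elemMatch applies the regex to that entry's strings
            '$elemMatch' => [
                '$regex'   => 'al',
                '$options' => 'i',
            ],
        ],
    ],
];
$cursor = (new MongoDB\Client)->test->classes->find($filter);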
Hey, I'm having trouble getting my aggregation right.
I have this dataset, and within the collection there are a few million other similar documents:
{
"_id": ObjectId("5757c73344ce54ae1d8b456c"),
"hostname": "Baklap4",
"timestamp": NumberLong(1465370500),
"networkList": [
{
"name": "46.243.152.13",
"openConnections": NumberLong(3)
},
{
"name": "46.243.152.50",
"openConnections": NumberLong(4)
}
],
"webserver": "nginx",
"deviceList": [
{
"deviceName": "eth0",
"receive": NumberLong(183263),
"transmit": NumberLong(781595)
},
{
"deviceName": "wlan0",
"receive": NumberLong(0),
"transmit": NumberLong(0)
}
]
}
What I want:
I'd like to get a result set with an average of every numeric value, computed over all documents that fall within the same 300-second timespan.
[
[
'$match' => [
'timestamp' => ['$gte' => $todayMidnight],
'hostname' => $serverName
]
],
[
'$unwind' => '$networkList'
],
[
'$unwind' => '$deviceList'
],
[
'$group' => [
'_id' => [
'interval' => [
'$subtract' => [
'$timestamp',
[
'$mod' => ['$timestamp', 300]
]
]
],
'network' => '$networkList.name',
'device' => '$deviceList.deviceName',
],
'openConnections' => [
'$sum' => '$networkList.openConnections'
],
'cpuLoad' => [
'$avg' => '$cpuLoad'
],
'bytesPerSecond' => [
'$avg' => '$bytesPerSecond'
],
'requestsPerSecond' => [
'$avg' => '$requestsPerSecond'
],
'webserver' => [
'$last' => '$webserver'
],
'timestamp' => [
'$max' => '$timestamp'
]
]
],
[
'$project' => [
'_id' => 0,
'timestamp' => 1,
'cpuLoad' => 1,
'bytesPerSecond' => 1,
'requestsPerSecond' => 1,
'webserver' => 1,
'openConnections' => 1,
'networkList' => '$_id.network',
'deviceList' => '$_id.device',
]
],
[
'$sort' => [
'timestamp' => -1
]
]
];
Yet this doesn't give me a list with all devices and, per device, an average of received and transmitted bytes.
How would one get those?
For the given example I was able to get a result using this mongo shell query:
var projectTime = {
$project : {
_id : 1,
hostname : 1,
timestamp : 1,
networkList : 1,
webserver : 1,
deviceList : 1,
isoDate : {
$add : [new Date(0), {
$multiply : ["$timestamp", 1000]
}
]
}
}
}
var group = {
$group : {
"_id" : {
time : {
"$add" : [{
"$subtract" : [{
"$subtract" : ["$isoDate", new Date(0)]
}, {
"$mod" : [{
"$subtract" : ["$isoDate", new Date(0)]
},
1000 * 60 * 5 // 1000 milsseconds * 60 seconds * 5 minutes
]
}
]
},
new Date(0)
]
},
"hostname" : "$hostname",
"deviceList_deviceName" : "$deviceList.deviceName",
"networkList_name" : "$networkList.name",
},
xreceive : {
$sum : "$deviceList.receive"
},
xtransmit : {
$sum : "$deviceList.transmit"
},
xopenConnections : {
$avg : "$networkList.openConnections"
},
}
}
var unwindNetworkList = {
$unwind : "$networkList"
}
var unwindDeviceList = {
$unwind : "$deviceList"
}
var match = {
$match : {
"_id.time" : ISODate("2016-06-09T08:05:00.000Z")
}
}
var finalProject = {
$project : {
_id : 0,
timestamp : "$_id.time",
hostname : "$_id.hostname",
deviceList_deviceName : "$_id.deviceList_deviceName",
networkList_name : "$_id.networkList_name",
xreceive : 1,
xtransmit : 1,
xopenConnections : 1
}
}
db.baklap.aggregate([projectTime, unwindNetworkList,
unwindDeviceList,
group,
match,
finalProject
])
The output is then:
{
"xreceive" : NumberLong(0),
"xtransmit" : NumberLong(0),
"xopenConnections" : 4.0,
"timestamp" : ISODate("2016-06-09T08:05:00.000Z"),
"hostname" : "Baklap4",
"deviceList_deviceName" : "wlan0",
"networkList_name" : "46.243.152.50"
}
{
"xreceive" : NumberLong(183263),
"xtransmit" : NumberLong(781595),
"xopenConnections" : 4.0,
"timestamp" : ISODate("2016-06-09T08:05:00.000Z"),
"hostname" : "Baklap4",
"deviceList_deviceName" : "eth0",
"networkList_name" : "46.243.152.50"
}
{
"xreceive" : NumberLong(183263),
"xtransmit" : NumberLong(781595),
"xopenConnections" : 3.0,
"timestamp" : ISODate("2016-06-09T08:05:00.000Z"),
"hostname" : "Baklap4",
"deviceList_deviceName" : "eth0",
"networkList_name" : "46.243.152.13"
}
{
"xreceive" : NumberLong(0),
"xtransmit" : NumberLong(0),
"xopenConnections" : 3.0,
"timestamp" : ISODate("2016-06-09T08:05:00.000Z"),
"hostname" : "Baklap4",
"deviceList_deviceName" : "wlan0",
"networkList_name" : "46.243.152.13"
}
The main point is to be aware that every time $unwind is processed, our data gets a bit of pollution. This can have a side effect when summing data (the averages stay the same, since (2+2+3+3)/4 is the same as (2+3)/2, but the sums are inflated).
To check that, you could add x : {$push : "$$ROOT"} in the $group stage and inspect the values after the pipeline has executed, as you will then have all the source documents for a given time period.
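In the PHP pipeline from the question, that debug accumulator would look roughly like this (the extra x field is the only addition; the grouping key is the one already used above):

['$group' => [
    '_id' => [
        'interval' => ['$subtract' => ['$timestamp', ['$mod' => ['$timestamp', 300]]]],
        'network'  => '$networkList.name',
        'device'   => '$deviceList.deviceName',
    ],
    'openConnections' => ['$sum' => '$networkList.openConnections'],
    // debug only: keep every unwound source document that fed this bucket,
    // which makes the duplication introduced by the two $unwind stages visible
    'x' => ['$push' => '$$ROOT'],
]],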