I'm using MongoDB and I have documents similar to the following:
{
"files": ["Customers", "Items", "Contacts"],
"counts": [1354, 892, 1542],
...
}
Using an aggregation pipeline stage, I want to convert the above into something more like this:
{
"file_info": [
{"file_name": "Customers", "record_counts": 1354},
{"file_name": "Items", "record_counts": 892},
{"file_name": "Contacts", "record_counts": 1542}
]
}
I've tried using $map, $reduce, and $arrayToObject but without any success. What operators can I use to get from where I currently am to where I need to be?
You can use $zip to combine two arrays and $map to get the new structure:
{
$project: {
file_info: {
$map: {
input: { $zip: { inputs: [ "$files", "$counts" ] } },
in: {
file_name: { $arrayElemAt: [ "$$this", 0 ] },
record_counts: { $arrayElemAt: [ "$$this", 1 ] },
}
}
}
}
}
Mongo Playground
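For reference, $zip pairs up the elements by position, so the array handed to $map's input looks like this (each inner pair then becomes one file_info object):
[
  ["Customers", 1354],
  ["Items", 892],
  ["Contacts", 1542]
]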
In my Node.js application I am using a MongoDB database with two collections, products and market_companies. Both collections have the fields hub_id and dimensions, where dimensions is an object field. I have created an aggregation pipeline in which I compare the hub_id and dimensions fields of the two collections, but hub_id is an integer in the products collection and a string in market_companies, so I am not getting the desired output.
I want to know how I can convert hub_id from string to integer in the market_companies collection.
Below is my code:
db.products.aggregate([
{
$lookup: {
from: "market_companies",
let: {
hubId: "$hubId",
dimensions: "$dimensions"
},
as: "companies",
pipeline: [
{
$match: {
$expr: {
$and: [
{
$eq: [
"$hubId", // How to convert this into Integer
"$$hubId"
]
},
{
$setEquals: [
{
"$objectToArray": {
$ifNull: [
"$dimensions",
{}
]
}
},
{
"$objectToArray": {
$ifNull: [
"$$dimensions",
{}
]
}
}
]
}
]
}
}
},
]
}
}
])
Any help is appreciated.
You can do it with the $toInt operator:
{
$eq: [
{ $toInt: "$hubId" },
"$$hubId"
]
},
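If some hubId values in market_companies might not be valid numeric strings, a more defensive sketch uses $convert with an onError fallback instead of $toInt, so the $lookup sub-pipeline does not fail on bad data:
{
  $eq: [
    {
      $convert: {
        input: "$hubId",   // string hubId from market_companies
        to: "int",
        onError: null,     // fall back to null instead of raising an error
        onNull: null
      }
    },
    "$$hubId"              // integer hubId from products
  ]
},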
I have a structure where I want to match the value of a root-level field with the value of a field inside another object in the same document; I got to that structure by unwinding the nested field. So I have a structure like this:
{
"name": "somename",
"level": "123",
"nested":[
{
"somefield": "test",
"file": {
level:"123"
}
},
{
"somefield": "test2",
"file": {
level:"124"
}
}
]
}
After unwinding I got a structure like this:
{
"name": "somename",
"level": "123",
"nested": {
"somefield": "test",
"file": {
level:"123"
}
}
}
So I want to match on level = nested.file.level and return only documents which satisfy this condition.
I tried using
$match: {
"nested.file.level": '$level'
}
also
$project: {
nested: {
$cond: [{
$eq: [
'nested.file.level',
'$level'
]
},
'$nested',
null
]
}
}
Nothing seems to work. Any idea on how I can match based on the mentioned criteria?
Solution 1: With $unwind stage
After the $unwind stage, you need to use the $expr operator in the $match stage.
{
$match: {
$expr: {
$eq: [
"$nested.file.level",
"$level"
]
}
}
}
Demo Solution 1 # Mongo Playground
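For completeness, a minimal end-to-end sketch, assuming the $unwind on nested runs first:
db.collection.aggregate([
  { $unwind: "$nested" },
  {
    $match: {
      $expr: {
        $eq: [
          "$nested.file.level",
          "$level"
        ]
      }
    }
  }
])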
Solution 2: Without $unwind stage
Without an $unwind stage, you can work with the $filter operator.
db.collection.aggregate([
{
$match: {
$expr: {
$in: [
"$level",
"$nested.file.level"
]
}
}
},
{
$project: {
nested: {
$filter: {
input: "$nested",
cond: {
$eq: [
"$$this.file.level",
"$level"
]
}
}
}
}
}
])
Demo Solution 2 # Mongo Playground
I need some help:
I want to optimize this query to be faster. It needs to filter by events.eventType: "log" on all docs with server: "strong", but without separate $unwind and filter stages; maybe a $filter can somehow be added inside the $reduce stage.
example single document:
{
  server: "strong",
  events: [
    {
      eventType: "log",
      createdAt: "2022-01-23T10:26:11.214Z",
      visitorInfo: {
        visitorId: "JohnID"
      }
    }
    // ... (the full document contains more events)
  ]
}
current aggregation query:
db.collection.aggregate([
{
$match: {
server: "strong"
}
},
{
$project: {
total: {
$reduce: {
input: "$events",
initialValue: {
visitor: [],
uniquevisitor: []
},
in: {
visitor: {
$concatArrays: [
"$$value.visitor",
[
"$$this.visitorInfo.visitorId"
]
]
},
uniquevisitor: {
$cond: [
{
$in: [
"$$this.visitorInfo.visitorId",
"$$value.uniquevisitor"
]
},
"$$value.uniquevisitor",
{
$concatArrays: [
"$$value.uniquevisitor",
[
"$$this.visitorInfo.visitorId"
]
]
}
]
}
}
}
}
}
}
])
expected output, two lists: one with the unique visitorIds and one with all visitorIds:
[
{
"total": {
"uniquevisitor": [
"JohnID"
],
"visitor": [
"JohnID",
"JohnID"
]
}
}
]
playground
In the example query, no filter is applied for events.eventType: "log". How can this be implemented without $unwind?
I am not sure this approach is more optimized than yours, but it might help:
$filter to iterate over events and keep only the ones matching eventType
$let to declare a variable events and store the above filter's result
return the array of visitors using dot notation $$events.visitorInfo.visitorId
return the array of unique visitors uniquevisitor using dot notation $$events.visitorInfo.visitorId and the $setUnion operator
db.collection.aggregate([
{ $match: { server: "strong" } },
{
$project: {
total: {
$let: {
vars: {
events: {
$filter: {
input: "$events",
cond: { $eq: ["$$this.eventType", "log"] }
}
}
},
in: {
visitor: "$$events.visitorInfo.visitorId",
uniquevisitor: {
$setUnion: "$$events.visitorInfo.visitorId"
}
}
}
}
}
}
])
Playground
Or a similar approach without $let, using two $project stages:
db.collection.aggregate([
{ $match: { server: "strong" } },
{
$project: {
events: {
$filter: {
input: "$events",
cond: { $eq: ["$$this.eventType", "log"] }
}
}
}
},
{
$project: {
total: {
visitor: "$events.visitorInfo.visitorId",
uniquevisitor: {
$setUnion: "$events.visitorInfo.visitorId"
}
}
}
}
])
Playground
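If you prefer to keep the $reduce from your query, a minimal sketch of the same idea is to wrap its input in the $filter, so only "log" events get reduced; the initialValue and in parts stay exactly as in the question:
$reduce: {
  input: {
    $filter: {
      input: "$events",
      cond: { $eq: ["$$this.eventType", "log"] }
    }
  },
  // initialValue and in: unchanged from the original pipeline
}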
Hi, I am trying to use a MongoDB query inside TIBCO Jaspersoft Studio to create a report.
What I am trying to do is filter the data using two parameters, #orderitemuid and #ordercatuid. My case is: if I pass #orderitemuid, the #ordercatuid parameter should be disregarded; vice versa, if I pass #ordercatuid, the #orderitemuid parameter should be disregarded. There is also the option of using both parameters in the query. I used a $switch inside the $match, but I am getting an error. Below is the $match I am using:
{
$match: {
$switch: {
branches: [
{
case: { $eq: [{ $IfNull: [$P{orderitemuid}, 0] }, 0] },
then: { 'ordcat._id': { '$eq': { '$oid': $P{ordercatuid} } } },
},
{
case: { $eq: [{ $IfNull: [$P{ordercatuid}, 0] }, 0] },
then: { '_id': { '$eq': { '$oid': $P{orderitemuid} } } },
},
],
default: {
$expr: {
$and: [
{ $eq: ['_id', { '$oid': $P{orderitemuid} }] },
{ $eq: ['ordcat_id', { '$oid': $P{ordercatuid} }] },
],
},
},
},
},
}
Thank you in advance
As mentioned in the $match docs
$match takes a document that specifies the query conditions. The query syntax is identical to the read operation query syntax; i.e. $match does not accept raw aggregation expressions. ...
And $switch is an aggregation expression, which means it cannot be used in a $match stage without being wrapped in $expr.
You can, however, wrap it with $expr; this also requires you to restructure the return values a little bit, like so:
db.collection.aggregate([
{
$match: {
$expr: {
$switch: {
branches: [
{
case: {
$eq: [
{
$ifNull: [
$P{orderitemuid},
0
]
},
0
]
},
then: {
$eq: [
"$ordcat._id",
{"$oid":$P{ordercatuid}}
]
}
},
{
case: {
$eq: [
{
"$ifNull": [
$P{ordercatuid},
0
]
},
0
]
},
then: {
$eq: [
"$_id",
{"$oid":$P{orderitemuid}}
]
}
}
],
default: {
$and: [
{
$eq: [
"$_id",
{"$oid": $P{orderitemuid} }
]
},
{
$eq: [
"$ordcat_id",
{"$oid": $P{ordercatuid}}
]
}
]
}
}
}
}
}
])
Mongo Playground
I want to find objects that match the following query and then set the events property to an empty array if it matches. Currently the query I'm using only updates the first embedded object in each document, but I need it to check every embedded object in each document. Here is the query; does anyone know how I can make this work?
const date_check = moment().subtract(10, 'minutes').format('X')
await Accumulator.updateMany(
{ 'data.lastUpdate': { $lt: date_check } },
{
$set: {
'data.$.events': []
}
}
)
The document looks like this...
{
bookmaker: 'Bet365',
sport: 'Football',
data: [
{
lastUpdate: '2372273273',
events: [
... // some event objects
]
},
{
lastUpdate: '2372234421',
events: [
... // some event objects
]
},
{
lastUpdate: '2375343461',
events: [
... // some event objects
]
}
]
}
I think it's best to change your schema so that lastUpdate is a number, for performance and to avoid bugs; check $toInt. You can do the conversion with code similar to the second query below.
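For example, a one-time pipeline update, a sketch assuming every lastUpdate is a valid numeric string, could convert the values in place:
db.collection.updateMany({}, [
  {
    $set: {
      data: {
        $map: {
          input: "$data",
          in: {
            $mergeObjects: [
              "$$this",
              { lastUpdate: { $toInt: "$$this.lastUpdate" } }
            ]
          }
        }
      }
    }
  }
])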
Query
arrayFilters way
replace "2372273273" with date_check
filter to keep the matching documents
use arrayFilters to apply the change only to the array members that pass the filter
db.collection.update({
"data.lastUpdate": {
"$lt": "2372273273"
}
},
{
$set: {
"data.$[data].events": []
}
},
{
"arrayFilters": [
{
"data.lastUpdate": {
"$lt": "2372273273"
}
}
]
})
Query
alternative pipeline update way with $map
replace "2372273273" with date_check
filter to find the matching documents
update only the array members for which the filter is true
pipeline updates require MongoDB >= 4.2
PlayMongo
db.collection.update({
"data.lastUpdate": {
"$lt": "2372273273"
}
},
[
{
"$set": {
"data": {
"$map": {
"input": "$data",
"in": {
"$cond": [
{
"$lt": [
"$$this.lastUpdate",
"2372273273"
]
},
{
"$mergeObjects": [
"$$this",
{
"events": []
}
]
},
"$$this"
]
}
}
}
}
}
])
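Usage note, assuming the Node side from the question: once lastUpdate is stored as a number, date_check should be numeric too, and the arrayFilters variant then looks roughly like this:
// moment().unix() returns seconds since the epoch as a Number,
// unlike .format('X'), which returns a string
const date_check = moment().subtract(10, 'minutes').unix();

await Accumulator.updateMany(
  { 'data.lastUpdate': { $lt: date_check } },
  { $set: { 'data.$[data].events': [] } },
  { arrayFilters: [ { 'data.lastUpdate': { $lt: date_check } } ] }
);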