I want to find objects that match the following query and then set the events property to an empty array for each match. Currently the query I'm using will only update the first embedded object in each document, but I need it to check every embedded object. Here is the query; does anyone know how I can make this work?
const date_check = moment().subtract(10, 'minutes').format('X')
await Accumulator.updateMany(
{ 'data.lastUpdate': { $lt: date_check } },
{
$set: {
'data.$.events': []
}
}
)
The document looks like this...
{
bookmaker: 'Bet365',
sport: 'Football',
data: [
{
lastUpdate: '2372273273',
events: [
... // some event objects
]
},
{
lastUpdate: '2372234421',
events: [
... // some event objects
]
},
{
lastUpdate: '2375343461',
events: [
... // some event objects
]
}
]
}
I think it's best to change your schema so that lastUpdate is a number, both for performance and to avoid bugs (see $toInt); you can convert it with code similar to the second query.
Query
arrayFilters way
replace "2372273273" with date_check
filter to keep the matching documents
use arrayFilters to apply the change only to the array members that pass the filter
db.collection.update({
"data.lastUpdate": {
"$lt": "2372273273"
}
},
{
$set: {
"data.$[data].events": []
}
},
{
"arrayFilters": [
{
"data.lastUpdate": {
"$lt": "2372273273"
}
}
]
})
Query
alternative pipeline update way with $map
replace "2372273273" with date_check
filter to find the documents
update only the members for which the filter is true
pipeline update requires MongoDB >= 4.2
PlayMongo
db.collection.update({
"data.lastUpdate": {
"$lt": "2372273273"
}
},
[
{
"$set": {
"data": {
"$map": {
"input": "$data",
"in": {
"$cond": [
{
"$lt": [
"$$this.lastUpdate",
"2372273273"
]
},
{
"$mergeObjects": [
"$$this",
{
"events": []
}
]
},
"$$this"
]
}
}
}
}
}
])
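To see what the $map/$cond stage computes, here is a plain-Python sketch of the same transformation (illustration only, run against a sample document shaped like the one in the question; the string comparison mirrors the string $lt in the pipeline):

```python
def clear_stale_events(doc, date_check):
    # Emulates the $map/$cond stage: for each member of data,
    # empty its events array when lastUpdate < date_check.
    doc = dict(doc)
    doc["data"] = [
        {**member, "events": []} if member["lastUpdate"] < date_check else member
        for member in doc["data"]
    ]
    return doc

doc = {
    "bookmaker": "Bet365",
    "data": [
        {"lastUpdate": "2372273273", "events": [1, 2]},
        {"lastUpdate": "2372234421", "events": [3]},
        {"lastUpdate": "2375343461", "events": [4]},
    ],
}
result = clear_stale_events(doc, "2372273273")
# only the member with lastUpdate below the cutoff gets its events emptied
```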
Related
This is my collection in my db:
{
"items": [
{
"id": "1",
"audit": [
{
"validFrom": ISODate("2021-01-20T14:24:57.483Z"),
"validTo": ISODate("2024-01-20T14:24:57.483Z")
}
]
},
{
"id": "1",
"audit": [
{
"validFrom": ISODate("2021-01-19T14:24:57.483Z"),
"validTo": ISODate("2024-01-19T14:24:57.483Z")
}
]
}
]
}
Part 1:
I wanted to query on validFrom and, while querying, display only that specific audit element. I tried these queries:
This query returned only the first element that matched the condition
db.Balances.find({"items.audit.validTo":{"$lte": ISODate("2024-01-20T14:24:57.483Z")}},{"items.$":1})
This query returned all the data in the collection, irrespective of the filter
db.Balances.find({"items.audit.validTo":{"$lte": ISODate("2024-01-20T14:24:57.483Z")}},{"items":1})
Part 2:
After getting the desired result, I want to display the audit list alone instead of the entire item list
Expected Output:
"audit": [
{
"validFrom": ISODate("2021-01-20T14:24:57.483Z"),
"validTo": ISODate("2024-01-20T14:24:57.483Z")
}
]
This is one way of doing it using an aggregation pipeline.
Unwind the items array.
Match the elements meeting the criteria.
Filter the audit array to keep only the entries meeting the criteria.
db.collection.aggregate([
{
"$unwind": "$items"
},
{
"$match": {
"items.audit.validTo": {
"$lte": ISODate("2024-01-20T14:24:57.483Z")
}
}
},
{
"$project": {
"audit": {
"$filter": {
"input": "$items.audit",
"as": "elem",
"cond": {
"$lte": [
"$$elem.validTo",
ISODate("2024-01-20T14:24:57.483Z")
]
}
}
},
_id: 0
}
}
])
Playground link.
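As a sanity check on the pipeline logic, here is a plain-Python sketch of the unwind/match/filter steps (the helper name and sample data are illustrative, using the cutoff date from the question):

```python
from datetime import datetime

CUTOFF = datetime(2024, 1, 20, 14, 24, 57)

def matching_audits(doc):
    # $unwind items, $match on validTo, then $filter the audit array.
    out = []
    for item in doc["items"]:
        audit = [a for a in item["audit"] if a["validTo"] <= CUTOFF]
        if audit:  # keep only items where at least one audit entry matched
            out.append({"audit": audit})
    return out

doc = {
    "items": [
        {"id": "1", "audit": [{"validFrom": datetime(2021, 1, 20),
                               "validTo": datetime(2024, 1, 20, 14, 24, 57)}]},
        {"id": "1", "audit": [{"validFrom": datetime(2021, 1, 19),
                               "validTo": datetime(2024, 1, 19, 14, 24, 57)}]},
    ]
}
result = matching_audits(doc)
```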
I have a structure where I want to match the value of a root-level field with the value of a field inside another object in the same document. I got to this structure by unwinding on the nested field, so I have a structure like this:
{
"name": "somename",
"level": "123",
"nested":[
{
"somefield": "test",
"file": {
level:"123"
}
},
{
"somefield": "test2",
"file": {
level:"124"
}
}
]
}
After unwinding I got the structure like:
{
"name": "somename",
"level": "123",
"nested": {
"somefield": "test",
"file": {
level:"123"
}
}
}
So I want to match on level = nested.file.level and return only documents which satisfy this condition.
I tried using
$match: {
"nested.file.level": '$level'
}
also
$project: {
nested: {
$cond: [{
$eq: [
'nested.file.level',
'$level'
]
},
'$nested',
null
]
}
}
Nothing seems to work. Any idea on how I can match based on the mentioned criteria?
Solution 1: With $unwind stage
After $unwind stage, in the $match stage you need to use the $expr operator.
{
$match: {
$expr: {
$eq: [
"$nested.file.level",
"$level"
]
}
}
}
Demo Solution 1 # Mongo Playground
Solution 2: Without $unwind stage
Without the $unwind stage, you can work with the $filter operator.
db.collection.aggregate([
{
$match: {
$expr: {
$in: [
"$level",
"$nested.file.level"
]
}
}
},
{
$project: {
nested: {
$filter: {
input: "$nested",
cond: {
$eq: [
"$$this.file.level",
"$level"
]
}
}
}
}
}
])
Demo Solution 2 # Mongo Playground
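A plain-Python sketch of what Solution 2 computes (keep the document only if some nested member's file.level equals the root-level level, then keep just those members; helper name is illustrative):

```python
def filter_nested(doc):
    # $match with $expr/$in: keep the document only if some member matches;
    # $filter: keep only the matching members.
    kept = [m for m in doc["nested"] if m["file"]["level"] == doc["level"]]
    return {**doc, "nested": kept} if kept else None

doc = {
    "name": "somename",
    "level": "123",
    "nested": [
        {"somefield": "test", "file": {"level": "123"}},
        {"somefield": "test2", "file": {"level": "124"}},
    ],
}
result = filter_nested(doc)
```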
I have the following document:
{'software_house': 'k1',
'client_id': '1234',
'transactions': [
{'antecedents': 12345,
'consequents': '015896018',
'antecedent support': 0.0030889166727954697},
{'antecedents': '932696735',
'consequents': '939605046',
'antecedent support': 0.0012502757961314996}
...
]}
The 'transactions' key stores an array of items, each with three fields.
I would like to update each item in the 'transactions' array that matches on 'software_house', 'client_id', 'transactions.antecedents' and 'transactions.consequents':
Overwriting the element within the array if it exists
Appending a new value to 'transactions' if it doesn't
How could I achieve that using pymongo?
You can do this with an update using an aggregation pipeline: first $filter out the matched element, then $setUnion in the item you want to upsert.
PyMongo:
db.collection.update_many(filter = {
# the criteria you want to match outside the array
"software_house": "k1",
"client_id": "1234"
},
update = [
{
"$addFields": {
"transactions": {
"$filter": {
"input": "$transactions",
"as": "t",
"cond": {
"$not": {
"$and": [
# the criteria you want to match in the array
{
"$eq": [
"$$t.antecedents",
12345
]
},
{
"$eq": [
"$$t.consequents",
"015896018"
]
}
]
}
}
}
}
}
},
{
"$addFields": {
"transactions": {
"$setUnion": [
"$transactions",
[
{
"antecedents": 12345,
"consequents": "the entry you want to upsert",
"antecedent support": -1
}
]
]
}
}
}
])
Native MongoDB query:
db.collection.update({
// the criteria you want to match outside array
"software_house": "k1",
"client_id": "1234"
},
[
{
"$addFields": {
"transactions": {
"$filter": {
"input": "$transactions",
"as": "t",
"cond": {
$not: {
$and: [
// the criteria you want to match in array
{
$eq: [
"$$t.antecedents",
12345
]
},
{
$eq: [
"$$t.consequents",
"015896018"
]
}
]
}
}
}
}
}
},
{
"$addFields": {
"transactions": {
"$setUnion": [
"$transactions",
[
{
"antecedents": 12345,
"consequents": "the entry you want to upsert",
"antecedent support": -1
}
]
]
}
}
}
],
{
multi: true
})
Here is the Mongo playground for your reference.
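The two $addFields stages boil down to "remove any existing element with the same key pair, then add the new one". A plain-Python sketch of that logic (the new_entry values are placeholders, as in the answer above):

```python
def upsert_transaction(transactions, new_entry):
    # Stage 1 ($filter): drop any element matching the same
    # antecedents/consequents pair.
    kept = [
        t for t in transactions
        if not (t["antecedents"] == new_entry["antecedents"]
                and t["consequents"] == new_entry["consequents"])
    ]
    # Stage 2 ($setUnion): add the new entry.
    return kept + [new_entry]

transactions = [
    {"antecedents": 12345, "consequents": "015896018", "antecedent support": 0.003},
    {"antecedents": "932696735", "consequents": "939605046", "antecedent support": 0.00125},
]
new_entry = {"antecedents": 12345, "consequents": "015896018", "antecedent support": -1}
result = upsert_transaction(transactions, new_entry)
```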
I have an array which looks like this
const posts = [{ _id: '1', viewsCount: 52 }, ...]
Which corresponds to mongodb documents in posts collection
{
_id: '1',
title: 'some title',
body: '....',
}, ...
I want to perform an aggregation which would result in documents fetched from the posts collection to have a viewsCount field. I'm not sure how I should form my aggregation pipeline:
[
{ $match: {} },
{ $addFields: { viewsCount: ?? } }
]
UPDATE
So far the following code almost does the trick:
[
{ $match: {} },
{ $addFields: { viewsCount: { $arrayElemAt: [posts, { $indexOfArray: [ posts, '$_id' ] } ] } } },
]
But viewsCount in this case turns out to be an object, so I guess I need to add a $project stage
UPDATE
I've found one possible solution, which is to use the $addFields stage twice, overriding the first viewsCount:
[
{ $match: {} },
{ $addFields: { viewsCount: { $arrayElemAt: [posts, { $indexOfArray: [ posts, '$_id' ] } ] } } },
{ $addFields: { viewsCount: '$viewsCount.viewsCount' } }
]
But is there a better/more concise solution?
UPDATE
This pipeline actually works correct:
[
{ $match: {} },
{ $addFields: { viewsCount: { $arrayElemAt: [posts, { $indexOfArray: [ postsIds, '$_id' ] } ] } } },
{ $addFields: { viewsCount: '$viewsCount.viewsCount' } }
]
I have updated the second stage by replacing posts with postsIds
To have a more concise (one-stage) solution you can use the $let operator, which lets you define a temporary variable that can then be used inside your expression. Try:
db.posts.aggregate([
{ $addFields: {
viewsCount: {
$let: {
vars: { viewsCountObj: { $arrayElemAt: [posts, { $indexOfArray: [ postsIds, '$_id' ] } ] } },
in: "$$viewsCountObj.viewsCount"
}
} }
}
])
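For reference, this is what the $indexOfArray/$arrayElemAt pair computes, sketched in plain Python (postsIds is the list of _id values, as in the question's final pipeline; names and sample data are illustrative):

```python
posts = [{"_id": "1", "viewsCount": 52}, {"_id": "2", "viewsCount": 7}]
posts_ids = [p["_id"] for p in posts]

def add_views_count(doc):
    # $indexOfArray finds the position of doc's _id in postsIds;
    # $arrayElemAt picks the matching posts element;
    # the $let body then extracts its viewsCount.
    idx = posts_ids.index(doc["_id"]) if doc["_id"] in posts_ids else -1
    return {**doc, "viewsCount": posts[idx]["viewsCount"] if idx != -1 else None}

doc = {"_id": "2", "title": "some title"}
result = add_views_count(doc)
```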
I have a MongoDB collection containing elements like this:
{
"name": "test",
"instances": [
{
"year": 2015
},
{
"year": 2016
},
]
}
How can I get the minimum and maximum value for year within the document named test? E.g. I want to aggregate all documents inside that array, but I can't find the syntax for that. Thanks in advance!
Both $min and $max take an array as a parameter, and in your case the path instances.year resolves to an array, so your query can look like this:
db.col.aggregate([
{
$match: { name: "test" }
},
{
$project: {
minYear: { $min: "$instances.year" },
maxYear: { $max: "$instances.year" }
}
}
])
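In plain Python, the projection computes the equivalent of this ($min/$max over the values at instances.year):

```python
# Sample document shaped like the one in the question.
doc = {"name": "test", "instances": [{"year": 2015}, {"year": 2016}]}

# "$instances.year" resolves to the list of year values;
# $min/$max then reduce that list.
years = [i["year"] for i in doc["instances"]]
result = {"minYear": min(years), "maxYear": max(years)}
```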
You can use the aggregation below, which returns the whole array elements holding the min and max year rather than just the year values:
db.collection.aggregate([
{ "$project": {
"maxYear": {
"$arrayElemAt": [
"$instances",
{
"$indexOfArray": [
"$instances.year",
{ "$max": "$instances.year" }
]
}
]
},
"minYear": {
"$arrayElemAt": [
"$instances",
{
"$indexOfArray": [
"$instances.year",
{ "$min": "$instances.year" }
]
}
]
}
}}
])