I have the following documents in my collection:
[
{"date_time": "2022-11-05 09:09:55", "dat1": "TRUI", "cod": "XC"}
{"date_time": "2022-11-21 09:09:55", "dat1": "TRQW", "cod": "KL"}
{"date_time": "2022-12-06 09:09:55", "dat1": "CBTR", "cod": "NM"}
{"date_time": "2022-12-18 09:09:55", "dat1": "METR", "cod": "XC"}
]
So I'd like to query my collection to get all documents where "cod" is "XC" and "date_time" is between 2022-11-01 and 2022-12-31. The result would be:
[
{"date_time": "2022-12-18 09:09:55", "dat1": "METR", "cod": "XC"}
{"date_time": "2022-11-05 09:09:55", "dat1": "TRUI", "cod": "XC"}
]
How can I achieve the result?
As the date_time field is a String, you need to convert it to a Date via the $dateFromString operator. Since that is an aggregation operator, you need to wrap the comparison in $expr.
db.collection.find({
$expr: {
$and: [
{
$eq: [
"$cod",
"XC"
]
},
{
$and: [
{
$gte: [
{
$dateFromString: {
dateString: "$date_time",
format: "%Y-%m-%d %H:%M:%S"
}
},
ISODate("2022-11-01T00:00:00Z")
]
},
{
$lt: [
{
$dateFromString: {
dateString: "$date_time",
format: "%Y-%m-%d %H:%M:%S"
}
},
ISODate("2023-01-01T00:00:00Z")
]
}
]
}
]
}
})
Demo ($dateFromString) # Mongo Playground
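Note that because the stored "%Y-%m-%d %H:%M:%S" strings are zero-padded, they sort lexicographically in chronological order, so a hedged alternative sketch can skip the conversion entirely (and remain able to use an index on date_time):
// Hedged alternative sketch: zero-padded "%Y-%m-%d %H:%M:%S" strings sort
// lexicographically in chronological order, so a plain string range works
// without any conversion and can use an index on date_time.
db.collection.find({
  cod: "XC",
  date_time: { $gte: "2022-11-01 00:00:00", $lt: "2023-01-01 00:00:00" }
})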
Since this field stores a date, I would suggest storing the value as a Date type. This simplifies and optimizes your query, as no data conversion is needed.
db.collection.find({
cod: "XC",
date_time: {
$gte: ISODate("2022-11-01T00:00:00Z"),
$lt: ISODate("2023-01-01T00:00:00Z")
}
})
Demo # Mongo Playground
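If you decide to migrate the existing string values in place, a minimal sketch (assuming MongoDB 4.2+, which supports updates with an aggregation pipeline; back up the collection first) could look like this:
// Hedged migration sketch: rewrite each string date_time as a BSON Date in place.
// Assumes MongoDB 4.2+ (update with an aggregation pipeline); run once, after a backup.
db.collection.updateMany(
  { date_time: { $type: "string" } },
  [
    { $set: {
        date_time: {
          $dateFromString: { dateString: "$date_time", format: "%Y-%m-%d %H:%M:%S" }
        }
    } }
  ]
)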
Related
I have a MongoDB model that is currently like this (this is the stripped version):
{
title: String,
type: {
type: String,
lowercase: true,
enum: ['event', 'regular', 'project'],
},
project_start_time: Date,
project_end_time: Date,
regular_start_date: Date,
regular_end_date: Date,
events: [{
id: Number,
date: Date
}]
}
Now, I want to query something like this:
Find documents where regular_end_date, project_end_time, and the date of the last element of events are earlier than the provided date
The catch is that not every document has all three fields above; which ones exist depends on the type (sorry for the messy data, it was already like that). Below is an example:
If the data type is an event, then there are events
If the data type is regular, then there are regular_start_date and regular_end_date
If the data type is a project, then there are project_start_time and project_end_time
So far, I've tried to use this:
db.data.find({
"$or": [
{
"project_end_time": {
"$lt": ISODate("2022-12-27T10:09:49.753Z")
},
},
{
"regular_end_date": {
"$lt": ISODate("2022-12-27T10:09:49.753Z")
}
},
{
"$expr": {
"$lt": [
{
"$getField": {
"field": "date",
"input": {
"$last": "$events"
}
}
},
ISODate("2022-12-27T10:09:49.753Z")
]
}
}
]
})
I also tried with an aggregation pipeline:
db.data.aggregate([
{
$match: {
"$or": [{
"project_end_time": {
"$lt": ISODate("2022-12-27T10:09:49.753Z")
},
},
{
"regular_end_date": {
"$lt": ISODate("2022-12-27T10:09:49.753Z")
}
},
{
"$expr": {
"$lt": [{
"$getField": {
"field": "date",
"input": {
"$last": "$events"
}
}
},
ISODate("2022-12-27T10:09:49.753Z")
]}
}]
}
}
])
But it returns all documents, as if nothing was filtered according to the criteria. Any idea where I went wrong?
FYI I am using MongoDB 5.0.2
One option is to check whether the relevant field exists before checking its value; otherwise it evaluates to null, which compares as less than your requested date:
db.collection.find({
$or: [
{$and: [
{project_end_time: {$exists: true}},
{project_end_time: {$lt: ISODate("2022-12-27T10:09:49.753Z")}}
]},
{$and: [
{regular_end_date: {$exists: true}},
{regular_end_date: {$lt: ISODate("2022-12-27T10:09:49.753Z")}}
]},
{$and: [
{"events.0": {$exists: true}},
{$expr: {
$lt: [
{$last: "$events.date"},
ISODate("2022-12-27T10:09:49.753Z")
]
}}
]}
]
})
See how it works on the playground example
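A slightly more compact variant (a hedged sketch, not from the answer above) guards only the $expr branch with $ifNull, falling back to the cutoff date itself so that documents without an events array can never satisfy the $lt:
// Hedged alternative sketch: only the $expr branch needs a guard, because the
// plain $lt query operators already skip documents where the field is missing.
// $ifNull falls back to the cutoff itself, so "no events" never satisfies $lt.
db.data.find({
  $or: [
    { project_end_time: { $lt: ISODate("2022-12-27T10:09:49.753Z") } },
    { regular_end_date: { $lt: ISODate("2022-12-27T10:09:49.753Z") } },
    { $expr: {
        $lt: [
          { $ifNull: [ { $last: "$events.date" }, ISODate("2022-12-27T10:09:49.753Z") ] },
          ISODate("2022-12-27T10:09:49.753Z")
        ]
    } }
  ]
})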
I want to filter classes that are still ongoing using $match with moment.js. But $getField doesn't work as expected.
$match: {
$expr: {
$lt: [
moment(new Date()).valueOf(),
parseInt(moment({ $getField: "classStartDate" }, 'x').add({ $getField: "numberOfWeeks"}, 'weeks').valueOf())
]
}
}
Why so complicated? Simply try this:
$match: {
$expr: {
$lt: [
"$$NOW",
{$add: ["$classStartDate", {$multiply: ["$numberOfWeeks", 1000*60*60*24*7 ]} ] }
]
}
}
As you are apparently using MongoDB 5.0 or later ($getField requires it), you can also use:
$match: {
$expr: {
$lt: [
"$$NOW",
{ $dateAdd: { startDate: "$classStartDate", unit: "week", amount: "$numberOfWeeks" } }
]
}
}
If $$NOW does not work, use moment().toDate() instead.
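A hedged usage sketch of how this could be wired up from application code (Node.js driver style; the classes collection name and findOngoingClasses helper are illustrative only, and moment is used purely as the fallback for $$NOW):
// Hedged sketch: db is assumed to be a connected MongoDB Db instance.
const moment = require("moment");

async function findOngoingClasses(db) {
  const cutoff = moment().toDate(); // current time as a JS Date, in place of $$NOW
  return db.collection("classes").aggregate([
    { $match: {
        $expr: {
          $lt: [
            cutoff,
            { $dateAdd: { startDate: "$classStartDate", unit: "week", amount: "$numberOfWeeks" } }
          ]
        }
    } }
  ]).toArray();
}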
I've searched for an answer on how to solve my problem but found nothing, so I am very sorry if I am repeating a question that has been asked before.
I'm trying to find results for a specific userId by month and year.
The dates are stored in the DB in this format: yyyy-mm-dd.
I'm trying the following query with Mongo v3.6.8:
db.collection.find({
$and: [
{
$expr: {
$eq: [
{
$month: {
$dateFromString: {
dateString: "$Ntry.BookgDt",
format: "%Y-%m-%d"
}
}
},
12
]
}
},
{
$expr: {
$eq: [
{
$year: {
$dateFromString: {
dateString: "$Ntry.BookgDt",
format: "%Y-%m-%d"
}
}
},
2020
]
}
},
{
$expr: {
$eq: [
"id",
"5fab9a66c493dc4a3c49a7a3"
]
}
}
]
})
Sample data:
[
{
"userid": "5fab9a66c493dc4a3c49a7a3",
"name": "user name",
"acc": "admin",
"Blas": "00.00",
"Ntry": [
{
"Amt": "11.72",
"BookgDt": "2020-08-16",
},
{
"Amt": "16.72",
"BookgDt": "2020-06-23",
}
]
},
{
"userid": "5fab9a77c493dc4a3c49a7a3",
"name": "user name",
"acc": "user",
"Blas": "00.00",
"Ntry": [
{
"Amt": "11.72",
"BookgDt": "2020-08-23",
},
{
"Amt": "16.72",
"BookgDt": "2020-07-23",
}
]
}
]
So my query is to find all Ntry for userid 5fab9a66c493dc4a3c49a7a3 in month 8 and year 2020, but I got this error:
query failed: (ConversionFailure) $dateFromString requires that 'dateString' be a string, found: array with value ["2020-08-16", "2020-06-23"]
Can you please help me find a suitable query? Thank you in advance.
Here is also a Mongo Playground link; it's the best for quick editing:
This pipeline should work:
db.foo.aggregate([
// Top level match
{$match: {userid: "5fab9a66c493dc4a3c49a7a3"}}
// Next, only keep entries in the Ntry array with month 8 and year 2020.
// addFields: {"Ntry": {$filter: {input: "$Ntry"}}} means overwrite the
// original Ntry array.
,{$addFields: {"Ntry": {$filter: {
input: "$Ntry",
as: "zz",
cond: { $and: [
{$eq: [{$month: {$dateFromString: {dateString: "$$zz.BookgDt", format: "%Y-%m-%d"}}}, 8]},
{$eq: [{$year: {$dateFromString: {dateString: "$$zz.BookgDt", format: "%Y-%m-%d"}}}, 2020]}
]}
}}
}}
// It is possible everything got filtered out of the Ntry array, leaving
// an empty (size 0) array. We likely do not want that, so further
// cut down the output material. You can comment this out to see what
// changes, especially if you change the month and year targets above.
,{$match: {$expr: {$ne: [ {$size: "$Ntry"}, 0] } }}
]);
It's probably simpler to call $dateFromString twice, but if you are feeling adventurous, you can use $let inside the cond to convert each date just once:
db.foo.aggregate([
{$match: {userid: "5fab9a66c493dc4a3c49a7a3"}}
,{$addFields: {"Ntry": {$filter: {
input: "$Ntry",
as: "zz",
cond: {
$let: {
vars: {dd: {$dateFromString:{dateString:"$$zz.BookgDt",format: "%Y-%m-%d"}}},
in: {
$and: [
{$eq: [{$month: "$$dd"}, 8] },
{$eq: [{$year: "$$dd"}, 2020] }
]
}
}
}
}}
}}
,{$match: {$expr: {$ne: [ {$size: "$Ntry"}, 0] } }}
]);
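Since BookgDt is stored as a plain, zero-padded "yyyy-mm-dd" string, a lighter-weight variant (a hedged sketch, not part of the answer above) can skip the date conversion entirely and filter on the string prefix:
// Hedged alternative sketch: because BookgDt is a zero-padded "yyyy-mm-dd"
// string, a prefix comparison selects month 8 of 2020 with no date conversion.
db.foo.aggregate([
  { $match: { userid: "5fab9a66c493dc4a3c49a7a3" } },
  { $addFields: { Ntry: { $filter: {
      input: "$Ntry",
      as: "e",
      cond: { $eq: [ { $substrBytes: [ "$$e.BookgDt", 0, 7 ] }, "2020-08" ] }
  } } } },
  { $match: { $expr: { $gt: [ { $size: "$Ntry" }, 0 ] } } }
])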
I'm querying through Metabase, which is connected to a MongoDB server. The field I'm querying is nested and is a Unix timestamp (in milliseconds). See below:
{
room_data: {
"meta": {
"xxx_unrecognized": null,
"xxx_sizecache": 0,
"id": "Hke7owir4oejq3bMf",
"createdat": 1565336450838,
"updatedat": 1565336651548,
}
}
}
The query I have written is as follows
[
{
$match: {
client_id: "{{client_id}}",
"room_data.meta.createdat": {
$gt: "{{start}}",
$lt: "{{end}}",
}
}
},
{
$group: {
id: "$room_data.recipe.id",
count: {
$sum: 1
}
}
}
]
I do not get any results, because the field room_data.meta.createdat is not a date like the one (e.g. Aug 20, 2020) I'm passing in. Here start and end are parameters (a Metabase feature) that I'm passing in Date format. I need some help converting those dates into Unix timestamps that can then be used to filter the results between the specific dates.
If you're using MongoDB 4.0+ you can use $toDate in your aggregation like so:
db.collection.aggregate([
{
$match: {
$expr: {
$and: [
{
$eq: [
"$client_id",
{{client_id}}
]
},
{
$lt: [
{
$toDate: "$room_data.meta.createdat"
},
{{end}}
]
},
{
$gt: [
{
$toDate: "$room_data.meta.createdat"
},
{{start}}
]
}
]
}
}
}
])
MongoPlayground
If you're on an older MongoDB version, I recommend you either convert your database fields to the Date type, or convert your input into a number timestamp somehow (I'm unfamiliar with Metabase).
The last option is to use $subtract, since you can subtract a number from a date in MongoDB, and then check whether the resulting date is before or after 1970-01-01T00:00:00Z. The problem with this approach is that it does not consider timezones, so if your input's timezone is different from your database's, or is dynamic, you'll have to account for that.
db.collection.aggregate([
{
$match: {
$expr: {
$and: [
{
$eq: [
"$client_id",
{{client_id}}
]
},
{
$gt: [
{
"$subtract": [
{{end}},
"$room_data.meta.createdat"
]
},
ISODate("1970-01-01T00:00:00.000Z")
]
},
{
$lt: [
{
"$subtract": [
{{start}},
"$room_data.meta.createdat"
]
},
ISODate("1970-01-01T00:00:00.000Z")
]
}
]
}
}
}
])
MongoPlayground
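Another hedged variant (also MongoDB 4.0+, so it does not help on older servers): instead of converting the stored number to a date, convert the Date parameters to milliseconds with $toLong and compare numbers directly:
// Hedged sketch: compare the stored millisecond timestamp against the
// Metabase Date parameters converted to milliseconds via $toLong (MongoDB 4.0+).
db.collection.aggregate([
  {
    $match: {
      $expr: {
        $and: [
          { $eq: [ "$client_id", {{client_id}} ] },
          { $gt: [ "$room_data.meta.createdat", { $toLong: {{start}} } ] },
          { $lt: [ "$room_data.meta.createdat", { $toLong: {{end}} } ] }
        ]
      }
    }
  }
])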
I have the following situation:
Consider a collection with the following documents:
[
{
'_id': ObjectId('somehting'),
'date': null
},
{
'_id': ObjectId('somehting'),
},
{
'_id': ObjectId('somehting'),
'date': '2015-01-01 12:12:12'
},
many others
]
Now I have the following query that finds documents with date between two values: db.getCollection('validation_archive').find({'date': {$lte: '[date_here]', $gte: '[date_here]'}});
All works fine, except for documents where date is null or nonexistent.
Is there any way I can tell MongoDB to treat null as '0000-00-00 00:00:00'?
Edit: I need this so that, if the date sent in $gt is 0000-00-00 00:00:00, the query returns those documents in the result.
In a general query, no. You can always exclude them from the results, as in:
db.getCollection('validation_archive').find({
"date": { "$lte": date_to, "$gte" date_from, "$ne": null }
})
Or you can be "inclusive" with the "zero" or "epoch" date you suggest, using .aggregate():
db.getCollection('validation_archive').aggregate([
{ "$redact": {
"$cond": {
"if": {
"$and": [
{ "$gte": [ date_from, { "$ifNull": [ "$date", new Date(0) ] } ] },
{ "$lte": [ date_to, { "$ifNull": [ "$date", new Date(0) ] } ] }
]
},
"then": "$$KEEP",
"else": "$$PRUNE"
}
}}
])
But in the context of what you are asking, we would have to ask: "What is the point?"
Or even if you must:
db.getCollection('validation_archive').aggregate([
{ "$project": {
"date": { "$ifNull": [ "$date", new Date(0) ] }
}},
{ "$match": {
"$or": [
{ "date": { "$lte": date_to, "$gte" date_from } },
{ "date": { "$eq": Date(0) } }
]
}}
])
And that is completely inclusive in results.
But then again why not just do:
db.getCollection('validation_archive').find({
"$or": [
{ "date": { "$lte": date_to, "$gte" date_from },
{ "date": null },
{ "date": { "$exists": false } }
]
})
Which is a lot more efficient.
So it is possible to "project" a date where one is not present, but it mostly makes sense to simply use the basic query operators instead.
Try to set the value of date to 0 at insert time; that will make the query straightforward.
If the collection already exists, you can backfill the date field using update queries.
Alternatively, use MongoDB's projection to add a new field, say newDate, whose value is either 0 or the actual date, and filter on newDate after that, as sketched below.
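A hedged sketch of that last suggestion (the newDate name and the sample upper bound are illustrative only; assumes MongoDB 3.4+ for $addFields):
// Hedged sketch: materialize a fallback for null/missing dates, then filter on it.
// The sentinel '0000-00-00 00:00:00' sorts before any real 'YYYY-MM-DD HH:MM:SS' string.
db.getCollection('validation_archive').aggregate([
  { $addFields: { newDate: { $ifNull: [ "$date", "0000-00-00 00:00:00" ] } } },
  { $match: { newDate: { $gte: "0000-00-00 00:00:00", $lte: "2015-12-31 23:59:59" } } }
])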
If you want to exclude results where date is null or nonexistent, then:
db.getCollection('validation_archive').find({
date: {
$ne: null,
$lt: '[date_here]',
$gt: '[date_here]'
}
});