$elemMatch doesn't work on nested documents in MongoDB - mongodb

I have a very strange problem with using $elemMatch in MongoDB. I added multiple documents to a collection. Some of these documents were added using the import feature in MongoDB Compass (Add Data -> Import File -> JSON) and some of them were added using insertMany().
Here is an example structure of a single document:
{
  "id": "1234567890",
  "date": "YYYY-MM-DD",
  "contents": {
    "0": {
      "content": {
        "id": "1111111111",
        "name": "Name 1"
      }
    },
    "1": {
      "content": {
        "id": "2222222222",
        "name": "Name 2"
      }
    },
    "2": {
      "content": {
        "id": "3333333333",
        "name": "Name 3"
      }
    }
  }
}
The thing is, when I run a find query with the following filter:
{
  date: "<some_date_here>",
  "contents": {
    $elemMatch: {
      "content.id": <some_id_here>
    }
  }
}
ONLY documents that were imported via MongoDB Compass show up. Documents that were added via mongosh or the Node.js driver (it doesn't matter which) do NOT show up.
Am I missing something obvious here? What should I do to make all documents in the collection that match the filter show up?
Simple filters that do not include $elemMatch work fine and all matching documents show up. The problem seems to be with $elemMatch.
I tried adding the same batch of documents using different methods, but only importing a JSON file directly in MongoDB Compass makes them appear with the filter mentioned above.
Thank you for your help!

$elemMatch is for matching arrays, and in this case you don't have an array.
First you should convert the contents object to an array, then filter that array for the id you are looking for with $filter, and finally $match on the size of the filtered array to find all documents that contain the specific data:
db.collection.aggregate([
  {
    "$addFields": {
      "newField": {
        "$objectToArray": "$contents"
      }
    }
  },
  {
    "$addFields": {
      "newField": {
        "$filter": {
          "input": "$newField",
          "as": "z",
          "cond": {
            "$eq": [
              "$$z.v.content.id",
              "1111111111"
            ]
          }
        }
      }
    }
  },
  {
    "$addFields": {
      "newField": {
        "$size": "$newField"
      }
    }
  },
  {
    "$match": {
      "$and": [
        { "newField": { "$gt": 0 } },
        { "date": { "$gt": "<some_date_here>" } }
      ]
    }
  },
  {
    "$project": {
      "contents": 1,
      "date": 1,
      "id": 1
    }
  }
])
https://mongoplayground.net/p/pue4QPp1dYR
In the mongoplayground example I did not add the date filter.
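To double-check why only the Compass-imported documents match, it can help to look at the BSON type that contents actually has in each document; presumably the imported documents ended up with contents stored as a real array, while the ones inserted via mongosh or the Node.js driver keep the embedded object shown in the question. A minimal sketch (the collection name is a placeholder):
// documents whose "contents" is a real array – the only ones $elemMatch can ever match
db.collection.find({ contents: { $type: "array" } })
// documents whose "contents" is an embedded object with keys "0", "1", ...
db.collection.find({ contents: { $type: "object" } })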

Related

MongoDB Rust Driver weird behavior

There is this weird thing:
I installed MongoDB Compass and made an aggregation query that works in the Aggregation tab, but when I use the same query in my Rust web server it behaves very weirdly.
Original message:
{"_id":{"$oid":"61efd41c56ffe6b1b4a15c7a"},"time":{"$date":"2022-01-25T10:42:36.175Z"},"edited_time":{"$date":"2022-01-30T14:29:54.361Z"},"changes":[],"content":"LORA","author":{"$oid":"61df3cab3087579f8767a38d"}}
Message in MongoDB compass after the query:
{
"_id": {
"$oid": "61efd41c56ffe6b1b4a15c7a"
},
"time": {
"$date": "2022-01-25T10:42:36.175Z"
},
"edited_time": {
"$date": "2021-12-17T09:55:45.856Z"
},
"changes": [{
"time": {
"$date": "2021-12-17T09:55:45.856Z"
},
"change": {
"ChangedContent": "LORA"
}
}],
"content": "LMAO",
"author": {
"$oid": "61df3cab3087579f8767a38d"
}
}
Message after the Web Servers query:
{
"_id": {
"$oid": "61efd41c56ffe6b1b4a15c7a"
},
"time": {
"$date": "2022-01-25T10:42:36.175Z"
},
"edited_time": {
"$date": "2022-01-30T14:40:57.152Z"
},
"changes": {
"$concatArrays": ["$changes", [{
"time": {
"$date": "2022-01-30T14:40:57.152Z"
},
"change": {
"ChangedContent": "$content"
}
}]]
},
"content": "LMAO",
"author": {
"$oid": "61df3cab3087579f8767a38d"
}
}
Pure query in MongoDB Compass:
$set stage
{
"changes": { $concatArrays: [ "$changes", [ { "time": ISODate('2021-12-17T09:55:45.856+00:00'), "change": { "ChangedContent": "$content" } } ] ] },
"edited_time": ISODate('2021-12-17T09:55:45.856+00:00'),
"content": "LMAO",
}
Pure query in Web Server:
let update_doc = doc! {
"$set": {
"changes": {
"$concatArrays": [
"$changes", [
{
"time": now,
"change": {
"ChangedContent": "$content"
}
}
]
]
},
"edited_time": now,
"content": content
}
};
I am using the update_one method,
like this:
messages.update_one(message_filter, update_doc, None).await?;
I don't really understand it, and this happens often; sometimes it fixes itself when I randomly add some scope in the doc, e.g. { }, but this time I couldn't figure it out.
I also had a version of the query with $push, but that didn't work either.
Is there some fault in the Rust driver or am I doing something wrong? Are there some rules about formatting when using the Rust driver that I am missing?
The $set aggregation pipeline stage is different from the $set update operator. The difference that matters here is that the pipeline stage evaluates $concatArrays while the update operator does not.
$set Aggregation Pipeline Stage
$set appends new fields to existing documents. You can include one or more $set stages in an aggregation operation.
To add field or fields to embedded documents (including documents in arrays) use the dot notation.
To add an element to an existing array field with $set, use with $concatArrays.
$set Update Operator
Starting in MongoDB 5.0, update operators process document fields with
string-based names in lexicographic order. Fields with numeric names
are processed in numeric order.
If the field does not exist, $set will add a new field with the
specified value, provided that the new field does not violate a type
constraint. If you specify a dotted path for a non-existent field,
$set will create the embedded documents as needed to fulfill the
dotted path to the field.
If you specify multiple field-value pairs, $set will update or create
each field.
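To see the difference concretely in mongosh (a sketch only; someId and newChange are placeholders, not from the original post): an update document writes the $concatArrays expression into the document as a literal value, which is exactly what the stored message above shows, while an update pipeline (supported since MongoDB 4.2) evaluates it.
// update *document*: "$concatArrays" ends up stored literally in the "changes" field
db.messages.updateOne(
  { _id: someId },
  { $set: { changes: { $concatArrays: ["$changes", [newChange]] } } }
)
// update *pipeline* (note the surrounding [ ]): the expression is evaluated server-side
db.messages.updateOne(
  { _id: someId },
  [ { $set: { changes: { $concatArrays: ["$changes", [newChange]] } } } ]
)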
So if you want to update an existing document by inserting elements into an array field, use the $push update operator (potentially with $each if you're inserting multiple elements):
let update_doc = doc! {
"$set": {
"edited_time": now,
"content": content
},
"$push": {
"changes": {
"time": now,
"change": {
"ChangedContent": "$content"
}
}
}
};
Edit: I missed that $content was supposed to be mapped from the existing field as well. That is not supported by an update document; however, MongoDB supports using an aggregation pipeline to update the document (since 4.2). See: Update MongoDB field using value of another field. So you can use the original $set, just in a different way:
let update_pipeline = vec![
doc! {
"$set": {
"changes": {
"$concatArrays": [
"$changes", [
{
"time": now,
"change": {
"ChangedContent": "$content"
}
}
]
]
},
"edited_time": now,
"content": content
}
}
];
messages.update_one(message_filter, update_pipeline, None).await?;

Getting concrete elements by element field in mongoDB

I know that from the title it is not very clear what my problem is, so let me explain it with an example.
Let's suppose I have a collection in a mongo database called tweets whose elements look like this:
{
"id": "tweet_id",
"text": "this is the tweet's text",
"user": {
"id": "user_id",
"name": "user_name",
}
}
Let's suppose that we have 100 documents that look like that, and 10 users with different ids and names.
What would a query look like that shows the different user_ids that exist in the collection and their names?
The result I want would look like this:
{"user_id1": "user_name1"}
{"user_id2": "user_name2"}
...
{"user_id10": "user_name10"}
Thank you for your help
You can use this aggregation query:
First $group by user.id to get all the different user ids together with the name.
Then use $replaceRoot with $arrayToObject to get the desired output format.
db.collection.aggregate([
{
"$group": {
"_id": "$user.id",
"name": {
"$first": "$user.name"
}
}
},
{
"$replaceRoot": {
"newRoot": {
"$arrayToObject": [
[
{
"k": "$_id",
"v": "$name"
}
]
]
}
}
}
])
Example here

Mongodb get document that has max value for each subdocument

I have some data looking like this:
{'Type':'A',
'Attributes':[
{'Date':'2021-10-02', 'Value':5},
{'Date':'2021-09-30', 'Value':1},
{'Date':'2021-09-25', 'Value':13}
]
},
{'Type':'B',
'Attributes':[
{'Date':'2021-10-01', 'Value':36},
{'Date':'2021-09-15', 'Value':14},
{'Date':'2021-09-10', 'Value':18}
]
}
For each document, I would like to query the entry with the newest date. With the data above the desired result would be:
{'Type':'A', 'Date':'2021-10-02', 'Value':5}
{'Type':'B', 'Date':'2021-10-01', 'Value':36}
I managed to find some queries that return only the global max over all subdocuments, but I did not find the max for each document.
Thanks a lot for your help
Storing dates as strings is generally considered bad practice; I suggest that you change your date field to a date type. Fortunately, in your case you are using the ISO date format, so some effort can be saved.
You can do this in aggregation pipeline:
use $max to find out the max date
use $filter to filter the Attributes array to contain only the latest element
$unwind the array
$project to your expected output
Here is the Mongo playground for your reference.
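A rough sketch of that pipeline, using the field names from the sample documents above (the playground itself is not reproduced here):
db.collection.aggregate([
  // 1) compute the newest date in the Attributes array
  { "$addFields": { "maxDate": { "$max": "$Attributes.Date" } } },
  // 2) keep only the element(s) whose Date equals that max
  {
    "$addFields": {
      "Attributes": {
        "$filter": {
          "input": "$Attributes",
          "as": "a",
          "cond": { "$eq": ["$$a.Date", "$maxDate"] }
        }
      }
    }
  },
  // 3) unwind the now single-element array
  { "$unwind": "$Attributes" },
  // 4) project to the expected output shape
  { "$project": { "_id": 0, "Type": 1, "Date": "$Attributes.Date", "Value": "$Attributes.Value" } }
])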
This keeps only 1 member from Attributes, the one with the max date.
If you want to keep multiple ones, use the #ray solution that keeps all members that have the max date.
*mongoplayground can lose the order of fields in a document; if you see a wrong result, test it on your driver, it's a bug of the mongoplayground tool.
Query1 (local-way)
Test code here
aggregate([
{
"$project": {
"maxDateValue": {
"$max": {
"$map": {
"input": "$Attributes",
"in": { "Date": "$$this.Date", "Value": "$$this.Value" },
}
}
},
"Type": 1
}
},
{
"$project": {
"Date": "$maxDateValue.Date",
"Value": "$maxDateValue.Value"
}
}
])
Query2 (unwind-way)
Test code here
aggregate([
{
"$unwind": { "path": "$Attributes" }
},
{
"$group": {
"_id": "$Type",
"maxDate": {
"$max": {
"Date": "$Attributes.Date",
"Value": "$Attributes.Value"
}
}
}
},
{
"$project": {
"_id": 0,
"Type": "$_id",
"Date": "$maxDate.Date",
"Value": "$maxDate.Value"
}
}
])

Query free timeslots in MongoDB using unix timestamps

I'm new to MongoDB and probably this is a common use case, but I didn't find an answer to this specific case.
I would like to find those MongoDB documents where a given timeslot is still free, i.e. there is no overlap with existing bookings.
My collection entries:
{
"_id": "5efc9f89749c983ffd58c55f",
"training": "football",
"bookings": [
{
"from": 1593607419,
"to": 1593622800
}
]
},
{
"_id": "5efc9f89749c983ffd58c55f",
"training": "baseball",
"bookings": [
{
"from": 1593607419,
"to": 1593622800
},
{
"from": 1593687419,
"to": 1593722800
}
]
}
Expected result with input "From: 1593500000" and "To: 1593600000".
{
"_id": "5efc9f89749c983ffd58c55f",
"training": "football"
},
{
"_id": "5efc9f89749c983ffd58c55f",
"training": "baseball"
}
Both documents have free timeslots (no overlapping with existing bookings).
Expected result with input "From: 1593670000" and "To: 1593690000".
{
"_id": "5efc9f89749c983ffd58c55f",
"training": "football"
}
Only football is returned, because the queried values overlap with the second booking entry of baseball.
Is it possible to do this with a single query? For example using something like a between operator?
Or is there a better approach/best practice?
I am implementing the query with Spring Data REST, but some help with just the MongoDB query would already be very helpful.
This is more of a logical problem. I'm not familiar with Spring Data, but your MongoDB query should look like this:
{
  bookings: {
    $not: {
      $elemMatch: {
        $or: [
          { from: { $gte: start, $lte: end } },
          { to: { $gte: start, $lte: end } }
        ]
      }
    }
  }
}
Mongo Playground
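For example, plugging in the second input from the question (a sketch; the collection name is a placeholder), the filter should return only the football document, because one of baseball's bookings starts inside the requested slot:
db.collection.find(
  {
    bookings: {
      $not: {
        $elemMatch: {
          $or: [
            { from: { $gte: 1593670000, $lte: 1593690000 } },
            { to: { $gte: 1593670000, $lte: 1593690000 } }
          ]
        }
      }
    }
  },
  { training: 1 }
)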

MongoDb - How to only return field of nested subdocument when using lookup aggregation?

I'm very new to MongoDB, so I'm used to SQL.
Right now I have two collections in my database:
1) Series (which has nested subdocuments)
2) Review (I decided to reference the episode subdocument because there will be a lot of reviews)
See this picture for a better understanding.
Now I want to achieve the following. For every review (two in this case), I want to get the episode name.
I tried the following:
db.review.aggregate([
{
$lookup:{
from:"series",
localField:"episode",
foreignField:"seasons.episodes._id",
as:"episode_entry"
}
}
]).pretty()
The problem is that this returns (of course) not only the title of the referenced episode, but the whole season document.
See the picture below for my current output.
I don't know how to achieve it. Please help me.
I'm using Mongo 3.4.9
I would recommend the following series structure, which unwinds the seasons array into multiple documents, one for each season.
This will help you with inserting/updating the episodes directly.
Something like
db.series.insertMany([
{
"title": "Sherlock Holmes",
"nr": 1,
"episodes": [
{
"title": "A Study in Pink",
"nr": 1
},
{
"title": "The Blind Banker",
"nr": 2
}
]
},
{
"title": "Sherlock Holmes",
"nr": 2,
"episodes": [
{
"title": "A Scandal in Belgravia",
"nr": 1
},
{
"title": "The Hounds of Baskerville",
"nr": 2
}
]
}
])
The lookup query will do something like this
episode: { $in: [ episodes._id1, episodes._id2, ... ] }
From the docs
If the field holds an array, then the $in operator selects the documents whose field holds an array that contains at least one element that matches a value in the specified array (e.g. <value1>, <value2>, etc.).
So lookup will return all episodes when there is a match. You can then filter to keep only the one matching your review's episode.
So the query will look like
db.review.aggregate([
  {
    "$lookup": {
      "from": "series",
      "localField": "episode",
      "foreignField": "episodes._id",
      "as": "episode_entry"
    }
  },
  {
    "$addFields": {
      "episode_entry": [
        {
          // take the first matched series document, then keep only the episode
          // whose _id equals the review's episode reference
          "$arrayElemAt": [
            {
              "$filter": {
                "input": {
                  "$let": {
                    "vars": {
                      "season": { "$arrayElemAt": ["$episode_entry", 0] }
                    },
                    "in": "$$season.episodes"
                  }
                },
                "as": "result",
                "cond": { "$eq": ["$$result._id", "$episode"] }
              }
            },
            0
          ]
        }
      ]
    }
  }
])