How to compare two arrays using $elemMatch in MongoDB?

I need to compare two arrays to find whether all elements in the first array matches the second one.
First array:
var tasktime = [2,3,4];
Second array:
'working_days': [
    {
        'slots': [ 8, 9, 14, 15 ]
    }
];
I need to check whether all the elements in the "tasktime" array exist in the "slots" array.
Below is the query I have tried, but it does not return the expected results.
var defaultCondition = [
    {
        query: {
            "working_days": { $elemMatch: { slots: { $setIntersection: [ 'slots', tasktime ] } } }
        }
    }
];
db.GetAggregation('tasker', defaultCondition, function (err, taskers) {
    if (err || !taskers[0]) {
        res.send({ count: 0, result: [] });
    } else {
        callback(err, taskers);
    }
});
Need someone's valuable help on this.

You need to use $all to find all the values of the tasktime array.
You can simply do:
db.tasker.find({ "working_days.slots": { $all: tasktime } });
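For reference, here is a minimal sketch of how $all behaves, using hypothetical sample documents in the mongo shell (the names "t1" and "t2" are illustrative, not from the question): a document matches only when every element of the query array appears among its slots values.
// Hypothetical sample data, only to illustrate $all
db.tasker.insertMany([
    { name: "t1", working_days: [ { slots: [ 2, 3, 4, 8, 9 ] } ] },  // contains 2, 3 and 4
    { name: "t2", working_days: [ { slots: [ 8, 9, 14, 15 ] } ] }    // missing 2, 3 and 4
]);
var tasktime = [2, 3, 4];
// Returns only "t1", because every element of tasktime appears in its slots
db.tasker.find({ "working_days.slots": { $all: tasktime } });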

Related

Return an array element of an aggregation in an MongoDB Atlas (4.2) trigger function

So I am currently testing returning an array element of an aggregation in a MongoDB Atlas (4.2) trigger function:
exports = function(changeEvent) {
    const collection = context.services.get(<clusterName>).db(<dbName>).collection(<collectionName>);
    var aggArr = collection.aggregate([
        {
            $match: { "docType": "record" }
        },
        ..,
        {
            $group: {
                "_id": null,
                "avgPrice": {
                    $avg: "$myAvgPriceFd"
                }
            }
        }
    ]);
    return aggArr;
};
Which outputs:
> result:
[
{
"_id": null,
"avgPrice": {
"$numberDouble": "18.08770081782988165"
}
}
]
> result (JavaScript):
EJSON.parse('[{"_id":null,"avgPrice":{"$numberDouble":"18.08770081782988165"}}]')
As you can see, this is returned as one object in an array (I then intend to use the avgPrice value to update a field in a document in the same collection). I have tried to extract the object from the array with aggArr[0] or aggArr(0), both resulting in:
> result:
{
"$undefined": true
}
> result (JavaScript):
EJSON.parse('{"$undefined":true}')
or by using aggArr[0].avgPrice as per this solution which fails with:
> error:
TypeError: Cannot access member 'avgPrice' of undefined
> trace:
TypeError: Cannot access member 'avgPrice' of undefined
at exports (function.js:81:10(163))
at function_wrapper.js:5:30(18)
at <eval>:13:8(6)
at <eval>:2:15(6)
Any pointers are most welcome because this one has me stumped for now!
I had the same problem and figured it out. You have to append the .toArray() function to the aggregation call, so where you have collection.aggregate(pipeline_steps) you instead call:
collection.aggregate(pipeline_steps).toArray()
Here's an example:
const user_collection = context.services
    .get("mongodb-atlas")
    .db("Development")
    .collection("users");
const search_params = [
    {
        "$search": {
            "index": 'search_users',
            "text": {
                "query": value,
                "path": [
                    "email", "first_name", "last_name"
                ],
                "fuzzy": {
                    "prefixLength": 1,
                    "maxEdits": 2
                }
            }
        }
    }
];
const search_results = await user_collection.aggregate(search_params).toArray();
const results = search_results
return results[0]
Here's the documentation showing how to convert the aggregation to an array.
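Applied to the original trigger from the question, a minimal sketch could look like the following (assuming the same pipeline and placeholder names, and that the trigger function may be declared async; the stages elided in the question are marked with a comment):
exports = async function(changeEvent) {
    const collection = context.services.get(<clusterName>).db(<dbName>).collection(<collectionName>);
    // .toArray() resolves the aggregation cursor into a plain array of documents
    const aggArr = await collection.aggregate([
        { $match: { "docType": "record" } },
        // ...the other stages from the question go here...
        { $group: { "_id": null, "avgPrice": { $avg: "$myAvgPriceFd" } } }
    ]).toArray();
    // aggArr[0] is now a real document, so avgPrice is accessible
    return aggArr[0].avgPrice;
};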

Modify an element of an array inside an object in MongoDB

I have some documents like this:
doc = {
"tag" : "tag1",
"field" : {
"zone" :"zone1",
"arr" : [
{ vals: [-12.3,-1,0], timestamp: ""},
{ vals: [-30.40,-23.2,0], timestamp: "" }
]
}
}
I want to modify one of the elements of the array (for example, the first element, the one with index 0) of one of such documents.
I want to end it up looking like:
doc = {
"tag" : "tag1",
"field" : {
"zone" :"zone1",
"arr" : [
{ vals: [-1, -1, -1], timestamp: "the_new_timestamp"}, // this one was modified
{ vals: [-30.40, -23.2, 0], timestamp: "" }
]
}
}
I know something about find_and_modify:
db.mycollection.find_and_modify(
query = query, // you find the document of interest with this
fields = { }, // you can focus on one of the fields of your document with this
update = { "$set": data }, // you update your data with this
)
The questions that feel closer to what I want are these:
https://stackoverflow.com/a/28829203/1253729
https://stackoverflow.com/a/23554454/1253729
I've been going through them but I'm getting stuck when trying to work out the solution for my case. I don't know if I should really use the fields parameter. I don't know how to use $set correctly for my case.
I hope you could help me.
https://docs.mongodb.com/manual/reference/operator/update/positional/
This will help you!
db.mycollection.updateOne(
    query,  // the filter that selects the document of interest
    { $set: { "field.arr.0.timestamp": "the_new_timestamp" } }  // "field.arr.0.vals" can be set the same way
)
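If you would rather target the element by a condition than by its index, the positional $ operator from the linked documentation can be used; this is a sketch using the question's field names (matching on the element's current vals is an assumption):
// Match the array element whose vals is [-12.3, -1, 0], then update that element via $
db.mycollection.updateOne(
    { "tag": "tag1", "field.arr.vals": [-12.3, -1, 0] },
    { $set: {
        "field.arr.$.vals": [-1, -1, -1],
        "field.arr.$.timestamp": "the_new_timestamp"
    } }
)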

List of all sub documents in array

Strangely, I couldn't find the answer to this very simple question, and I can't find a way to do it by myself with the docs.
This is an example schema
{
Test : [
{
foo:0,fighter:[]
},
{
foo:1,fighter:[]
},
{
food:2,fighter:[]
}
]
}
I want to be able to retrieve all the documents in Test
I've found that to retrieve the content of the first fighter you can just do:
Collection.find({_id: 0 , 'Test.foo': 0})
But what about getting the whole Test array? Or the whole fighter array?
{ foo: 0, fighter: [ "john", "fitz", "gerald" ] }
{ foo: 1, fighter: [] }
{ food: 2, fighter: [] }
or just the whole fighter ( foo 0 ) content
"john", "fitz", "gerald"
The only thing I've found would be Collection.findOne({_id: id}).Test, but it's not working; I'm getting an undefined "Test" method.
You will have to use projection for that. You can do something like this:
Collection.find({_id: 0}, { Test: { $elemMatch: { foo : 0 } } } )
This will give you the Test array, but with just one element, i.e. the first element that matches the criterion foo : 0.
To get only the Test array and not the whole document, use this:
Collection.find({_id: 0}, { Test: 1, _id: 0 } )
Also, you can find more about this here.
Hope this helps.
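If what you actually want is just the fighter array of the foo: 0 element (the "john", "fitz", "gerald" case above), one option is a small aggregation; this is a sketch assuming a server with the aggregation framework and the schema shown in the question:
Collection.aggregate([
    { $match: { _id: 0 } },
    { $unwind: "$Test" },                                // one document per Test element
    { $match: { "Test.foo": 0 } },                       // keep only the element with foo: 0
    { $project: { _id: 0, fighter: "$Test.fighter" } }   // return just its fighter array
])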

Merge changeset documents in a query

I have recorded changes from an information system in a mongo database. Every time a set of values is set or changed, a record is saved in the database.
The change collection is in the following form:
{ "user_id": 1, "timestamp": { "date": "2010-09-22 09:28:02", "timezone_type": 3, "timezone": "Europe/Paris" }, "changes": { "fieldA": "valueA", "fieldB": "valueB", "fieldC": "valueC" } }
{ "user_id": 1, "timestamp": { "date": "2010-09-24 19:01:52", "timezone_type": 3, "timezone": "Europe/Paris" }, "changes": { "fieldA": "new_valueA", "fieldB": null, "fieldD": "valueD" } }
{ "user_id": 1, "timestamp": { "date": "2010-10-01 11:11:02", "timezone_type": 3, "timezone": "Europe/Paris" }, "changes": { "fieldD": "new_valueD" } }
Of course there are thousands of records per user with different attributes, which adds up to millions of records. What I want to do is to see a user's status at a given time. For example, user_id 1 at 2010-09-30 would be:
fieldA: new_valueA
fieldC: valueC
fieldD: valueD
This means I need to flatten all the changes prior to a given date for a given user into a single record. Can I do that directly in Mongo?
Edit: I am using MongoDB 2.0, hence I cannot benefit from the aggregation framework.
Edit: It seems I have found the answer to my question.
var mapTimeAndChangesByUserId = function() {
    var key = this.user_id;
    var value = { timestamp: this.timestamp.date, changes: this.changes };
    emit(key, value);
}

var reduceMergeChanges = function(user_id, changeset) {
    var mergeFunction = function(a, b) { for (var attr in b) a[attr] = b[attr]; };
    var result = {};
    changeset.forEach(function(e) { mergeFunction(result, e.changes); });
    return { timestamp: changeset.pop().timestamp, changes: result };
}
The reduce function merges the changes in the order they come and returns the result.
db.user_change.mapReduce(
mapTimeAndChangesByUserId,
reduceMergeChanges,
{
out: { inline: 1 },
query: { user_id: 1, "timestamp.date": { $lt: "2010-09-30" } },
sort: { "timestamp.date": 1 }
});
'results' : [
    {
        "_id": 1,
        "value": {
            "timestamp": "2010-09-24 19:01:52",
            "changes": {
                "fieldA": "new_valueA",
                "fieldB": null,
                "fieldC": "valueC",
                "fieldD": "valueD"
            }
        }
    }
]
Which is fine to me.
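For completeness, a small sketch of how the flattened status can be read back from the inline output in the mongo shell (the variable names are illustrative):
var mr = db.user_change.mapReduce(
    mapTimeAndChangesByUserId,
    reduceMergeChanges,
    { out: { inline: 1 }, query: { user_id: 1, "timestamp.date": { $lt: "2010-09-30" } }, sort: { "timestamp.date": 1 } }
);
// With inline output, the merged documents are exposed on the "results" property
var userStatus = mr.results[0].value.changes;
printjson(userStatus);  // { fieldA: "new_valueA", fieldB: null, fieldC: "valueC", fieldD: "valueD" }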
You could write a map-reduce (MR) job to do this.
Since the fields are a lot like tags, you can adapt the nice cookbook example of counting tags here: http://cookbook.mongodb.org/patterns/count_tags/. Of course, instead of counting, you want the latest applied value for each field (an assumption, since this is not clear in your question).
So let's get our map function:
map = function() {
    if (!this.changes) {
        // If there were no changes for some reason, bail on this record
        return;
    }
    // We iterate the changes
    for (index in this.changes) {
        emit(index /* We emit the field name */, this.changes[index] /* We emit the field value */);
    }
}
And now for our reduce:
reduce = function(key, values) {
    // This part is dependent upon your input query. If you add a sort of
    // date (ts) DESC then you will probably want the first index (0), not the
    // last one as taken here
    return values[values.length - 1];
}
And this will output a single document per field, of the form:
{
_id: your_field_ie_fieldA,
value: whoop
}
You can then iterate the (most likely inline) output and, bam, you have your changes.
This is of course one way of doing it, and it is not designed to run completely inline with your app; however, that all depends on the size of the data you're working on. It could be run very close.
I am unsure whether group and distinct can run on this, but it looks like group might: http://docs.mongodb.org/manual/reference/method/db.collection.group/#db-collection-group. I should note that group is basically an MR wrapper, but you could do something like this (untested, just like the MR above):
db.col.group({
    key: { 'changes.fieldA': 1 /* ...and the rest of the fields */ },
    cond: { 'timestamp.date': { $gt: new Date('01/01/2012') } },
    reduce: function (curr, result) { },
    initial: { }
})
But it does require you to define the keys instead of just iterating them programmatically (maybe there is a better way).

Select top N rows from each group

I use mongodb for my blog platform, where users can create their own blogs. All entries from all blogs are in an entries collection. The document of an entry looks like:
{
'blog_id':xxx,
'timestamp':xxx,
'title':xxx,
'content':xxx
}
As the question says, is there any way to select, say, last 3 entries for each blog?
You need to first sort the documents in the collection by the blog_id and timestamp fields, then do an initial group which creates an array of the original documents in descending order. After that you can slice the array with the documents to return the first 3 elements.
The intuition can be followed in this example:
db.entries.aggregate([
{ '$sort': { 'blog_id': 1, 'timestamp': -1 } },
{
'$group': {
'_id': '$blog_id',
'docs': { '$push': '$$ROOT' },
}
},
{
'$project': {
'top_three': {
'$slice': ['$docs', 3]
}
}
}
])
The only way to do this in basic Mongo is if you can live with two things:
An additional field in your entry document, let's call it "age"
A new blog entry taking an additional update
If so, here's how you do it:
Upon creating a new entry, do your normal insert and then execute this update to increase the age of all posts for this blog (including the one you just inserted):
db.entries.update({blog_id: BLOG_ID}, {$inc: {age: 1}}, false, true)
When querying, use the following query, which will return the most recent 3 entries for each blog:
db.entries.find({age:{$lte:3}, timestamp:{$gte:STARTOFMONTH, $lt:ENDOFMONTH}}).sort({blog_id:1, age:1})
Note that this solution is actually concurrency safe (no entries with duplicate ages).
Starting in Mongo 5.2, it's a perfect use case for the new $topN aggregation accumulator:
// { blog_id: "a", title: "plop", content: "smthg" }
// { blog_id: "b", title: "hum", content: "meh" }
// { blog_id: "a", title: "hello", content: "world" }
// { blog_id: "a", title: "what", content: "ever" }
db.collection.aggregate([
{ $group: {
_id: "$blog_id",
messages: { $topN: { n: 2, sortBy: { _id: -1 }, output: "$$ROOT" } }
}}
])
// {
// _id: "a",
// messages: [
// { blog_id: "a", title: "what", content: "ever" },
// { blog_id: "a", title: "hello", content: "world" }
// ]
// }
// {
// _id: "b",
// messages: [
// { blog_id: "b", title: "hum", content: "meh" }
// ]
// }
This applies a $topN group accumulation that:
takes for each group the top 2 (n: 2) elements
top 2, as defined by sortBy: { _id: -1 }, which in this case means by reversed order of insertion
and for each record pushes the whole record in the group's list (output: "$$ROOT") since $$ROOT represents the whole document being processed.
It's possible with group (aggregation), but this will create a full collection scan.
Do you really need exactly 3, or can you set a limit, e.g. max 3 posts from the last week/month?
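A sketch of that suggestion (illustrative only; the date placeholders follow the other answer, and the per-blog trimming happens on the client side):
// Fetch the time-limited entries, newest first per blog, then keep at most 3 per blog in the client
var perBlog = {};
db.entries.find({ timestamp: { $gte: STARTOFMONTH, $lt: ENDOFMONTH } })
    .sort({ blog_id: 1, timestamp: -1 })
    .forEach(function (entry) {
        var kept = perBlog[entry.blog_id] || (perBlog[entry.blog_id] = []);
        if (kept.length < 3) kept.push(entry);
    });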
This answer by drcosta to another question, using map-reduce, did the trick:
In mongo, how do I use map reduce to get a group by ordered by most recent
mapper = function () {
emit(this.category, {top:[this.score]});
}
reducer = function (key, values) {
var scores = [];
values.forEach(
function (obj) {
obj.top.forEach(
function (score) {
scores[scores.length] = score;
});
});
scores.sort(function (a, b) { return b - a; });  // numeric sort, descending
return {top:scores.slice(0, 3)};
}
function find_top_scores(categories) {
var query = [];
db.top_foos.find({_id:{$in:categories}}).forEach(
function (topscores) {
query[query.length] = {
category:topscores._id,
score:{$in:topscores.value.top}
};
});
return db.foo.find({$or:query});
}
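The snippet above assumes the map-reduce output has already been written to a top_foos collection; a sketch of that step (the source collection follows the snippet, the category names passed at the end are hypothetical):
// Run the map-reduce over the source collection and store the per-category top scores in "top_foos"
db.foo.mapReduce(mapper, reducer, { out: { replace: "top_foos" } });
// Then fetch the top documents for the categories of interest
find_top_scores(["category_a", "category_b"]);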