How to convert string to numerical values in MongoDB

I am trying to convert a string that contains a numerical value to its value in an aggregate query in MongoDB.
Example of document
{
"_id": ObjectId("5522XXXXXXXXXXXX"),
"Date": "2015-04-05",
"PartnerID": "123456",
"moop": "1234"
}
Example of the aggregate query I use
{
    aggregate: 'my_collection',
    pipeline: [
        { $match: {
            Date: { $gt: '2015-04-01', $lt: '2015-04-05' }
        }},
        { $group: {
            _id: "$PartnerID",
            total: { $sum: '$moop' }
        }}
    ]
}
where the results are
{
    "result": [
        {
            "_id": "123456",
            "total": NumberInt(0)
        }
    ]
}
How can you convert the string to its numerical value?

MongoDB aggregation does not allow you to change the existing data type of a field. In this case you should write some code to convert the string to an int. Check the code below:
db.collectionName.find().forEach(function(data) {
db.collectionName.update({
"_id": data._id,
"moop": data.moop
}, {
"$set": {
"PartnerID": parseInt(data.PartnerID)
}
});
})
If your collection is large, the script above will slow things down. For better performance MongoDB provides bulk operations; you can also update the data type using bulk operations:
var bulk = db.collectionName.initializeOrderedBulkOp();
var counter = 0;
db.collectionName.find().forEach(function(data) {
var updoc = {
"$set": {}
};
var myKey = "PartnerID";
updoc["$set"][myKey] = parseInt(data.PartnerID);
// queue the update
bulk.find({
"_id": data._id
}).update(updoc);
counter++;
// Drain and re-initialize every 1000 update statements
if (counter % 1000 == 0) {
bulk.execute();
bulk = db.collectionName.initializeOrderedBulkOp();
}
})
// Add the rest in the queue
if (counter % 1000 != 0) bulk.execute();
This basically reduces the number of operation statements sent to the server, only sending once every 1000 queued operations.
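As a side note, newer shells and drivers expose the same batching through bulkWrite(); here is a minimal sketch, assuming the same collection and PartnerID field as above:
var ops = [];
db.collectionName.find({}, { "PartnerID": 1 }).forEach(function (data) {
    ops.push({
        updateOne: {
            "filter": { "_id": data._id },
            "update": { "$set": { "PartnerID": parseInt(data.PartnerID) } }
        }
    });
    // Flush a batch of 1000 queued updates to the server
    if (ops.length === 1000) {
        db.collectionName.bulkWrite(ops);
        ops = [];
    }
});
// Flush whatever is left in the queue
if (ops.length > 0) db.collectionName.bulkWrite(ops);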

Using MongoDB 4.0 and newer
You have two options i.e. $toInt or $convert. Using $toInt, follow the example below:
filterDateStage = {
'$match': {
'Date': {
'$gt': '2015-04-01',
'$lt': '2015-04-05'
}
}
};
groupStage = {
'$group': {
'_id': '$PartnerID',
'total': { '$sum': { '$toInt': '$moop' } }
}
};
db.getCollection('my_collection').aggregate([
filterDateStage,
groupStage
])
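With sample documents like the ones used in the Map/Reduce example further down, this pipeline would return something along the lines of:
{ "_id" : "123456", "total" : 1258 }
{ "_id" : "123457", "total" : 29 }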
If the conversion operation encounters an error, the aggregation operation stops and throws an error. To override this behavior, use $convert instead.
Using $convert
groupStage = {
'$group': {
'_id': '$PartnerID',
'total': {
'$sum': {
'$convert': { 'input': '$moop', 'to': 'int', 'onError': 0, 'onNull': 0 } // fall back to 0 instead of throwing
}
}
}
};
Using Map/Reduce
With map/reduce you can use javascript functions like parseInt() to do the conversion. As an example, you could define the map function to process each input document:
In the function, this refers to the document that the map-reduce operation is processing. The function maps the converted moop string value to the PartnerID for each document and emits the PartnerID and converted moop pair. This is where the javascript native function parseInt() can be applied:
var mapper = function () {
var x = parseInt(this.moop);
emit(this.PartnerID, x);
};
Next, define the corresponding reduce function with two arguments, keyPartnerID and valuesMoop. valuesMoop is an array whose elements are the integer moop values emitted by the map function and grouped by keyPartnerID.
The function reduces the valuesMoop array to the sum of its elements.
var reducer = function(keyPartnerID, valuesMoop) {
return Array.sum(valuesMoop);
};
db.collection.mapReduce(
mapper,
reducer,
{
out : "example_results",
query: {
Date: {
$gt: "2015-04-01",
$lt: "2015-04-05"
}
}
}
);
db.example_results.find().forEach(function (doc) {
    printjson(doc);
});
For example, with the following sample collection of documents:
/* 0 */
{
"_id" : ObjectId("550c00f81bcc15211016699b"),
"Date" : "2015-04-04",
"PartnerID" : "123456",
"moop" : "1234"
}
/* 1 */
{
"_id" : ObjectId("550c00f81bcc15211016699c"),
"Date" : "2015-04-03",
"PartnerID" : "123456",
"moop" : "24"
}
/* 2 */
{
"_id" : ObjectId("550c00f81bcc15211016699d"),
"Date" : "2015-04-02",
"PartnerID" : "123457",
"moop" : "21"
}
/* 3 */
{
"_id" : ObjectId("550c00f81bcc15211016699e"),
"Date" : "2015-04-02",
"PartnerID" : "123457",
"moop" : "8"
}
The above Map/Reduce operation will save the results to the example_results collection and the shell command db.example_results.find() will give:
/* 0 */
{
"_id" : "123456",
"value" : 1258
}
/* 1 */
{
"_id" : "123457",
"value" : 29
}

You can easily convert the string data type to a numerical data type.
Don't forget to change collectionName & FieldName.
For example: CollectionName: Users & FieldName: Contactno.
Try this query:
db.collectionName.find().forEach( function (x) {
x.FieldName = parseInt(x.FieldName);
db.collectionName.save(x);
});

Eventually I used
db.my_collection.find({moop: {$exists: true}}).forEach(function(obj) {
obj.moop = new NumberInt(obj.moop);
db.my_collection.save(obj);
});
to turn moop from a string into an integer in my_collection, following the example in Simone's answer to MongoDB: How to change the type of a field?.

Strings can be converted to numbers in MongoDB v4.0 using the $toInt operator. In this case:
db.col.aggregate([
{
$project: {
_id: 0,
moopNumber: { $toInt: "$moop" }
}
}
])
outputs:
{ "moopNumber" : 1234 }

Here is a pure MongoDB based solution for this problem which I just wrote for fun. It's effectively a server-side string-to-number parser which supports positive and negative numbers as well as decimals:
db.collection.aggregate({
$addFields: {
"moop": {
$reduce: {
"input": {
$map: { // split string into char array so we can loop over individual characters
"input": {
$range: [ 0, { $strLenCP: "$moop" } ] // using an array of all numbers from 0 to the length of the string
},
"in":{
$substrCP: [ "$moop", "$$this", 1 ] // return the nth character as the mapped value for the current index
}
}
},
"initialValue": { // initialize the parser with a 0 value
"n": 0, // the current number
"sign": 1, // used for positive/negative numbers
"div": null, // used for shifting on the right side of the decimal separator "."
"mult": 10 // used for shifting on the left side of the decimal separator "."
}, // start with a zero
"in": {
$let: {
"vars": {
"n": {
$switch: { // char-to-number mapping
branches: [
{ "case": { $eq: [ "$$this", "1" ] }, "then": 1 },
{ "case": { $eq: [ "$$this", "2" ] }, "then": 2 },
{ "case": { $eq: [ "$$this", "3" ] }, "then": 3 },
{ "case": { $eq: [ "$$this", "4" ] }, "then": 4 },
{ "case": { $eq: [ "$$this", "5" ] }, "then": 5 },
{ "case": { $eq: [ "$$this", "6" ] }, "then": 6 },
{ "case": { $eq: [ "$$this", "7" ] }, "then": 7 },
{ "case": { $eq: [ "$$this", "8" ] }, "then": 8 },
{ "case": { $eq: [ "$$this", "9" ] }, "then": 9 },
{ "case": { $eq: [ "$$this", "0" ] }, "then": 0 },
{ "case": { $and: [ { $eq: [ "$$this", "-" ] }, { $eq: [ "$$value.n", 0 ] } ] }, "then": "-" }, // we allow a minus sign at the start
{ "case": { $eq: [ "$$this", "." ] }, "then": "." }
],
default: null // marker to skip the current character
}
}
},
"in": {
$switch: {
"branches": [
{
"case": { $eq: [ "$$n", "-" ] },
"then": { // handle negative numbers
"sign": -1, // set sign to -1, the rest stays untouched
"n": "$$value.n",
"div": "$$value.div",
"mult": "$$value.mult",
},
},
{
"case": { $eq: [ "$$n", null ] }, // null is the "ignore this character" marker
"then": "$$value" // no change to current value
},
{
"case": { $eq: [ "$$n", "." ] },
"then": { // handle decimals
"n": "$$value.n",
"sign": "$$value.sign",
"div": 10, // from the decimal separator "." onwards, we start dividing new numbers by some divisor which starts at 10 initially
"mult": 1, // and we stop multiplying the current value by ten
},
},
],
"default": {
"n": {
$add: [
{ $multiply: [ "$$value.n", "$$value.mult" ] }, // multiply the already parsed number by 10 because we're moving one step to the right or by one once we're hitting the decimals section
{ $divide: [ "$$n", { $ifNull: [ "$$value.div", 1 ] } ] } // add the respective numerical value of what we look at currently, potentially divided by a divisor
]
},
"sign": "$$value.sign",
"div": { $multiply: [ "$$value.div" , 10 ] },
"mult": "$$value.mult"
}
}
}
}
}
}
}
}
}, {
$addFields: { // fix sign
"moop": { $multiply: [ "$moop.n", "$moop.sign" ] }
}
})
I am certainly not advertising this as the bee's knees or anything, and it might have severe performance implications for larger datasets over a client-based solution, but there might be cases where it comes in handy...
The above pipeline will transform the following documents:
{ "moop": "12345" } --> { "moop": 12345 }
and
{ "moop": "123.45" } --> { "moop": 123.45 }
and
{ "moop": "-123.45" } --> { "moop": -123.45 }
and
{ "moop": "2018-01-03" } --> { "moop": 20180103.0 }

Three things to take care of:
parseInt() will store a double data type in MongoDB. Please use new NumberInt(string) instead.
In a Mongo shell command for bulk usage, yield won't work. Please DO NOT add 'yield'.
If you have already changed the string to a double via parseInt(), it looks like there is no way to change the type to int directly. The solution is a little bit weird: change the double to a string first and then change it back to an int with new NumberInt().
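Here is a minimal shell sketch of that double-to-string-to-int round trip; the collection and field names follow the question and are assumptions:
db.my_collection.find({ "moop": { $type: "double" } }).forEach(function (doc) {
    // NumberInt() is fed a string here, as described above
    doc.moop = new NumberInt(String(doc.moop));
    db.my_collection.save(doc);
});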

If you can edit all documents in an aggregation:
"TimeStamp": {$toDecimal: {$toDate: "$Your Date"}}
And for the client, you set the query:
Date.parse("Your date".toISOString())
That's what makes the whole thing work with ISODate.
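For context, a hedged sketch of where that expression would sit in a pipeline (MongoDB 4.0+; the field name "Date" is taken from the original question):
db.my_collection.aggregate([
    // $toDate parses the "2015-04-05" string into an ISODate,
    // and $toDecimal turns that date into its millisecond value
    { $addFields: { "TimeStamp": { $toDecimal: { $toDate: "$Date" } } } }
])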

Try:
"TimeStamp":{$toDecimal: { $toDate:"$Datum"}}

Though $toInt is really useful, it was only added in MongoDB 4.0. I ran into this same situation in a database running 3.2, where upgrading to use $toInt was not an option due to some other application incompatibilities, so I had to come up with something else, and it was surprisingly simple.
If you $project and $add zero to your string, it will turn into a number
{
$project : {
'convertedField' : { $add : ["$stringField",0] },
//more fields here...
}
}

The document needs to be saved back. It should be like this:
db.my_collection.find({}).forEach(function(theCollection) {
    theCollection.moop = parseInt(theCollection.moop);
    db.my_collection.save(theCollection);
});

Collation is what you need:
db.collectionName.find().sort({PartnerID: 1}).collation({locale: "en_US", numericOrdering: true})
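If you want numeric ordering applied by default, the same collation can be set when the collection is created; a minimal sketch, assuming a fresh collection:
db.createCollection("collectionName", {
    collation: { locale: "en_US", numericOrdering: true } // becomes the default for queries and indexes
});
db.collectionName.find().sort({ PartnerID: 1 }) // now uses numeric ordering without an explicit .collation()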

db.user.find().toArray().filter(a=>a.age>40)

Related

Mongodb: Sort by custom expression

How to sort results by a custom expression which I use in find?
The collection contains documents with the following attributes for example:
{
"_id" : ObjectId("5ef1cd704b35c6d6698f2050"),
"Name" : "TD",
"Date" : ISODate("2021-06-23T09:37:51.976Z"),
"A" : "19.36",
"B" : 2.04,
}
I'm using the following find query to get the records with Date since "2022-01-01" and the ratio between A and B is lower than 0.1:
db.getCollection('my_collection').find(
{
"experationDate" :
{
$gte: new ISODate("2022-01-01 00:00:00.000Z")
},
"$expr":
{
"$lte": [
{ "$divide": ["$A", "$B"] },
0.1
]
}
})
Now, I can't find the right way to sort the results by this ratio.
You can use aggregate in this way:
Search for the documents you want using $match, add a field named ratio and use it to sort. Finally, hide the field using $project:
db.collection.aggregate([
{ "$match": {
"Date": { "$gte": ISODate("2020-01-01") },
"$expr": { "$lte": [ { "$divide": [ "$B", { "$toDouble": "$A" } ] }, 0.1 ] } }
},
{
"$set": {
"ratio": { "$divide": [ "$B", { "$toDouble": "$A" } ] }
}
},
{
"$sort": { "ratio": 1 }
},
{
"$project": { "ratio": 0 }
}
])
By the way, I've used other values to get results. The ratio between 2.04 and 19.36 is greater than 0.1. You have divided A/B, but I think you mean B/A.
Anyway, this is not important; you can change the values and the query will still work fine.
Also, maybe this could work better. It is the same query, but it could be more efficient (maybe, I don't know) because it avoids dividing each value in the collection twice:
First filter by date, then add the field ratio to each document found (this way the division is not done for every document in the collection). Then another $match using the ratio, the sort, and finally hide the field from the output.
db.collection.aggregate([
{
"$match": { "Date": { "$gte": ISODate("2020-01-01") } }
},
{
"$set": { "ratio": { "$divide": [ "$B", { "$toDouble": "$A" } ] } }
},
{
"$match": { "ratio": { "$lte": 0.1 } }
},
{
"$sort": { "ratio": 1 }
},
{
"$project": { "ratio": 0 }
}
])

SpringData MongoDb, how to count distinct of a query?

I'm doing a paginated search with MongoDB in my Spring Boot API.
For a customer search path, I'm building a query with a bunch of criteria depending on the user input.
I then do a count to display the total number of results (and the computed number of pages associated):
Long total = mongoTemplate.count(query, MyEntity.class);
I then do the paginated query to return only current page results
query.with(PageRequest.of(pagination.getPage(), pagination.getPageSize()));
query.with(Sort.by(Sort.Direction.DESC, "creationDate"));
List<MyEntity> listResults = mongoTemplate.find(query, MyEntity.class);
It all works well.
Now in my total results, I often have multiple results for the same user. I want to display those in the paginated list, but I also want to display a new counter with the total number of distinct users in that search.
I saw the findDistinct method:
mongoTemplate.findDistinct(query, "userId", OnboardingItineraryEntity.class, String.class);
But I do not want to retrieve a huge list and do a count on it. Is there a way to easily do:
mongoTemplate.countDistinct(query, "userId", OnboardingItineraryEntity.class, String.class);
Because I have a huge number of criteria, I find it sad to have to rebuild an Aggregation object from scratch.
Bonus question: sometimes userId will be null. Is there an easy way to count the number of distinct (non-null) values plus the number of nulls in one query?
Or do I need to do a query where I add an extra criterion on userId being null, do a count on that, and then do the count distinct on all and add them up manually in my code (minus one)?
MongoDB aggregation solves this problem in several ways.
Aggregate with $type operator:
db.myEntity.aggregate([
{$match:...}, //add here MatchOperation
{
"$group": {
"_id": {
"$type": "$userId"
},
"count": {
"$sum": 1
}
}
}
])
---Output---
[
{
"_id": "null", //null values
"count": 2
},
{
"_id": "missing", // if userId doesn't exists at all
"count": 1
},
{
"_id": "string", //not null values
"count": 4
}
]
Single document with null and non-null counts
db.myEntity.aggregate([
{$match:...}, //add here MatchOperation
{
"$group": {
"_id": "",
"null": {
$sum: {
$cond: [
{
$ne: [{ "$type": "$userId"}, "string"]
},
1,
0
]
}
},
"nonNull": {
"$sum": {
$cond: [
{
$eq: [{ "$type": "$userId" }, "string"]
},
1,
0
]
}
}
}
}
])
---Output---
[
{
"_id": "",
"nonNull": 4,
"null": 3
}
]
Using the $facet operator
db.myEntity.aggregate([
{$match:...}, //add here MatchOperation
{
$facet: {
"null": [
{
$match: {
$or: [
{
userId: {
$exists: false
}
},
{
userId: null
}
]
}
},
{
$count: "count"
}
],
"nonNull": [
{
$match: {
$and: [
{
userId: {
$exists: true
}
},
{
userId: {
$ne: null
}
}
]
}
},
{
$count: "count"
}
]
}
},
{
$project: {
"null": {
$ifNull: [
{
$arrayElemAt: [
"$null.count",
0
]
},
0
]
},
"nonNull": {
$ifNull: [
{
$arrayElemAt: [
"$nonNull.count",
0
]
},
0
]
}
}
}
])
Note: Try any of these solutions and let me know if you have any problem creating the MongoDB aggregation.
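For the countDistinct part of the question, here is a minimal shell sketch (assuming the same userId field and whatever match criteria your query builds) that counts the distinct non-null users:
db.myEntity.aggregate([
    // { $match: ... },                                // add here the same MatchOperation as above
    { "$match": { "userId": { "$type": "string" } } }, // keep only non-null userId values
    { "$group": { "_id": "$userId" } },                // one group per distinct user
    { "$count": "distinctUsers" }                      // count the groups
])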

how can I create an array based on some fields in mongodb?

eg: the input is
{
"_id" : ObjectId("5e79ae1c11344b2895797042"),
"startDay" : "20200312",
"endDay" : "20200314"
}
I want to get ['20200312','20200313','20200314']. What should I do?
I have used $range, which returns an array of numbers in the given range, and $range accepts only integers. So I convert the string values of startDay and endDay to integers using $toInt.
Try this query:
db.collection.aggregate([
{
"$project": {
"days": {
"$range": [
{
"$toInt": "$startDay"
},
{
"$add": [
{
"$toInt": "$endDay"
},
1
]
}
]
}
}
}
])
Query Result
[
{
"_id": ObjectId("5e79ae1c11344b2895797042"),
"days": [
20200312,
20200313,
20200314
]
}
]
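If you need the values back as strings (as in the question's expected output), here is a hedged follow-up sketch wrapping the same $range in $map with $toString (MongoDB 4.0+):
db.collection.aggregate([
    {
        "$project": {
            "days": {
                "$map": {
                    "input": {
                        "$range": [
                            { "$toInt": "$startDay" },
                            { "$add": [ { "$toInt": "$endDay" }, 1 ] }
                        ]
                    },
                    "in": { "$toString": "$$this" } // turn each number back into a string
                }
            }
        }
    }
])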

Return Only the Keys from Document Where the Query condition was True

I have a group of elements in MongoDB as given below:
{
"_id": ObjectId("5942643ea2042e12245de00c"),
"user": NumberInt(1),
"name": {
"value": "roy",
"time": NumberInt(121)
},
"lname": {
"value": "roy s",
"time": NumberInt(122)
},
"sname": {
"value": "roy 9",
"time": NumberInt(123)
}
}
but when I execute the query below
db.temp.find({
$or: [{
'name.time': {
$gte: 123
}
}, {
'lname.time': {
$gte: 123
}
}, {
'sname.time': {
$gte: 123
}
}]
})
it returns the whole document, which is correct.
Is there any way to fetch only the specific object in which the condition matched? For example, in my document, if the condition matched lname.time (say, equal to 122), then only the lname object would be returned and the rest ignored.
The type of thing you are asking for is only really "practical" with MongoDB 3.4 in order to return this from the server.
Summary
The general case here is that the "projection" of fields by logical conditions is not straightforward. Whilst it would be nice if MongoDB had such a DSL for projection, this is basically delegated either to:
Do your manipulation "after" the results are returned from the server
Use the aggregation pipeline in order to manipulate the documents.
Therefore, in "CASE B" being "aggregation pipeline", this is really only a practical excercise if the steps involved "mimic" the standard .find() behavior of "query" and "project". Introducing other pipeline stages beyond that will only introduce performance problems greatly outweighing any gain from "trimming" the documents to return.
Thus the summary here is $match then $replaceRoot to "project", following the pattern. It is also I think a good "rule of thumb" to consider here that the aggregation approach "should only" be applied where there is a significant reduction in the size of data returned. I would expand by example saying that "if" the size of the keys to "trim" was actually in the Megabytes range on the returned result, then it is a worthwhile exercise to remove them "on the server".
In the case where such a saving would really only constitute "bytes" in comparison, then the most logical course is to simply allow the documents to return in the cursor "un-altered", and only then in "post processing" would you bother removing unwanted keys that did not meet the logical condition.
That said, On with the actual methods.
Aggregation Case
db.temp.aggregate([
{ "$match": {
"$or": [
{ "name.time": { "$gte": 123 } },
{ "lname.time": { "$gte": 123 } },
{ "sname.time": { "$gte": 123 } }
]
}},
{ "$replaceRoot": {
"newRoot": {
"$arrayToObject": {
"$concatArrays": [
[
{ "k": "_id", "v": "$_id" },
{ "k": "user", "v": "$user" },
],
{ "$filter": {
"input": [
{ "$cond": [
{ "$gte": [ "$name.time", 123 ] },
{ "k": "name", "v": "$name" },
false
]},
{ "$cond": [
{ "$gte": [ "$lname.time", 123 ] },
{ "k": "lname", "v": "$lname" },
false
]},
{ "$cond": [
{ "$gte": [ "$sname.time", 123 ] },
{ "k": "sname", "v": "$sname" },
false
]}
],
"as": "el",
"cond": "$$el"
}}
]
}
}
}}
])
It's a pretty fancy statement that relies on $arrayToObject and $replaceRoot to achieve the dynamic structure. At its core the "keys" are all represented in array form, where the "array" only contains those keys that actually pass the conditions.
Fully constructed after the conditions are filtered we turn the array into a document and return the projection to the new Root.
Cursor Processing Case
You can actually do this in the client code with ease though. For example in JavaScript:
db.temp.find({
"$or": [
{ "name.time": { "$gte": 123 } },
{ "lname.time": { "$gte": 123 } },
{ "sname.time": { "$gte": 123 } }
]
}).map(doc => {
if ( doc.name.time < 123 )
delete doc.name;
if ( doc.lname.time < 123 )
delete doc.lname;
if ( doc.sname.time < 123 )
delete doc.sname;
return doc;
})
In both cases you get the same desired result:
{
"_id" : ObjectId("5942643ea2042e12245de00c"),
"user" : 1,
"sname" : {
"value" : "roy 9",
"time" : 123
}
}
Where sname was the only field to meet the condition in the document and therefore the only one returned.
Dynamic Generation and DSL Re-use
Addressing Sergio's question, I suppose you can actually re-use the DSL from the $or condition to generate the logic in both cases:
Considering the variable defined
var orlogic = [
{
"name.time" : {
"$gte" : 123
}
},
{
"lname.time" : {
"$gte" : 123
}
},
{
"sname.time" : {
"$gte" : 123
}
}
];
Then with cursor iteration:
db.temp.find({
"$or": orlogic
}).map(doc => {
orlogic.forEach(cond => {
Object.keys(cond).forEach(k => {
var split = k.split(".");
var op = Object.keys(cond[k])[0];
if ( op === "$gte" && doc[split[0]][split[1]] < cond[k][op] )
delete doc[split[0]];
else if ( op === "$lte" && doc[split[0]][split[1]] > cond[k][op] )
delete doc[split[0]];
})
});
return doc;
})
This evaluates against the DSL to actually perform the operations without (somewhat) "hardcoded" if statements.
Then the aggregation approach would also be:
var pipeline = [
{ "$match": { "$or": orlogic } },
{ "$replaceRoot": {
"newRoot": {
"$arrayToObject": {
"$concatArrays": [
[
{ "k": "_id", "v": "$_id" },
{ "k": "user", "v": "$user" }
],
{ "$filter": {
"input": orlogic.map(cond => {
var obj = {
"$cond": {
"if": { },
"then": { },
"else": false
}
};
Object.keys(cond).forEach(k => {
var split = k.split(".");
var op = Object.keys(cond[k])[0];
obj.$cond.if[op] = [ `$${k}`, cond[k][op] ];
obj.$cond.then = { "k": split[0], "v": `$${split[0]}` };
});
return obj;
}),
"as": "el",
"cond": "$$el"
}}
]
}
}
}}
];
db.test.aggregate(pipeline);
So the same basic conditions where we re-use existing $or DSL to generate the required pipeline parts as opposed to hard coding them in.
The second argument to find specifies the fields to return (projection)
db.collection.find(query, projection)
https://docs.mongodb.com/manual/reference/method/db.collection.find/
as in example
db.bios.find( { }, { name: 1, contribs: 1 } )
db.temp.find(
    {
        "$or": [
            { "name.time": { $gte: 123 } },
            { "lname.time": { $gte: 123 } },
            { "sname.time": { $gte: 123 } }
        ]
    },
    {
        "name.time": 1,
        "lname.time": 1,
        "sname.time": 1
    }
)
My approach using aggregation pipeline
$project - Project is used to create a key for each of the documents name, sname and lname.
Initial project query:
db.collection.aggregate([{$project: {_id:1, "tempname.name": "$name", "templname.lname":"$lname", "tempsname.sname":"$sname"}}]);
Result of this query is
{"_id":ObjectId("5942643ea2042e12245de00c"),"tempname":{"name":{"value":"roy","time":121}},"templname":{"lname":{"value":"roy s","time":122}},"tempsname":{"sname":{"value":"roy 9","time":123}}}
Use $project one more time to add the documents into an array
db.collection.aggregate([{$project: {_id:1, "tempname.name": "$name", "templname.lname":"$lname", "tempsname.sname":"$sname"}},
{$project: {names: ["$tempname", "$templname", "$tempsname"]}}])
Our document will look like this after the execution of the second $project:
{"_id":ObjectId("5942643ea2042e12245de00c"),"names":[{"name":{"value":"roy","time":121}},{"lname":{"value":"roy s","time":122}},{"sname":{"value":"roy 9","time":123}}]}
Then use $unwind to break the array into separate documents.
After breaking up the documents, use $match with $or to get the desired result.
Final Query
db.collection.aggregate([
{
$project: {
_id: 1,
"tempname.name": "$name",
"templname.lname": "$lname",
"tempsname.sname": "$sname"
}
},
{
$project: {
names: [
"$tempname",
"$templname",
"$tempsname"
]
}
},
{
$unwind: "$names"
},
{
$match: {
$or: [
{
"names.name.time": {
$gte: 123
}
},
{
"names.lname.time": {
$gte: 123
}
},
{
"names.sname.time": {
$gte: 123
}
}
]
}
}
])
Final result of the query, closer to your expected result (with an additional key):
{
"_id" : ObjectId("5942643ea2042e12245de00c"),
"names" : {
"sname" : {
"value" : "roy 9",
"time" : 123
}
}
}

MongoDB insert document "or" increment field if exists in array

What I'm trying to do is fairly simple: I have an array inside a document;
"tags": [
{
"t" : "architecture",
"n" : 12
},
{
"t" : "contemporary",
"n" : 2
},
{
"t" : "creative",
"n" : 1
},
{
"t" : "concrete",
"n" : 3
}
]
I want to push an array of items into this array, like
["architecture","blabladontexist"]
If an item exists, I want to increment the object's n value (in this case, architecture),
and if it doesn't, add it as a new item (with a value of n=0): { "t": "blabladontexist", "n": 0 }
I have tried $addToSet, $set, $inc, $upsert: true with so many combinations and couldn't do it.
How can we do this in MongoDB?
With MongoDB 4.2 and newer, the update method can now take a document or an aggregate pipeline where the following stages can be used:
$addFields and its alias $set
$project and its alias $unset
$replaceRoot and its alias $replaceWith.
Armed with the above, your update operation with the aggregate pipeline will be to override the tags field by concatenating a filtered tags array and a mapped array of the input list with some data lookup in the map:
To start with, the aggregate expression that filters the tags array uses $filter, as follows:
const myTags = ["architecture", "blabladontexist"];
{
"$filter": {
"input": "$tags",
"cond": {
"$not": [
{ "$in": ["$$this.t", myTags] }
]
}
}
}
which produces the filtered array of documents
[
{ "t" : "contemporary", "n" : 2 },
{ "t" : "creative", "n" : 1 },
{ "t" : "concrete", "n" : 3 }
]
Now the second part will be to derive the other array that will be concatenated to the above. This array requires a $map over the myTags input array as
{
"$map": {
"input": myTags,
"in": {
"$cond": {
"if": { "$in": ["$$this", "$tags.t"] },
"then": {
"t": "$$this",
"n": {
"$sum": [
{
"$arrayElemAt": [
"$tags.n",
{ "$indexOfArray": [ "$tags.t", "$$this" ] }
]
},
1
]
}
},
"else": { "t": "$$this", "n": 0 }
}
}
}
}
The above $map essentially loops over the input array and checks for each element whether it's in the tags array by comparing the t property. If it exists, the n field of the subdocument becomes its current n value plus 1, where the current value is looked up with
{
"$arrayElemAt": [
"$tags.n",
{ "$indexOfArray": [ "$tags.t", "$$this" ] }
]
}
otherwise it adds the default document with an n value of 0.
Overall, your final update operation becomes:
const myTags = ["architecture", "blabladontexist"];
db.getCollection('coll').update(
{ "_id": "1234" },
[
{ "$set": {
"tags": {
"$concatArrays": [
{ "$filter": {
"input": "$tags",
"cond": { "$not": [ { "$in": ["$$this.t", myTags] } ] }
} },
{ "$map": {
"input": myTags,
"in": {
"$cond": [
{ "$in": ["$$this", "$tags.t"] },
{ "t": "$$this", "n": {
"$sum": [
{ "$arrayElemAt": [
"$tags.n",
{ "$indexOfArray": [ "$tags.t", "$$this" ] }
] },
1
]
} },
{ "t": "$$this", "n": 0 }
]
}
} }
]
}
} }
],
{ "upsert": true }
);
I don't believe this is possible to do in a single command.
MongoDB doesn't allow a $set (or $setOnInsert) and $inc to affect the same field in a single command.
You'll have to do one update command to attempt to $inc the field, and if that doesn't change any documents (matched count = 0), do a second update to add the field with its default value.
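A hedged sketch of that two-command fallback for a single tag (the collection name and _id follow the earlier answer's example and are assumptions):
var tag = "blabladontexist";
var res = db.coll.updateOne(
    { "_id": "1234", "tags.t": tag },       // only matches if the tag already exists
    { "$inc": { "tags.$.n": 1 } }           // increment the matched array element
);
if (res.matchedCount === 0) {
    // tag was not present: add it with the default value
    db.coll.updateOne(
        { "_id": "1234" },
        { "$push": { "tags": { "t": tag, "n": 0 } } }
    );
}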