I am working on a Mongo statement for adding and updating values in an object within my document.
Here is my current statement; field and value change depending on what is passed in:
db.collection.update(id, {
    $set: {
        analysis: { [field]: value }
    }
});
Here is an example of what a document could look like (there are potentially 20+ fields in analysis):
{
    _id,
    analysis: {
        interest_rate: 22,
        sales_cost: 4000,
        value: 300
    }
}
The problem is that every time I update the object, all fields are removed except the field I updated.
So if
field = interest_rate
and the new
value = 33
my document would end up looking like this, with all the other fields in analysis removed:
{
    _id,
    analysis: {
        interest_rate: 33
    }
}
Is there a way to update fields within an object like this to keep the code simple or will I have to write out update statements for each individual field?
You should use dot notation to build the path when you're trying to update a nested field. Try:
let fieldPath = 'analysis.' + field; // for instance "analysis.interest_rate"
db.collection.update(id, {
    $set: {
        [fieldPath]: value   // computed property name, e.g. { "analysis.interest_rate": 33 }
    }
});
Otherwise you're just replacing the existing analysis object.
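For example, with field = "interest_rate" and value = 33 from the question, the statement above resolves to the literal form below (a minimal sketch; only the one nested field changes, the rest of analysis is kept):
db.collection.update(id, {
    $set: { "analysis.interest_rate": 33 }
});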
How do I handle it if the document structure changes after the app is in production?
Suppose I had 500 documents like this:
{
    name: 'n1',
    height: 'h1'
}
Later, if I decide to add all further documents in the format below:
{
    name: 'n501',
    height: 'h501',
    weight: 'w501'
}
I am using cursor.All(&userDetails) in Go to decode (deserialize) the query output into the struct userDetails. If I modify the structure of further documents and of userDetails accordingly, will it fail for the first 500 documents?
How do I handle this change?
If you add a new field to your struct, querying old documents will not fail. Since the old documents do not have the new field saved in MongoDB, querying them will give you struct values where the new field holds its zero value: e.g. if its type is string, it will be the empty string "", and if it's an int field, it will be 0.
If it bothers you that the old documents do not have this new field, you may extend them in the mongo console like this:
db.mycoll.updateMany({ "weight": {$exists:false} }, { $set: {"weight": ""} } )
This command adds a new weight field to old documents where the field did not exist, setting it to the empty string.
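If you want to confirm the migration, a quick check in the shell (a sketch, reusing the mycoll collection name from the command above) is to count the documents still missing the field:
// should return 0 once every document has a weight field
db.mycoll.count({ "weight": { $exists: false } })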
We have to bulk-update values in different arrays.
Is there any way we can update the values inside the different arrays at the same time?
Currently, I have the path of the array, the old value, and the new value, and I need to update a lot of documents inside one collection. Please let me know if there is a way to do this in MongoDB 3.6.
For example:
Path in the collection: orderDocument.customerOrderItems.customerOrderSubItems.productId
Old Value: 10001
New Value: 10002

Path in the collection: orderDocument.customerOrderItems.customerOrderSubItems.productName
Old Value: Upto 2 Tst/1 Tst
New Value: Upto 33 Tst/22 Tst
Please share your document structure so we can provide more specific direction. Assuming your document looks like:
{
    "orderDocument": {
        "customerOrderItems": {
            "customerOrderSubItems": [
                { "productId": 10004.0 }
            ]
        }
    }
}
Reach the array through the path where the value needs to be changed, then use the positional operator '$' and the name of the property to be updated:
db.getCollection('Event').updateMany(
    { "orderDocument.customerOrderItems.customerOrderSubItems.productId": 10004 },
    { $set: { "orderDocument.customerOrderItems.customerOrderSubItems.$.productId": 10005 } }
)
This gives the desired result. Note that the positional '$' operator updates only the first matching array element in each document.
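If more than one element of customerOrderSubItems can match in the same document, MongoDB 3.6 also supports arrayFilters with the $[<identifier>] positional operator. A sketch assuming the same document shape and the old/new values from the table above:
db.getCollection('Event').updateMany(
    { "orderDocument.customerOrderItems.customerOrderSubItems.productId": 10001 },
    { $set: {
        "orderDocument.customerOrderItems.customerOrderSubItems.$[item].productId": 10002,
        "orderDocument.customerOrderItems.customerOrderSubItems.$[item].productName": "Upto 33 Tst/22 Tst"
    } },
    { arrayFilters: [ { "item.productId": 10001 } ] }   // updates every matching element, not just the first
)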
I have a collection of documents, and each document has a list of objects representing a temporal interval (in epoch time) with a keyword. I want to update the final (end) value of the interval of each object, in the right document, where the ending value is greater than a given value. If nothing can be updated, I want to insert a new interval with both start and end set to the new value and with the keyword used in the query.
I'm using NiFi to perform this task with the update block and upsert enabled. I could use aggregation too, but it should be possible to do it with just an upsert.
This is what I have in place right now:
QUERY
{
    "_id": "docid",
    "thearray.keyword": "red",
    "thearray.end": { $gte: minUpdatable }
}
and this as the update body:
UPDATE BODY
{
    "thearray.$[].end": valueToUpdate,
    "$setOnInsert": { "$push": { "thearray": { "keyword": "red", "start": valueToUpdate, "end": valueToUpdate } } }
}
This is the document data structure:
INITIAL STATUS
{
_id:"docid",
otherInfo:"",
thearray:[
{keyword:"red",start:8,end:15},
{keyword:"blue",start:8,end:15},
{keyword:"red",start:8,end:9},
{keyword:"red",start:9,end:16},
...
]
}
EXAMPLE 1: From the initial status, updating with "docid" as the document, "red" as the keyword, 12 as the lowest end value eligible for update (minUpdatable), and 22 as the value to set, the document should become something like:
{
_id:"docid",
otherInfo:"",
thearray:[
{keyword:"red",start:8,end:22},
{keyword:"blue",start:8,end:15},
{keyword:"red",start:8,end:9},
{keyword:"red",start:9,end:22},
...
]
}
EXAMPLE 2: In the same situation, from the initial status, updating with "docid" as the document, "red" as the keyword, 33 as the lowest end value eligible for update, and 39 as the value to set, the document should result in something like this (or whatever reflects that no update was possible):
{
_id:"docid",
otherInfo:"",
thearray:[
{keyword:"red",start:8,end:15},
{keyword:"blue",start:8,end:15},
{keyword:"red",start:8,end:9},
{keyword:"red",start:9,end:16},
...,
{keyword:"red",start:33,end:33}
]
}
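Outside NiFi, one way to express the two behaviours is with two statements: an arrayFilters update for the "something matched" case, followed by a conditional $push when nothing matched. This is only a sketch of the idea, not the single-upsert statement asked about; the collection name is a placeholder and the values (12 and 22) are taken from EXAMPLE 1:
// 1) bump the end of every "red" interval whose end is >= minUpdatable
var res = db.mycollection.updateOne(
    { _id: "docid" },
    { $set: { "thearray.$[e].end": 22 } },
    { arrayFilters: [ { "e.keyword": "red", "e.end": { $gte: 12 } } ] }
);

// 2) if no interval qualified, append a new one instead (EXAMPLE 2 behaviour)
if (res.modifiedCount === 0) {
    db.mycollection.updateOne(
        { _id: "docid" },
        { $push: { thearray: { keyword: "red", start: 22, end: 22 } } }
    );
}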
I'm updating many documents using a bulk operation, but I only want to bump the timestamp of documents that are changed by the new values.
Currently my bulk operation looks something like this:
var updates = db.collection.initializeUnorderedBulkOp();
// ...
updates.find( someQuery ).update( {
$set: someValues,
$currentDate: { modified:true }
} );
updates.execute();
Even if someValues doesn't modify any fields, the modified field gets set. Is there a way to provide a list of additional updates to be performed only when the original update results in a change?
I'm running MongoDB 2.6. I want to avoid negating all the object values in the query one by one.
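For reference, the workaround being avoided looks roughly like this on 2.6: wrap the new values in a $nor inside the query, so that only documents where at least one field actually differs are matched and touched. someQuery and someValues stand in for the real ones, and the field names here are hypothetical:
updates.find( {
    // ...conditions from someQuery...
    $nor: [ { status: "done", score: 5 } ]   // matches only if at least one value differs
} ).update( {
    $set: { status: "done", score: 5 },
    $currentDate: { modified: true }
} );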
I currently have a collection with documents like the following:
{ foo: 'bar', timeCreated: ISODate("2012-06-28T06:51:48.374Z") }
I would now like to add a timestampCreated key to the documents in this collection, to make querying by time easier.
I was able to add the new field with an update and a $set operation and set a timestamp value, but it appears to be setting the current timestamp when I use this:
db.reports.update({}, {
    $set: {
        timestampCreated: new Timestamp(new Date('$.timeCreated'), 0)
    }
}, false, true);
However, I have not been able to figure out a way to add this field and set its value to the timestamp of the existing 'timeCreated' field.
Do a find for all the documents, limiting to just the id and timeCreated fields. Then loop over that and generate the timestampCreated value, and do an update on each.
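A minimal sketch of that loop in the mongo shell; it stores milliseconds since the epoch, in line with the $toLong answer below (switch the conversion if you actually want a BSON Timestamp):
db.reports.find({ timeCreated: { $exists: true } }, { timeCreated: 1 }).forEach(function (doc) {
    db.reports.update(
        { _id: doc._id },
        { $set: { timestampCreated: doc.timeCreated.getTime() } }   // milliseconds since epoch
    );
});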
Use updateMany(), which can accept aggregation pipelines (starting from MongoDB 4.2), and take advantage of the $toLong operator, which converts a Date into the number of milliseconds since the epoch.
Also use a $type check in the update filter to limit the update to documents that have a timeCreated field of the Date type (BSON type 9):
db.reports.updateMany(
{ 'timeCreated': {
'$exists': true,
'$type': 9
} },
[
{ '$set': {
'timestampCreated': { '$toLong': '$timeCreated' }
} }
]
)