Azure CosmosDB - update the property name - mongodb

I am trying to update a property name in the JSON of a MongoDB document:
{
    "_id" : ObjectId("1234556789"),
    "apps" : [
        {
            "_id" : 101,
            "regions" : [
                "WANAE",
                "WANAF"
            ]
        },
        {
            "_id" : 102,
            "regions" : [
                "WANAE",
                "WANAF"
            ]
        }
    ]
}
In the above JSON, I want to change apps regions to codes. I tried the queries below but they did not work:
db.packs.updateMany( {}, { $rename: { 'apps.$.regions': 'apps.$.codes' } } );
db.packs.updateMany( {}, { $rename: { 'apps.$[].regions': 'apps.$[].codes' } } );
Any help would be appreciated.
Update: As Joe suggested, I have an aggregation that produces the documents with the changes I need, and I tried updating the entire collection with the aggregated result like below:
db.packs.aggregate([
    {
        $addFields: {
            apps: {
                $map: {
                    input: "$apps",
                    as: "app",
                    in: {
                        _id: "$$app._id",
                        did: "$$app.did",
                        name: "$$app.name",
                        codes: "$$app.regions"
                    }
                }
            }
        }
    },
    {
        $project: {
            "apps.regions": 0
        }
    },
    {
        $out: "packs"
    }
])
As per the documentation, $out should replace the existing collection if it exists, but I received an error saying I have to supply a new collection name: Please supply a collection that does not already exist to the $out stage. Doesn't $out replace the existing packs with the new aggregated results?

When you reference a field in an array of objects, like "$apps.regions", the value is an array containing all of the values of that field from all of the elements.
If you set the value of regions directly, each subdocument will contain an array of arrays, which is probably not what you want.
Renaming the field across the entire array of objects will require iterating the array, perhaps with $map or $reduce.
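For example, on the sample document from the question, referencing the field directly yields the nested array rather than a per-element value (a quick illustration only; the directReference field name here is just for demonstration):
db.packs.aggregate([
    { $project: { directReference: "$apps.regions" } }
])
// returns something like:
// { "_id" : ObjectId("1234556789"), "directReference" : [ [ "WANAE", "WANAF" ], [ "WANAE", "WANAF" ] ] }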
If you are using MongoDB 4.2, you can do that with a pipeline in an update:
db.packs.updateMany( {}, [
    { $set: {
        "apps": {
            $map: {
                input: "$apps",
                in: {
                    $mergeObjects: [
                        "$$this",
                        { codes: "$$this.regions" }
                    ]
                }
            }
        }
    } },
    { $unset: "apps.regions" }
])
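After that update, the sample document should look roughly like this (codes copied from regions, regions removed):
{
    "_id" : ObjectId("1234556789"),
    "apps" : [
        { "_id" : 101, "codes" : [ "WANAE", "WANAF" ] },
        { "_id" : 102, "codes" : [ "WANAE", "WANAF" ] }
    ]
}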
If you are using an earlier version, you'll need to do that with aggregation, perhaps with $out, and then replace the original collection with the updated one.
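A rough sketch of that approach, writing to a temporary collection first (the packs_renamed name is just an example, and whether renameCollection is available depends on your deployment; CosmosDB in particular rejects $out to an existing collection, per the question):
db.packs.aggregate([
    { $addFields: {
        apps: {
            $map: {
                input: "$apps",
                as: "app",
                in: { $mergeObjects: [ "$$app", { codes: "$$app.regions" } ] }
            }
        }
    } },
    { $project: { "apps.regions": 0 } },
    { $out: "packs_renamed" }
])
// then swap the collections, e.g.:
// db.packs.drop();
// db.packs_renamed.renameCollection("packs");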

Related

How to query mongodb to fetch results based on values nested parameters?

I am working with MongoDB for the first time.
I have a collection in which each document is roughly of the following form:
{
    "name": [
        {
            "value": "abc",
            "created_on": "2020-02-06 06:11:21.340611+00:00"
        },
        {
            "value": "xyz",
            "created_on": "2020-02-07 06:11:21.340611+00:00"
        }
    ],
    "score": [
        {
            "value": 12,
            "created_on": "2020-02-06 06:11:21.340611+00:00"
        },
        {
            "value": 13,
            "created_on": "2020-02-07 06:11:21.340611+00:00"
        }
    ]
}
How do I form a query so that I get the latest updated values of each field in the given document? I went through Query Embedded Documents, but I wasn't able to figure out how to do it.
My expected output is:
{
"name": "xyz",
"score": "13"
}
If you always push new/latest values to the name & score arrays, then you can try the query below. It gets the last element from each array, since in general new/latest values will be added as the last element of an array:
db.collection.aggregate([
{ $addFields: { name: { $arrayElemAt: ['$name', -1] }, score: { $arrayElemAt: ['$score', -1] } } },
{ $addFields: { name: '$name.value', score: '$score.value' } }])
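For the sample document above, that should return roughly the following (score comes back as a number, since that is how it is stored; add a $project stage if you want to drop _id):
{ "_id" : ObjectId("..."), "name" : "xyz", "score" : 13 }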
Test : MongoDB-Playground

How to get all subdocuments _id into variable

I'm trying to get the families subdocuments' _ids into a variable.
Here is my schema:
families: [
    {
        _id: {
            type: mongoose.Types.ObjectId
        },
        name: {
            type: String
        },
        relation: {
            type: String
        }
    }
]
The problem is, I can get the parent's _id into a variable, but when I try to get the families _ids it shows undefined in the console log.
What is the proper query to get the families subdocuments' _ids into a variable?
Please try this :
db.yourCollection.aggregate([
    { $unwind: '$families' },
    { $project: { Ids: '$families._id' } },
    { $group: { '_id': '$_id', subDocumentsIDs: { $push: '$Ids' } } }
])
Output:
/* 1 */
{
    "_id" : ObjectId("5d58d3205a0d22d3c85d16f1"),
    "subDocumentsIDs" : [
        ObjectId("5d570b350e2fb4f72533d512"),
        ObjectId("5d570b350e2fb4f71533d510"),
        ObjectId("5d570b350e2fb4172533d511")
    ]
}
/* 2 */
{
    "_id" : ObjectId("5d58d3105a0d22d3c85d1591"),
    "subDocumentsIDs" : [
        ObjectId("5d570b350e2fb4f72533d312"),
        ObjectId("5d570b350e2fb4f71533d310"),
        ObjectId("5d570b350e2fb4172533d311")
    ]
}
Please consider this a basic example and enhance it as needed. Something like $unwind as an early stage can have a performance impact if your collection holds a large dataset, but you can easily avoid that by using $match as the first stage: since you said you're able to get the parent _id, use it in $match to filter the documents, as shown below.
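A sketch of that variation (the parent _id here is just the example value from the output above):
db.yourCollection.aggregate([
    { $match: { _id: ObjectId("5d58d3205a0d22d3c85d16f1") } },
    { $unwind: '$families' },
    { $group: { '_id': '$_id', subDocumentsIDs: { $push: '$families._id' } } }
])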

Mongo - finding records with keys containing dots

Mongo does not allow documents to have dots in their keys (see MongoDB dot (.) in key name or https://softwareengineering.stackexchange.com/questions/286922/inserting-json-document-with-in-key-to-mongodb ).
However we have a huge mongo database where some documents do contain dots in their keys. These documents are of the form:
{
    "_id" : NumberLong(2761632),
    "data" : {
        "field.with.dots" : { ... }
    }
}
I don't know how these records got inserted. I suspect that we must have had the check_keys mongod option set to false at some point.
My goal is to find the offending documents, to update them and remove the dots. I haven't found how to perform the search query. Here is what I tried so far:
db.collection.find({"data.field.with.dots" : { $exists : true }})
db.collection.find({"data.field\uff0ewith\uff0edots" : { $exists : true}})
You can use $objectToArray to get your data in the form of keys and values. Then you can use $filter with $indexOfBytes to check whether there are any keys with a . inside them. In the next step you can use $size to filter out those documents where the remaining array is empty (no fields with dots). Try:
db.col.aggregate([
    {
        $addFields: {
            dataKv: {
                $filter: {
                    input: { $objectToArray: "$data" },
                    cond: {
                        $ne: [ { $indexOfBytes: [ "$$this.k", "." ] }, -1 ]
                    }
                }
            }
        }
    },
    {
        $match: {
            $expr: {
                $ne: [ { $size: "$dataKv" }, 0 ]
            }
        }
    },
    {
        $project: {
            dataKv: 0
        }
    }
])
Mongo playground
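Once the offending documents are found, a possible follow-up for the update step (just a sketch, not part of the answer above; it replaces each dot with an underscore, which is only one way of renaming the keys):
db.col.find({}).forEach(function (doc) {
    var newData = {};
    var changed = false;
    Object.keys(doc.data || {}).forEach(function (key) {
        var newKey = key.replace(/\./g, "_");
        if (newKey !== key) { changed = true; }
        newData[newKey] = doc.data[key];
    });
    if (changed) {
        // rewrite the whole data sub-document without the dotted keys
        db.col.update({ _id: doc._id }, { $set: { data: newData } });
    }
});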

Reverse array field in MongoDB

I have a collection with a location field that was entered in the wrong order:
location: [38.7633698, -121.2697997]
When I try to place a 2d index on the field using ...
db.collection.ensureIndex({'location': '2d'});
... I get the following error because the latitude and longitude are reversed.
"err" : "location object expected, location array not in correct format",
"code" : 13654
How can I reverse this array for each document in the mongo shell?
db.loc.find().forEach(function (doc) {
var loc = [ doc.location[1], doc.location[0] ];
db.loc.update(doc, { $set: { location: loc } });
})
Starting from MongoDB 3.4 we can use the $reverseArray operator to do this beautifully.
Reverse the array:
db.collection.aggregate(
[
{ "$project": { "location": { "$reverseArray": "$location" } } }
]
)
which yields:
{
"_id" : ObjectId("576fdc687d33ed2f37a6d527"),
"location" : [ -121.2697997, 38.7633698 ]
}
Update all documents
To update all the documents in your collection, you have a couple of options.
The first is to add a $out stage to your pipeline and replace the old collection. In this case, you will need to explicitly include all the other fields in the $project stage. The $out stage looks like this:
{ "$out": "collection" }
Use $push to reverse the array, using its $each & $sort functionality:
db.getCollection('locations').updateMany({},
{
$push: {
'geo_point.coordinates': { $each: [ ], $sort: -1 }
}
});

MongoDB concatenate strings from two fields into a third field

How do I concatenate values from two string fields and put it into a third one?
I've tried this:
db.collection.update(
{ "_id": { $exists: true } },
{ $set: { column_2: { $add: ['$column_4', '$column_3'] } } },
false, true
)
which doesn't seem to work though, and throws not ok for storage.
I've also tried this:
db.collection.update(
{ "_id": { $exists : true } },
{ $set: { column_2: { $add: ['a', 'b'] } } },
false, true
)
but even this shows the same error not ok for storage.
I want to concatenate only on the mongo server and not in my application.
You can use aggregation operators $project and $concat:
db.collection.aggregate([
{ $project: { newfield: { $concat: [ "$field1", " - ", "$field2" ] } } }
])
Unfortunately, MongoDB currently does not allow you to reference the existing value of any field when performing an update(). There is an existing Jira ticket to add this functionality: see SERVER-1765 for details.
At present, you must do an initial query in order to determine the existing values, and do the string manipulation in the client. I wish I had a better answer for you.
You could use $set like this in 4.2, which supports aggregation pipelines in update.
db.collection.update(
{"_id" :{"$exists":true}},
[{"$set":{"column_2":{"$concat":["$column_4","$column_3"]}}}]
)
Building on the answer from #rebe100x, as suggested by #Jamby ...
You can use $project, $concat and $out (or $merge) in an aggregation pipeline.
https://docs.mongodb.org/v3.0/reference/operator/aggregation/project/
https://docs.mongodb.org/manual/reference/operator/aggregation/concat/
https://docs.mongodb.com/manual/reference/operator/aggregation/out/
For example:
db.collection.aggregate(
[
{ $project: { newfield: { $concat: [ "$field1", " - ", "$field2" ] } } },
{ $out: "collection" }
]
)
With MongoDB 4.2...
MongoDB 4.2 adds the $merge pipeline stage which offers selective replacement of documents within the collection, while $out would replace the entire collection. You also have the option of merging instead of replacing the target document.
db.collection.aggregate(
    [
        { $project: { newfield: { $concat: [ "$field1", " - ", "$field2" ] } } },
        { $merge: { into: "collection", on: "_id", whenMatched: "merge", whenNotMatched: "discard" } }
    ]
)
You should consider the trade-offs between performance, concurrency and consistency, when choosing between $merge and $out, since $out will atomically perform the collection replacement via a temporary collection and renaming.
https://docs.mongodb.com/manual/reference/operator/aggregation/merge/
https://docs.mongodb.com/manual/reference/operator/aggregation/merge/#merge-out-comparison
In my case this $concat worked for me:
db.collection.update(
    { "_id" : { "$exists" : true } },
    [
        {
            "$set" : {
                "column_2" : { "$concat" : [ "$column_4", "$column_3" ] }
            }
        }
    ]
)
Let's suppose that you have a collection named "myData" where you have data like this:
{
"_id":"xvradt5gtg",
"first_name":"nizam",
"last_name":"khan",
"address":"H-148, Near Hero Show Room, Shahjahanpur",
}
and you want to concatenate the fields (first_name + last_name + address) and save the result into the "address" field, like this:
{
"_id":"xvradt5gtg",
"first_name":"nizam",
"last_name":"khan",
"address":"nizam khan,H-148, Near Hero Show Room, Shahjahanpur",
}
Now the query will be:
{
    var x = db.myData.find({ _id: "xvradt5gtg" });
    x.forEach(function (d) {
        var first_name = d.first_name;
        var last_name = d.last_name;
        var _add = d.address;
        var fullAddress = first_name + " " + last_name + "," + _add;
        // you can print also
        print(fullAddress);
        // update
        db.myData.update({ _id: d._id }, { $set: { address: fullAddress } });
    })
}
You can also do the following:
db.collectionName.find({}).forEach(function(row) {
    row.newField = row.field1 + "-" + row.field2;
    db.collectionName.save(row);
});
Find and Update Each Using For Loop
Try This:
db.getCollection('users').find({ }).forEach( function(user) {
user.full_name = user.first_name + " " + user.last_name;
db.getCollection('users').save(user);
});
Or Try This:
db.getCollection('users').find({ }).forEach( function(user) {
db.getCollection('users').update(
{ _id: user._id },
{ $set: { "full_name": user.first_name + " " + user.last_name } }
)
});