MongoDB update for an array of objects

I have the following document in MongoDB:
{
    "_id" : ObjectId("521aff65e4b06121b688f076"),
    "uuid" : "160597270101684",
    "sessionId" : "160597270101684.1",
    "stamps" : {
        "currentVisit" : "1377500985",
        "lastVisit" : "1377500985"
    },
    "visits" : [
        {
            "page" : "google.com",
            "method" : "GET"
        }
    ]
}
Requirement:
If the uuid and sessionId are not present, I will insert the document as above; otherwise I only have to push the new object onto the visits array.
Any help would be appreciated.

MongoDB supports an upsert option on update that updates the matching document if it exists, and inserts a new document if it doesn't exist. In MongoDB 2.4+ you can use the $setOnInsert operator to further tweak this to set certain fields only if the upsert performs an insert.
db.test.update({
    uuid: "160597270101684",
    sessionId: "160597270101684.1"
}, {
    $setOnInsert: {
        stamps: {
            currentVisit: "1377500985",
            lastVisit: "1377500985"
        }
    },
    $push: {
        visits: {
            page: "google.com",
            method: "GET"
        }
    }
}, { upsert: true })
So in the above example, the $push to visits will always occur but the $setOnInsert to stamps will only occur if the matching document doesn't already exist.
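For instance, here is a rough sketch of running the same upsert a second time with a different visit (the yahoo.com entry is purely illustrative): the document now exists, so $setOnInsert is skipped and only the new visit is pushed.
db.test.update({
    uuid: "160597270101684",
    sessionId: "160597270101684.1"
}, {
    $setOnInsert: {
        stamps: {
            currentVisit: "1377500985",
            lastVisit: "1377500985"
        }
    },
    $push: {
        visits: {
            page: "yahoo.com",   // illustrative second visit
            method: "GET"
        }
    }
}, { upsert: true })
// visits now holds both the google.com and yahoo.com entries,
// while stamps is left untouched because the document already existed.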

You can achieve this with the following upsert query:
db.session.update(
    { "uuid" : "160597270101684", "sessionId" : "160597270101684.1" },
    {
        $set : { "stamps" : { "currentVisit" : "1377500985", "lastVisit" : "1377500985" } },
        $push : { "visits" : { "page" : "yahoo.com", "method" : "POST" } }
    },
    { upsert : true })
You can use $addToSet instead of $push if you want to avoid duplicates.
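For instance, a sketch of the same upsert using $addToSet; running it twice with an identical visit object will not add a duplicate entry to visits:
db.session.update(
    { "uuid" : "160597270101684", "sessionId" : "160597270101684.1" },
    {
        $set : { "stamps" : { "currentVisit" : "1377500985", "lastVisit" : "1377500985" } },
        $addToSet : { "visits" : { "page" : "yahoo.com", "method" : "POST" } }
    },
    { upsert : true })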

Related

Exclude find query fields while inserting a document using upsert:true

db.getCollection('placeFollow').update(
    {
        "_id" : ObjectId("5af19959204438676c0d5268"),
        "count" : { "$lt" : 2 }
    },
    {
        "$set" : { "data" : "check" }
    },
    { "upsert" : true })
Error: E11000 duplicate key error collection: geoFame.placeFollow
index: id dup key: { : ObjectId('5af19959204438676c0d5268') }
Indexes
[
{
"v" : 2,
"key" : {
"_id" : 1
},
"name" : "_id_",
"ns" : "geoFame.placeFollow"
}
]
I want to save the document only if it doesn't exist. But the above query tries to insert the _id given in the find query, and that throws a duplicate key error. How do I exclude the find-query fields while inserting the document?
The core problem is of course supplying the _id, but you need to actually set the "count" field as well. Using $setOnInsert is probably what you want:
db.getCollection('placeFollow').update(
{ "count":{"$lt":2} },
{
"$set":{ "data": "check" },
"$setOnInsert": { "count": 2 }
},
{"upsert":true}
)
So you need to do something like that with whatever "default" or otherwise expected data should be in there. Otherwise MongoDB does not find a document because the field is not there and attempts inserting with the same _id value as provided.
Because your code is currently incorrect, you first need to remove the document that was created without the "count" field:
db.getCollection('placeFollow').deleteOne(
{ "_id":ObjectId("5af19959204438676c0d5268") }
)
Either that, or set a default value of `count` on all existing documents.
But basically you cannot use both a "primary key" and another condition when using "upsert", because if the document does not meet the condition then an insert is attempted, and of course an existing value for _id must be unique.
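If you are on MongoDB 4.2 or later, one hedged alternative (a sketch, not from the original answer) is to filter by _id only, so the upsert can never collide, and move the count < 2 condition into an aggregation-pipeline update:
db.getCollection('placeFollow').updateOne(
    { "_id": ObjectId("5af19959204438676c0d5268") },
    [{
        "$set": {
            // only overwrite data when count (defaulting to 0 on insert) is below 2
            "data": { "$cond": [ { "$lt": [ { "$ifNull": [ "$count", 0 ] }, 2 ] }, "check", "$data" ] },
            "count": { "$ifNull": [ "$count", 0 ] }
        }
    }],
    { "upsert": true })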

db.collection.find returns multiple records after addToSet

I use the $addToSet operator (to add UpdatedData):
MyTable.update({ "UrlLink": "http://www.someurl.com"}, {
$addToSet: {
UpdatedData: {
TheDate: ThisDate,
NewInfo: ThisInfo
}
}
},
function (err, result) {
next(err,result)
}
);
But then, when I do the query (after UpdatedData is added to my document), I see that it returns two documents. Instead of only one updated document:
db.MyTable.find( {"UrlLink": "http://www.someurl.com"})
{ "_id" : ObjectId("56485922143a886f66c2f8bb"), "UrlLink" : "http://www.someurl.com", "Stuff" : "One", "UpdatedData" : [ { "TheDate" : "11/15/2015", "NewInfo" : "Info1", "_id" : ObjectId("5648599a71efc79c660f76d3") } ] }
{ "_id" : ObjectId("5648599f71efc79c660f76d4"), "UrlLink" : "http://www.someurl.com", "Stuff": "One", "UpdatedData" : [ ] }
So it seems that addToSet creates a new document with new _id, instead of updating the old record (ObjectId("5648599f71efc79c660f76d4")). But I only see the updated document in robomongo (ObjectId("56485922143a886f66c2f8bb")). Any ideas why this happens and how I could prevent that behaviour?
update without upsert cannot create a new document; it can only update existing ones.
This looks like you have created two documents with the same URL. Then, when you update, it just updates the first one.
To prevent the creation of a document with an already existing URL, you can create a unique index:
db.collection.createIndex({ UrlLink: 1 }, { unique: true })
This will prevent the creation of new documents with the same URL, and it will also make queries by UrlLink as fast as possible.
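Note that building the unique index will itself fail with a duplicate key error while two documents still share the same UrlLink, so as a sketch you would first remove the stray duplicate (the _id below is the one with the empty UpdatedData array from the find output above) and then create the index:
db.MyTable.deleteOne({ "_id" : ObjectId("5648599f71efc79c660f76d4") })
db.MyTable.createIndex({ UrlLink: 1 }, { unique: true })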

How to remove property of nested object from MongoDB document?

I have a MongoDB document like this :
{
"_id": ObjectId("5589044a7019e802d3e9dbc5"),
"sessionId": LUUID("f49d4280-ced0-9246-a3c9-a63e68e1ed45"),
"teamId": LUUID("6ef7d1a8-f842-a54c-bd8c-daf6481f9cfc"),
"variableId": LUUID("59d1b512-eee2-6c4b-a5b5-dda546872f55"),
"values": {
"725400": 691.0000000000000000,
"725760": 686.0000000000000000,
"726120": 683.0000000000000000,
"726480": 681.0000000000000000,
"726840": 679.0000000000000000,
"727200": 678.0000000000000000,
"727560": 677.0000000000000000,
"727920": 676.0000000000000000
},
"variableType": 2,
"isSet": false,
"teamNumber": 2,
"simPageIds": []
}
I have a scenario where I have to delete a particular property from the "values" property of my document. For example, I want to delete the key "727920" from "values".
Since "values" is not an array, I can't use $pull here. What I need is to remove
"727920" : 676.0000000000000000 from "values".
What is the right way to do that?
Use $unset as below :
db.collectionName.update({},{"$unset":{"values.727920":""}})
EDIT
For updating multiple documents use update options like :
db.collectionName.update({},{"$unset":{"values.727920":""}},{"multi":true})
You may try the following query using $unset
For single document update,
db.collectionName.update({ /* filter condition */ }, { $unset : { "ParentKey.ChildKey" : 1} })
For multiple documents update,
db.collectionName.updateMany({ /* filter condition */ }, { $unset : { "ParentKey.ChildKey" : 1} })

Casbah MongoDB, how to both add and remove values to an array in a single operation, to multiple documents?

After searching, I was unable to figure out how to perform multiple updates to a single field.
I have a document with a "tags" array field. Every document will have random tags before I begin the update. In a single operation, I want to add some tags and remove some tags.
The following update operator returns an error "Invalid modifier specified: $and"
updateOperators: { "$and" : [
{ "$addToSet" : { "tags" : { "$each" : [ "tag_1" , "tag_2"]}}},
{ "$pullAll" : { "tags" : [ "tag_2", "tag_3"]}}]}
collection.update(query, updateOperators, multi=true)
How do I both add and remove values to an array in a single operation, to multiple documents?
You don't need the $and in the update document, but you also cannot apply two update operators to the same field in a single update - as you would see if you tried the following in the shell:
db.test.update({}, { "$addToSet" : { "tags" : { "$each" : [ "tag_1" , "tag_2"]}},
"$pullAll" : { "tags" : [ "tag_2", "tag_3"] }}, true, false)
You would get a Cannot update 'tags' and 'tags' at the same time error message. So how do you achieve this? With this schema you would need to do it in multiple operations; you could use the bulk operation API as shown below (shell):
var bulk = db.coll.initializeOrderedBulkOp();
bulk.find({ "tags": 1 }).updateOne({ "$addToSet": { "tags": { "$each": [ "tag_1", "tag_2" ] } } });
bulk.find({ "tags": 1 }).updateOne({ "$pullAll": { "tags": [ "tag_2", "tag_3" ] } });
bulk.execute();
Or in Casbah with the dsl helpers:
val bulk = collection.initializeOrderedBulkOperation
bulk.find(MongoDBObject("tags" -> 1)).updateOne($addToSet("tags") $each("tag_1", "tag_2"))
bulk.find(MongoDBObject("tags" -> 1)).updateOne($pullAll("tags" -> ("tag_2", "tag_3")))
bulk.execute()
It's not atomic and there is no guarantee that nothing else will try to modify the document in between, but it is as close as you will currently get.
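In newer shells and drivers the same two-step update can also be written with bulkWrite; a sketch that keeps the { "tags": 1 } filter from the example above:
db.coll.bulkWrite([
    { updateOne: { filter: { "tags": 1 }, update: { "$addToSet": { "tags": { "$each": [ "tag_1", "tag_2" ] } } } } },
    { updateOne: { filter: { "tags": 1 }, update: { "$pullAll": { "tags": [ "tag_2", "tag_3" ] } } } }
], { ordered: true })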
Mongo does atomic updates so you could just construct the tags you want in the array and then replace the entire array.
I would advise against using an array to store these values all together, as this is an "unbounded" array of tags. Unbounded arrays cause document movement on disk, which forces indexes to be updated and makes the OS and mongo do extra work.
Instead you should store each tag as a separate document in a different collection and "bucket" them based on the _id of the related document.
Example
{ _id : <_id>, <key> : <value> } - a single document
This will allow you to query for all the tags for a single user with db.collection.find({_id : /^<_id>/}) and bucket the results.
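A rough sketch of that layout (the tags collection name, the doc123 id and the value field are illustrative, not from the original answer):
// one document per tag, with the related document's id as an _id prefix
db.tags.insertMany([
    { _id: "doc123:tag_1", value: "tag_1" },
    { _id: "doc123:tag_2", value: "tag_2" }
])
// fetch every tag belonging to the related document
db.tags.find({ _id: /^doc123/ })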

Append a string to the end of an existing field in MongoDB

I have a document with a field containing a very long string. I need to concatenate another string to the end of the string already contained in the field.
The way I do it now is that, from Java, I fetch the document, extract the string in the field, append the string to the end and finally update the document with the new string.
The problem: The string contained in the field is very long, which means that it takes time and resources to retrieve and work with this string in Java. Furthermore, this is an operation that is done several times per second.
My question: Is there a way to concatenate a string to an existing field, without having to fetch (db.<doc>.find()) the contents of the field first? In reality all I want is (field.contents += new_string).
I already made this work using Javascript and eval, but as I found out, MongoDB locks the database when it executes javascript, which makes the overall application even slower.
Starting in Mongo 4.2, db.collection.updateMany() can accept an aggregation pipeline, finally allowing the update of a field based on its current value:
// { a: "Hello" }
db.collection.updateMany(
{},
[{ $set: { a: { $concat: [ "$a", "World" ] } } }]
)
// { a: "HelloWorld" }
The first part {} is the match query, filtering which documents to update (in this case all documents).
The second part [{ $set: { a: { $concat: [ "$a", "World" ] } } }] is the update aggregation pipeline (note the square brackets signifying the use of an aggregation pipeline). $set (alias of $addFields) is a new aggregation operator which in this case replaces the field's value (by concatenating the current value of a with the suffix "World"). Note how a is modified directly based on its own value ($a).
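The same idea restricted to a single document, as a sketch reusing the { a: "Hello" } example above:
db.collection.updateOne(
    { a: "Hello" },
    [{ $set: { a: { $concat: [ "$a", "World" ] } } }]
)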
For example (this appends to the start, but it's the same story):
before
{ "_id" : ObjectId("56993251e843bb7e0447829d"), "name" : "London
City", "city" : "London" }
db.airports
.find( { $text: { $search: "City" } })
.forEach(
function(e, i){
e.name='Big ' + e.name;
db.airports.save(e);
}
)
after:
{ "_id" : ObjectId("56993251e843bb7e0447829d"), "name" : "Big London
City", "city" : "London" }
Old topic, but I had the same problem.
Since Mongo 2.4, you can use $concat from the aggregation framework.
Example
Consider these documents :
{
"_id" : ObjectId("5941003d5e785b5c0b2ac78d"),
"title" : "cov"
}
{
"_id" : ObjectId("594109b45e785b5c0b2ac97d"),
"title" : "fefe"
}
Append fefe to title field :
db.getCollection('test_append_string').aggregate(
[
{ $project: { title: { $concat: [ "$title", "fefe"] } } }
]
)
The result of aggregation will be :
{
"_id" : ObjectId("5941003d5e785b5c0b2ac78d"),
"title" : "covfefe"
}
{
"_id" : ObjectId("594109b45e785b5c0b2ac97d"),
"title" : "fefefefe"
}
You can then save the results back with a bulk operation; see this answer for that.
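For example, a sketch of writing the $concat results back with bulkWrite (assuming the result set fits in memory):
var docs = db.getCollection('test_append_string').aggregate([
    { $project: { title: { $concat: [ "$title", "fefe" ] } } }
]).toArray();

db.getCollection('test_append_string').bulkWrite(
    docs.map(function (doc) {
        // write each concatenated title back to its own document
        return { updateOne: { filter: { _id: doc._id }, update: { $set: { title: doc.title } } } };
    })
);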
This is a sample of one document I have:
{
"_id" : 1,
"s" : 1,
"ser" : 2,
"p" : "9919871172",
"d" : ISODate("2018-05-30T05:00:38.057Z"),
"per" : "10"
}
To append a string to any field, you can run a forEach loop through all documents and then update the desired field:
db.getCollection('jafar').find({}).forEach(function(el){
db.getCollection('jafar').update(
{p:el.p},
{$set:{p:'98'+el.p}})
})
This would not be possible.
One optimization you can do is create batches of updates.
i.e. fetch 10K documents, append the relevant strings to each of their keys,
and then save them as a single batch.
Most MongoDB drivers support batch operations.
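A sketch of that batched approach in the shell, with illustrative names (a docs collection, a text field and a literal suffix):
// read a batch, append the suffix in the application, then write all the
// changes back in a single bulkWrite round trip
var ops = db.docs.find({}).limit(10000).toArray().map(function (d) {
    return { updateOne: { filter: { _id: d._id }, update: { $set: { text: d.text + "_suffix" } } } };
});
db.docs.bulkWrite(ops);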
db.getCollection('<collection>').update(
    // query
    {},
    // update: an aggregation pipeline (MongoDB 4.2+) is needed here, because a
    // plain $set cannot reference the field's current value
    [
        { $set: { <field>: { $concat: [ "$<field>", "<new string>" ] } } }
    ],
    // options
    {
        "multi" : true,   // update every document matching the query
        "upsert" : false  // do not insert a new document if nothing matches the query
    });