I am updating MongoDB documents using a cursor loop, but the owner field may not exist in the cursor results. With the query below, results.hasOwnProperty('owner') always seems to return true, and it throws an error that results.owner doesn't exist.
I even tried db.contact_coolection.find({"_id" : ObjectId("5b876144d87b4d06ecb571b8")}) --> this gives "ownerCRMContactId.$cond is not valid".
I think $cond is an aggregation operator; what should be used in this case?
db.contact_coolection.find().forEach(function(results) {
    print("Id: " + results._id);
    db.contact_coolection.update(
        { _id: results._id },
        { $set: {
            "ownerCRMContactId": {
                $cond: { if: results.hasOwnProperty('owner'), then: results.owner.$id, else: '' }
            }
        }}
    );
});
Below is the sample document that doesn't contain owner:
{
"_id" : ObjectId("5b876144d87b4d06ecb571b8"),
"_class" : "com.cfcf.crm.model.auth.CRMContact",
"crmId" : -09898,
"prefix" : "Mr",
"firstName" : "fghh",
"middleName" : "asdgasd",
"lastName" : "asdasd",
"suffix" : "asdassd",
"nickName" : "asdasd",
"gender" : "Male",
"age" : "0",
"dlNumber" : "0",
"height" : 0,
"weight" : 0,
"isSmoker" : false,
"deleted" : false,
"isEnabled" : false,
"externalSource" : [],
"familyMembers" : [],
"crmInfos" : [
{
"opportunityId" : "5b95fa6c28e76b60c0454a2e"
}
],
"createdDate" : ISODate("2018-08-30T03:15:16.181Z"),
"lastModifiedDate" : ISODate("2018-09-10T05:00:28.627Z"),
"createdBy" : "5b2f43e433d58d3e0cd15304",
"lastModifiedBy" : "5b2f43e433d58d3e0cd15304",
"owner" : {
"$ref" : "contact_coolection",
"$id" : ObjectId("5b2f43e433d58d3e0cd15303")
},
"roles" : [],
"links" : [
{
"linkId" : 0,
"linkTitle" : "testsLinkss",
"linkUrl" : "www.google.com"
}
],
"addresses" : [
{
"addressLine1" : "3rd Cross",
"addressLine2" : "Kensington Street",
"city" : "Birmingham",
"state" : "Alabama",
"country" : "US"
}
]
}
You can do it this way:
// forEach() returns nothing, so collect the promises by mapping over the fetched documents instead
const promises = db.contact_coolection.find().toArray().map(async (results) => {
    print("Id: " + results._id);
    await db.contact_coolection.update(
        { _id: results._id },
        { $set: {
            ownerCRMContactId: results.owner ? results.owner.$id : ''
        }}
    );
});
await Promise.all(promises);
Update
const promises = db.contact_coolection.find().toArray().map(async (results) => {
    print("Id: " + results._id);
    await db.contact_coolection.update(
        { _id: results._id },
        { $set: {
            // guard against both a missing document and a missing owner field
            ownerCRMContactId: (results && results.owner) ? results.owner.$id : ''
        }}
    );
});
await Promise.all(promises);
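If you are running this directly in the mongo shell (where print() comes from), promises are not needed at all; a plain cursor forEach with the same JavaScript ternary is a minimal sketch of the same idea:

db.contact_coolection.find().forEach(function (results) {
    print("Id: " + results._id);
    db.contact_coolection.update(
        { _id: results._id },
        // a JavaScript ternary replaces $cond here, since $cond is an
        // aggregation operator and is not valid inside a plain update document
        { $set: { ownerCRMContactId: results.owner ? results.owner.$id : '' } }
    );
});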
Related
I can't quite figure out why I'm getting different results with these two queries:
> db.reference.find({"metadata.values": {address: {location: "barcelona"} } }).count();
0
> db.reference.find({"metadata.values.address.location": "barcelona"}).count();
1
What is the difference?
The document contained in the reference collection is:
{
"_id" : "Doc1Ref2",
"document" : "doc1",
"metadata" : [
{
"_id" : "Doc1Ref2Mdt1",
"user" : "user2",
"creationTimestamp" : ISODate("2018-09-24T12:20:56.169Z"),
"values" : {
"date" : ISODate("2018-09-24T12:20:56.171Z"),
"number" : 16,
"address" : {
"street" : "Av. Diagonal",
"location" : "barcelona"
},
"credentials" : [
{
"password" : "pwd",
"login" : "main"
},
{
"password" : "pwd",
"login" : "other",
"creation" : ISODate("2018-09-24T12:20:56.171Z")
}
],
"contact" : "contact name",
"tags" : [
"tag1",
"tag2"
]
}
}
],
"timestampCreation" : ISODate("2018-09-24T12:20:56.169Z")
}
The first query matches documents where metadata.values is exactly the object {address: {location: "barcelona"}}; the second matches documents where metadata.values has an address.location field equal to "barcelona".
The rough equivalents in JavaScript:
if ((document.metadata || {}).values == {address: {location: "barcelona"} })
and
if ((((document.metadata || {}).values || {}).address || {}).location == "barcelona")
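To make the exact-match behaviour concrete, here is a hedged shell example against the sample document above (the counts are what I would expect, not verified output); an exact embedded-document match only succeeds when the queried object has exactly the same fields, in the same order:

// Exact match on the embedded address fails, because the stored address also has "street"
db.reference.find({ "metadata.values.address": { location: "barcelona" } }).count()                           // 0
// Exact match with all fields in the stored order succeeds
db.reference.find({ "metadata.values.address": { street: "Av. Diagonal", location: "barcelona" } }).count()   // 1
// Dot notation only looks at the single nested field, regardless of its siblings
db.reference.find({ "metadata.values.address.location": "barcelona" }).count()                                // 1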
I have a JSON array with two keys per element: id and name. I need a way to insert that id into my collection (querys.project.name) whenever the name there and the name in the JSON match.
Example of my JSON:
var projectsMysql = [
{
"id" : 1,
"name" : "Something"
},
{
"id" : 5,
"name" : "Something else"
},
{
"id" : 50,
"name" : "Some name"
}]
and in my collection there are about 60 documents like this one:
{
"_id" : ObjectId("58e42bf30a34d641be6c25c2"),
"folio" : "R-666-69",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Something else"
// "projectsMysql_id" : 5 THIS SHOULD BE PLACED HERE
},
}
I wrote a function for Node.js, but now I need to run this directly in the mongo shell. I read that I need to use the cursor.forEach() method, but I couldn't find a way to do it. This is my function:
projectsMysql
.forEach(function(project){
var query = {
name: project.name
}
db.getCollection('requests')
.find(query)
.exec(function(err, response){
if (err) {
return
}
if (response) {
db.getCollection('requests')
.update({id: response.id}, {$set: {
'project.projectsMysql_id': project.id
}})
.exec(function(err){
console.log("element inserted");
});
}
});
});
Can you point me in the right direction?
What you want here is bulkWrite(). Instead of actually executing an update() for each document within the projectsMysql array, you just construct a "single" statement made of "multiple" updates, which is sent to the server and updates the appropriate documents:
db.getCollection('requests').bulkWrite(
projectsMysql.map(({ id, name }) =>
({ "updateOne": {
"filter": { "project.name": name },
"update": { "$set": { "project.id": id } }
}})
)
)
If you expect "multiple documents" to match the condition, then simply switch to updateMany:
db.getCollection('requests').bulkWrite(
projectsMysql.map(({ id, name }) =>
({ "updateMany": {
"filter": { "project.name": name },
"update": { "$set": { "project.id": id } }
}})
)
)
Your array is already in memory, so there is not really any point in doing any other kind of "iteration"; you can simply .map() the properties onto the updateOne statements (or updateMany) and issue them all in one statement. Updates are only actually processed where there is a "match" and where there is something to actually update, and existing values are left alone since $set only touches the named field.
To demonstrate, consider these documents:
{
"folio" : "R-666-69",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Something else"
}
},
{
"folio" : "R-666-67",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Some name"
}
},
{
"folio" : "R-666-68",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Some name"
}
},
{
"folio" : "R-666-64",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Something"
}
},
{
"folio" : "R-666-65",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Different"
}
}
Then the following update:
var projectsMysql = [
{
"id" : 1,
"name" : "Something"
},
{
"id" : 5,
"name" : "Something else"
},
{
"id" : 50,
"name" : "Some name"
}]
db.getCollection('requests').bulkWrite(
projectsMysql.map(({ id, name }) =>
({ "updateMany": {
"filter": { "project.name": name },
"update": { "$set": { "project.id": id } }
}})
)
)
Returns the response:
{
"acknowledged" : true,
"deletedCount" : 0,
"insertedCount" : 0,
"matchedCount" : 4,
"upsertedCount" : 0,
"insertedIds" : {
},
"upsertedIds" : {
}
}
And alters the matched documents accordingly:
{
"_id" : ObjectId("5b206a48f7fa0c655d90157a"),
"folio" : "R-666-69",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Something else",
"id" : 5
}
}
{
"_id" : ObjectId("5b206a48f7fa0c655d90157b"),
"folio" : "R-666-67",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Some name",
"id" : 50
}
}
{
"_id" : ObjectId("5b206a48f7fa0c655d90157c"),
"folio" : "R-666-68",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Some name",
"id" : 50
}
}
{
"_id" : ObjectId("5b206a48f7fa0c655d90157d"),
"folio" : "R-666-64",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Something",
"id" : 1
}
}
{
"_id" : ObjectId("5b206a48f7fa0c655d90157e"),
"folio" : "R-666-65",
"alias_purchase" : "Deal",
"project" : {
"description" : "",
"name" : "Different"
}
}
I have example data:
"_id" : ObjectId("5694ba11b3957b7ff69c4547"),
"name" : "Okas 1",
"job" : {
"name" : "job try1",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5c")
},
"categories" : {
"ss" : [
{
"name" : "10",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5c")
},
{
"name" : "50",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5f")
}
]
}
If I update it with this new data:
[{
"name" : "800",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5a")
},
{
"name" : "8",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5f")
}]
I should get this data:
"_id" : ObjectId("5694ba11b3957b7ff69c4547"),
"name" : "Okas 1",
"job" : {
"name" : "job try1",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5c")
},
"categories" : {
"ss" : [
{
"name" : "10",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5c")
},
{
"name" : "8",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5f")
},
{
"name" : "800",
"_id" : ObjectId("5a6ff9f7a336e3bba40a1d5a")
}
]
}
So I want to update the value if the element already exists, but add a new element into the categories array if it does not. I tried, but it is not working; the results are not appropriate.
You can do something like this:
model.findOne({_id:req.params.id}, (err, data) => {
if (err) throw err;
if(!data) {
var newData = new model({
name: req.body.name
.....
})
newData.save((err, newdata) => {
// Response
})
} else {
data.name = req.body.name
data.save((err, data) => {
// Response
})
    }
})
I hope this is what you want.
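Since the question is about updating an element of categories.ss in place, or pushing it when it is missing, here is a hedged mongo-shell sketch of that update-or-push pattern. The collection name items is an assumption for illustration, and the sketch relies on the legacy shell's update() returning a WriteResult with nMatched:

var newData = [
    { "name" : "800", "_id" : ObjectId("5a6ff9f7a336e3bba40a1d5a") },
    { "name" : "8",   "_id" : ObjectId("5a6ff9f7a336e3bba40a1d5f") }
];
newData.forEach(function (item) {
    // try to update the matching array element in place
    var res = db.items.update(
        { "_id": ObjectId("5694ba11b3957b7ff69c4547"), "categories.ss._id": item._id },
        { "$set": { "categories.ss.$.name": item.name } }
    );
    // no element with that _id yet, so append it instead
    if (res.nMatched === 0) {
        db.items.update(
            { "_id": ObjectId("5694ba11b3957b7ff69c4547") },
            { "$push": { "categories.ss": item } }
        );
    }
});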
I'm trying to do a migration in my MongoDB. I have renamed a field from Content -> StringContent. Now I want to update all existing records to use the new field name.
This is what a document looks like:
{
"_id" : "c4af0b19-4c78-4e58-bbe5-ac9e5cce2c3f",
"Type" : "Onboarding",
"Cards" : [
{
"_id" : LUUID("3f328a1c-658d-ee4e-8f06-561760eb5be5"),
"Width" : 1,
"Title" : "",
"Type" : "Freetext",
"Description" : "",
"Content" : "This is a test" // -> "StringContent" : "This is a Test"
},
{
"_id" : LUUID("2f328a1c-658d-ee4e-8f06-561760eb5be5"),
"Width" : 1,
"Title" : "",
"Type" : "Freetext",
"Description" : "",
"Content" : "This is another test" //-> "StringContent" : "This is another Test"
}
],
"DocumentType" : "Template",
"History" : [
{
"Date" : ISODate("2017-07-13T12:03:01.620Z"),
"ByUserId" : LUUID("4ecaa6ca-2ce6-f84c-81f3-28f8f0256e6e")
}
],
"Name" : "Default Template"
}
I have created this script:
var bulk = db.getCollection('OnboardingPortal').initializeOrderedBulkOp(),
count = 0;
db.getCollection('OnboardingPortal').find({"DocumentType": "Template"}).
forEach(function(doc) {
doc.Cards.forEach(function(card) {
if(card.hasOwnProperty("Content")){
print(card);
bulk.find({"_id": doc._id, "Cards._id": card._id}).update(
{
$set: {"Cards.$.StringContent": card.Content}
});
bulk.find({"_id": doc._id, "Cards._id": card._id}).update(
{
$unset: {"Cards.$.Content": 1}
});
count += 2;
if(count % 500 == 0) {
bulk.execute();
bulk = db.getCollection('OnboardingPortal').initializeOrderedBulkOp();
}
}
});
});
if ( count % 500 !== 0 ){
bulk.execute();
}
This does not update anything, but if I change the bulk operations to use an explicit index into the array, like this, it does the job, but only for one card:
bulk.find({"_id": doc._id, "Cards.1._id": card._id}).update(
{
$set: {"Cards.1.StringContent": card.Content}
});
bulk.find({"_id": doc._id, "Cards.1._id": card._id}).update(
{
$unset: {"Cards.1.Content": 1}
});
What am I missing in my script so that it iterates over several documents and changes Content -> StringContent in each card?
EDIT
I have added bulk.getOperations() to my script; this is what it returns. Should it not have replaced $ with the index?
/* 1 */
[
{
"originalZeroIndex" : 0.0,
"batchType" : 2.0,
"operations" : [
{
"q" : {
"_id" : "c4af0b19-4c78-4e58-bbe5-ac9e5cce2c3f",
"Cards._id" : "1c8a323f-8d65-4eee-8f06-561760eb5be5"
},
"u" : {
"$set" : {
"Cards.$.StringContent" : "This is a cool test"
}
},
"multi" : false,
"upsert" : false
},
{
"q" : {
"_id" : "c4af0b19-4c78-4e58-bbe5-ac9e5cce2c3f",
"Cards._id" : "1c8a323f-8d65-4eee-8f06-561760eb5be5"
},
"u" : {
"$unset" : {
"Cards.$.Content" : 1.0
}
},
"multi" : false,
"upsert" : false
},
{
"q" : {
"_id" : "c4af0b19-4c78-4e58-bbe5-ac9e5cce2c3f",
"Cards._id" : "1c8a322f-8d65-4eee-8f06-561760eb5be5"
},
"u" : {
"$set" : {
"Cards.$.StringContent" : "This is a test"
}
},
"multi" : false,
"upsert" : false
},
{
"q" : {
"_id" : "c4af0b19-4c78-4e58-bbe5-ac9e5cce2c3f",
"Cards._id" : "1c8a322f-8d65-4eee-8f06-561760eb5be5"
},
"u" : {
"$unset" : {
"Cards.$.Content" : 1.0
}
},
"multi" : false,
"upsert" : false
}
]
}
]
I am trying to port an existing SQL schema into Mongo.
We have document tables where the same document sometimes appears several times, with a different revision but the same reference. I want to get only the latest revision of each document.
Sample input data:
{
"Uid" : "xxx",
"status" : "ACCEPTED",
"reference" : "DOC305",
"code" : "305-D",
"title" : "Document 305",
"creationdate" : ISODate("2011-11-24T15:13:28.887Z"),
"creator" : "X"
},
{
"Uid" : "xxx",
"status" : "COMMENTED",
"reference" : "DOC306",
"code" : "306-A",
"title" : "Document 306",
"creationdate" : ISODate("2011-11-28T07:23:18.807Z"),
"creator" : "X"
},
{
"Uid" : "xxx",
"status" : "COMMENTED",
"reference" : "DOC306",
"code" : "306-B",
"title" : "Document 306",
"creationdate" : ISODate("2011-11-28T07:26:49.447Z"),
"creator" : "X"
},
{
"Uid" : "xxx",
"status" : "ACCEPTED",
"reference" : "DOC501",
"code" : "501-A",
"title" : "Document 501",
"creationdate" : ISODate("2011-11-19T06:30:35.757Z"),
"creator" : "X"
},
{
"Uid" : "xxx",
"status" : "ACCEPTED",
"reference" : "DOC501",
"code" : "501-B",
"title" : "Document 501",
"creationdate" : ISODate("2011-11-19T06:40:32.957Z"),
"creator" : "X"
}
Given this data, I want this result set (sometimes I want only the last revision, sometimes I want all revisions with an attribute telling me whether it's the latest):
{
"Uid" : "xxx",
"status" : "ACCEPTED",
"reference" : "DOC305",
"code" : "305-D",
"title" : "Document 305",
"creationdate" : ISODate("2011-11-24T15:13:28.887Z"),
"creator" : "X",
"lastrev" : true
},
{
"Uid" : "xxx",
"status" : "COMMENTED",
"reference" : "DOC306",
"code" : "306-B",
"title" : "Document 306",
"creationdate" : ISODate("2011-11-28T07:26:49.447Z"),
"creator" : "X",
"lastrev" : true
},
{
"Uid" : "xxx",
"status" : "ACCEPTED",
"reference" : "DOC501",
"code" : "501-B",
"title" : "Document 501",
"creationdate" : ISODate("2011-11-19T06:40:32.957Z"),
"creator" : "X",
"lastrev" : true
}
I already have a bunch of filters, sorting, and skip/limit (for pagination of data), so the final result set should be mindful of these constraints.
The current "find" query (built with the .Net driver), which filters fine but gives me all revisions of each document:
coll.find(
{ "$and" : [
{ "$or" : [
{ "deletedid" : { "$exists" : false } },
{ "deletedid" : null }
] },
{ "$or" : [
{ "taskid" : { "$exists" : false } },
{ "taskid" : null }
] },
{ "objecttypeuid" : { "$in" : ["xxxxx"] } }
] },
{ "_id" : 0, "Uid" : 1, "lastrev" : 1, "title" : 1, "code" : 1, "creator" : 1, "owner" : 1, "modificator" : 1, "status" : 1, "reference": 1, "creationdate": 1 }
).sort({ "creationdate" : 1 }).skip(0).limit(10);
Using another question, I have been able to build this aggregation, which gives me the latest revision of each document, but it does not include enough attributes in the result:
coll.aggregate([
{ $sort: { "creationdate": 1 } },
{
$group: {
"_id": "$reference",
result: { $last: "$creationdate" },
creationdate: { $last: "$creationdate" }
}
}
]);
I would like to integrate the aggregation with the find query.
I have found a way to mix aggregation and filtering:
coll.aggregate(
[
{ $match: {
"$and" : [
{ "$or" : [
{ "deletedid" : { "$exists" : false } },
{ "deletedid" : null }
] },
{ "$or" : [
{ "taskid" : { "$exists" : false } },
{ "taskid" : null }
] },
{ "objecttypeuid" : { "$in" : ["xxx"] } }
]
}
},
{ $sort: { "creationdate": 1 } },
{ $group: {
"_id": "$reference",
"doc": { "$last": "$$ROOT" }
}
},
{ $sort: { "doc.creationdate": 1 } },
{ $skip: skip },
{ $limit: limit }
],
{ allowDiskUse: true }
);
For each result node, this gives me a "doc" node with the document data. It has too much data still (it's missing projections), but it's a start.
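One way to trim that down (a sketch on my part, not part of the original pipeline, assuming MongoDB 3.4+ for $replaceRoot) is to flatten the doc node and project only the wanted fields right after the $group stage; the .NET translation below does not include these extra stages:

coll.aggregate(
    [
        // ... same $match, $sort and $group stages as above ...
        { $replaceRoot: { newRoot: "$doc" } },
        { $project: { "_id": 0, "Uid": 1, "status": 1, "reference": 1, "code": 1,
                      "title": 1, "creationdate": 1, "creator": 1 } },
        // after $replaceRoot the sort key is plain creationdate again
        { $sort: { "creationdate": 1 } },
        { $skip: skip },
        { $limit: limit }
    ],
    { allowDiskUse: true }
);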
Translated into .NET:
FilterDefinitionBuilder<BsonDocument> filterBuilder = Builders<BsonDocument>.Filter;
FilterDefinition<BsonDocument> filters = filterBuilder.Empty;
filters = filters & (filterBuilder.Not(filterBuilder.Exists("deletedid")) | filterBuilder.Eq("deletedid", BsonNull.Value));
filters = filters & (filterBuilder.Not(filterBuilder.Exists("taskid")) | filterBuilder.Eq("taskid", BsonNull.Value));
foreach (var f in fieldFilters) {
filters = filters & filterBuilder.In(f.Key, f.Value);
}
var sort = Builders<BsonDocument>.Sort.Ascending(orderby);
var group = new BsonDocument {
{ "_id", "$reference" },
{ "doc", new BsonDocument("$last", "$$ROOT") }
};
var aggregate = coll.Aggregate(new AggregateOptions { AllowDiskUse = true })
.Match(filters)
.Sort(sort)
.Group(group)
.Sort(sort)
.Skip(skip)
.Limit(rows);
return aggregate.ToList();
I'm pretty sure there are better ways to do this, though.
Your answer is pretty close. Instead of $last, $max is better.
About the $last operator:
Returns the value that results from applying an expression to the last document in a group of documents that share the same group by a field. Only meaningful when documents are in a defined order.
To get the last revision in each group, see the code below for the mongo shell:
db.collection.aggregate([
{
$group: {
_id: '$reference',
doc: {
$max: {
"creationdate" : "$creationdate",
"code" : "$code",
"Uid" : "$Uid",
"status" : "$status",
"title" : "$title",
"creator" : "$creator"
}
}
}
},
{
$project: {
_id: 0,
Uid: "$doc.Uid",
status: "$doc.status",
reference: "$_id",
code: "$doc.code",
title: "$doc.title",
creationdate: "$doc.creationdate",
creator: "$doc.creator"
}
}
]).pretty()
The output, as you expect:
{
"Uid" : "xxx",
"status" : "ACCEPTED",
"reference" : "DOC501",
"code" : "501-B",
"title" : "Document 501",
"creationdate" : ISODate("2011-11-19T06:40:32.957Z"),
"creator" : "X"
}
{
"Uid" : "xxx",
"status" : "COMMENTED",
"reference" : "DOC306",
"code" : "306-B",
"title" : "Document 306",
"creationdate" : ISODate("2011-11-28T07:26:49.447Z"),
"creator" : "X"
}
{
"Uid" : "xxx",
"status" : "ACCEPTED",
"reference" : "DOC305",
"code" : "305-D",
"title" : "Document 305",
"creationdate" : ISODate("2011-11-24T15:13:28.887Z"),
"creator" : "X"
}
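One note worth adding (my observation, not part of the original answer): the $max trick works because MongoDB compares embedded documents field by field, in the order the fields appear, so creationdate has to stay the first field inside the $max document for the "greatest" sub-document to also be the latest revision. A minimal sketch of the part of the $group stage where that ordering matters:

db.collection.aggregate([
    {
        $group: {
            _id: "$reference",
            doc: {
                $max: {
                    "creationdate": "$creationdate",   // compared first, so it decides which revision wins
                    "code": "$code"                    // later fields only break ties on creationdate
                }
            }
        }
    }
])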