MongoDB C# Driver Update Collection with Concatenated string

How do I convert this SQL to a MongoDB query using the C# driver?
UPDATE dbo.MyTable SET ConcatField = CONCAT(Field1, Field2, Field3, Field4, Field5)
WHERE Id = 21
I am using MongoDB.Driver 2.2.3.3.
I need the MongoDB query using the BsonDocument type; I don't have strong types for my Mongo collections, as the collection is not based on a fixed schema.
I am trying something like this:
var items = myCollection.FindSync(filter).ToList();
foreach (var item in items)
{
    UpdateDefinition<BsonDocument> updateDefinition =
        new BsonDocumentUpdateDefinition<BsonDocument>(item.Merge(ListOfStrinForSelectedFields.ToBsonDocument()));
    myCollection.UpdateManyAsync(filter, updateDefinition);
}

This would be my shell script:
var cursor = db.MyCollection.find({ "Id": 21 }), // or whatever your find condition is
    bulkUpdateOps = [];
cursor.forEach(function(doc) {
    var ConcatField = doc.Field1 + doc.Field2 + doc.Field3 + doc.Field4 + doc.Field5;
    bulkUpdateOps.push({
        "updateOne": {
            "filter": { "_id": doc._id },
            "update": { "$set": { "ConcatField": ConcatField } }
        }
    });
    if (bulkUpdateOps.length === 1000) {
        db.MyCollection.bulkWrite(bulkUpdateOps);
        bulkUpdateOps = [];
    }
});
if (bulkUpdateOps.length > 0) { db.MyCollection.bulkWrite(bulkUpdateOps); }
Then execute it in C# with the RunCommandAsync method of MongoDatabase:
var result = await mongoDatabase.RunCommandAsync<BsonDocument>(BsonDocument.Parse(command));
Note: you will have to build the command string accordingly (e.g. using pipelines) and parse it into a BsonDocument.
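As an alternative to the read-then-write loop, on MongoDB 4.2+ the concatenation can be done entirely server-side with an aggregation-pipeline update and $concat (this assumes Field1 through Field5 are all strings; $concat fails on non-string values). The helper below is only a sketch that builds the pipeline document; the field and collection names are taken from the question.

```javascript
// Build an aggregation-pipeline update that concatenates string fields
// server-side (requires MongoDB 4.2+; all fields assumed to be strings).
function buildConcatUpdate(fields, targetField) {
  return [
    { $set: { [targetField]: { $concat: fields.map(f => "$" + f) } } }
  ];
}

const pipeline = buildConcatUpdate(
  ["Field1", "Field2", "Field3", "Field4", "Field5"],
  "ConcatField"
);
// In the shell: db.MyCollection.updateMany({ Id: 21 }, pipeline)
```

The same pipeline document can be passed from the C# driver as a BsonDocument array, avoiding any client-side iteration.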

Related

Build predicates for a postgres jsonb column with criteria builder exact match using JPA criteria

private void teamsCriteria(Root<Employee> root, CriteriaBuilder criteriaBuilder, List<Predicate> predicates) {
    var teamsPredicateArr = new Predicate[filters.getTeams().size()];
    for (var i = 0; i < filters.getTeams().size(); i++) {
        teamsPredicateArr[i] = criteriaBuilder.like(
                criteriaBuilder.concat(root.get("teams"), "\\:\\:text"),
                "%" + filters.getTeams().get(i) + "%");
    }
    var predicate = criteriaBuilder.or(teamsPredicateArr);
    predicates.add(criteriaBuilder.and(predicate));
}
Example: I have a jsonb column teams:
{
    "team": [
        "DEFAULT"
    ]
}
{
    "team": [
        "EF"
    ]
}
If I execute the above code I get both teams.
I want an exact match on the jsonb column value.
Expected result: only "EF" should be matched.

Not iterable when using find ObjectId

I'm trying to find a certain document in my MongoDB and then update an int value in it using a find query. I'm using $in because I used an array to find each element inside it, but when I use an ObjectId it gives me this error:
bloodinventoryDocs is not iterable
Here is what I did:
var mongoose = require('mongoose');
var id = mongoose.Types.ObjectId('5c014c999cc48c3b0057988b');
var newValue = 1;
var newBloodgroup = "A_positive";
var newGetbloodcomponent = "Whole Blood";
Bloodinventory.find({ blood_component: { $in: newGetbloodcomponent }, blood_group: { $in: newBloodgroup }, chapter: { $in: id } }, function(err, bloodinventoryDocs) {
    for (let bloodinventory of bloodinventoryDocs) {
        bloodinventory.num_stock = bloodinventory.num_stock + newValue;
        bloodinventory.save(function(err) {
            if (err) {
                console.log(err);
            } else {
                console.log('success');
            }
        });
    }
});
Just use chapter: { $in: [id] }. The $in operator expects an array, so wrap the single ObjectId in one.
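To make the fix concrete, here is a minimal sketch of the corrected query object with every single value wrapped in an array for $in. The ObjectId is replaced by a plain string so the snippet is self-contained; in the real code it would be mongoose.Types.ObjectId('5c014c999cc48c3b0057988b').

```javascript
// $in takes an array, so wrap each single value in [ ... ].
const id = "5c014c999cc48c3b0057988b"; // placeholder for the ObjectId
const newBloodgroup = "A_positive";
const newGetbloodcomponent = "Whole Blood";

const query = {
  blood_component: { $in: [newGetbloodcomponent] },
  blood_group: { $in: [newBloodgroup] },
  chapter: { $in: [id] }
};
// Bloodinventory.find(query, callback) now yields an iterable array of docs.
```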

Mongo Db bulk update operation for data type conversion

How do I perform a bulk operation in MongoDB? I already have a script to change the data type, but it does not cope with a huge volume of data.
The collection 'store' has a field called 'priceList', an array whose elements have multiple fields, one of which is 'value'. Right now it is an integer, and I want to convert it to a custom record object.
Current schema:
store
- _id
- name [String]
- priceList [Array]
  - amount [Record] // {"unscaled": <value>, "scaled": <value>}
  - value [Integer]
I need to convert value to a [Record] in the format above.
For example, value: 2 will become value: {"unscaled": 2, "scaled": 0}
db.store.find({ priceList: { $exists: true } }).forEach(function(obj) {
    obj.priceList.forEach(function(y) {
        y.value = { "unscaled": NumberLong(y.value), "scaled": NumberInt(0) };
    });
    db.store.save(obj);
});
Thanks!!
You can try it like this:
db.store.find({
    priceList: { $exists: true }
}).forEach(function(myDoc) {
    var child = myDoc.priceList;
    for (var i = 0; i < child.length; i++) {
        var ob = child[i];
        if ('value' in ob) {
            ob.value = { "unscaled": NumberLong(ob.value), "scaled": NumberInt(0) };
            child[i] = ob;
        }
    }
    db.store.update(
        { _id: myDoc._id },
        { $set: { priceList: child } }
    );
});
Hope this helps (updated)!
db.store.find({ priceList: { $exists: true } })
    .toArray()
    .forEach(o => db.store.update(
        { _id: o._id },
        {
            $set: {
                priceList: o.priceList.map(l => Object.assign(l, {
                    amount: {
                        unscaled: l.value,
                        scaled: 0
                    }
                }))
            }
        }
    ));
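For genuinely large collections, the per-document update() calls above can be batched with bulkWrite so each round trip carries many operations. The sketch below only builds the updateOne operations from plain objects, so it needs no driver connection; the shell's NumberLong/NumberInt wrappers are omitted in this plain-JS version, which is an assumption for illustration.

```javascript
// Turn each matching document into one updateOne op that rewrites priceList,
// converting each integer `value` into the {unscaled, scaled} record shape.
function buildConversionOps(docs) {
  return docs.map(doc => ({
    updateOne: {
      filter: { _id: doc._id },
      update: {
        $set: {
          priceList: doc.priceList.map(item =>
            ('value' in item && typeof item.value === 'number')
              ? Object.assign({}, item, { value: { unscaled: item.value, scaled: 0 } })
              : item)
        }
      }
    }
  }));
}

const ops = buildConversionOps([
  { _id: 1, priceList: [{ value: 2 }] }
]);
// In the shell: db.store.bulkWrite(ops)
```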

How should I update documents, each with a different update data set, in MongoDB collections

I have MongoDB with three huge collections, say 'A', 'B' and 'C'.
Each collection contains about 2 million documents.
Each document has certain properties, and each document needs to be updated based on the values of those properties, from which I can determine what the '$set' for that document should be.
Currently I am using the same approach for each collection: find all documents in batches, collect them in memory (which I think is the culprit in the current approach), then update them one by one.
For the first collection (which has data similar to the other collections), it takes 10 minutes to complete; the next two collections take approximately 2 hours each, or the MongoDB client crashes before finishing.
Something is wrong and undesirable in the current approach.
Model.collection.find({}).batchSize(BATCH).toArray(function(err, docs) {
    if (err || !docs || !docs.length)
        return afterCompleteOneCollection(err);
    var spec = function(index) {
        if (index % 1000 === 0) console.log('at index : ' + index);
        var toSet = {};
        var toUnset = {};
        var over = function() {
            var afterOver = function(err) {
                if (err) return afterCompleteOneCollection(err);
                if (index < docs.length - 1) spec(index + 1);
                else afterCompleteOneCollection(null);
            };
            var sb = Object.keys(toSet).length;
            var ub = Object.keys(toUnset).length;
            if (sb || ub) {
                var all = {};
                if (sb) all.$set = toSet;
                if (ub) all.$unset = toUnset;
                Model.collection.update({ _id: docs[index]._id }, all, {}, afterOver);
            } else afterOver(null);
        };
        forEachOfDocument(docs[index], toSet, toUnset, over);
    };
    spec(0);
});
Is there any better solution for this?
The streaming approach from http://mongodb.github.io/node-mongodb-native/api-generated/cursor.html#stream worked for me.
This is what I am doing:
var stream = Model.collection.find().stream();
stream.on('data', function(data) {
    if (data) {
        var toSet = {};
        var toUnset = {};
        var over = function() {
            var afterOver = function(err) {
                if (err) console.log(err);
            };
            var sb = Object.keys(toSet).length;
            var ub = Object.keys(toUnset).length;
            if (sb || ub) {
                var all = {};
                if (sb) all.$set = toSet;
                if (ub) all.$unset = toUnset;
                Model.collection.update({ _id: data._id }, all, {}, afterOver);
            } else afterOver(null);
        };
        forEachOfDocument(data, toSet, toUnset, over);
    }
});
stream.on('close', function() {
    afterCompleteOneCollection();
});
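The stream avoids holding all 2 million documents in memory, but it still issues one update() per document. A further refinement (a sketch, assuming the same toSet/toUnset computation as above) is to accumulate operations and flush them in batches with bulkWrite. The batching helper itself is plain JS; `flush` stands in for a call like collection.bulkWrite(ops).

```javascript
// Accumulate per-document update operations and flush them in batches,
// so only one round trip per batchSize documents is needed.
function makeBatcher(batchSize, flush) {
  let ops = [];
  return {
    push(op) {
      ops.push(op);
      if (ops.length >= batchSize) {
        flush(ops);   // e.g. collection.bulkWrite(ops) in real code
        ops = [];
      }
    },
    end() {
      if (ops.length > 0) flush(ops);
      ops = [];
    }
  };
}
```

Usage sketch: inside stream.on('data', ...) push { updateOne: { filter: { _id: data._id }, update: all } } onto the batcher, and call batcher.end() in stream.on('close', ...).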

Is it possible to Rename the output key in Mongo's MapReduce Result?

I am trying to perform an inline mapReduce operation using PyMongo.
The code looks like this:
import pymongo
from bson.code import Code

con = pymongo.MongoClient()

map_func = Code("""
function() {
    var all = this.members.concat(this.admins)
    var group_id = this._id
    all.forEach(function(_id) {
        emit(_id, [group_id])
    })
}
""")

reduce_func = Code("""
function(key, values) {
    var ob = {};
    ob[key] = [];
    for (var i = 0; i < values.length; i++) {
        ob[key].push(values[i][0])
    }
    return ob
}
""")

finalize_func = Code("""
function(key, value) {
    if (typeof(value.push) == "function") {
        return value
    } else {
        return value[key]
    }
}
""")

result = con.test.group.inline_map_reduce(
    map_func,
    reduce_func,
    finalize=finalize_func)

import pprint; pprint.pprint(result)
The output of this operation is:
[{u'_id': u'135348133252952338363702',
u'value': [u'135457069105859781018098',
u'135661481520484615218098',
u'135391961249458761918098',
u'135758863859275369318098',
u'135156779012512657918098',
u'135285081801846289218098',
u'136040996346306049718098',
u'136237587535011048218098',
u'136862399436556328318098']},
{u'_id': u'136068596781820946163702',
u'value': [u'136068597966313224518098',
u'135156779012512657918098',
u'136415311739865096818098']}]
Is there any hook/operator by which I can rename the output field to a custom string, for example "group_id", instead of "value"?
I have read MongoDB's MapReduce documentation but did not find any tip on how to accomplish this.
Using the MapReduce finalize hook described here, I reformatted my output to the field names I wanted:
db.collection.mapReduce(
    mapFunction(),
    reduceFunction(),
    {
        query: {params},
        finalize: function(key, reduced_value) {
            return {
                my_field: key,
                my_value: reduced_value
            }
        }
    }
)
No, the reduced values are always in a field named value.
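The finalize trick can be checked in isolation: finalize is a plain function from (key, reducedValue) to whatever shape gets stored, so "renaming" amounts to wrapping the reduced value in an object with the desired field names. The outer field is still named value; only its contents are relabeled. The names my_field/my_value are the arbitrary ones from the answer above.

```javascript
// finalize receives the reduced value and may return any shape;
// here it relabels the output, though the outer field stays "value".
function finalize(key, reducedValue) {
  return { my_field: key, my_value: reducedValue };
}

// Shape of one result document after mapReduce applies finalize:
const doc = { _id: "g1", value: finalize("g1", ["u1", "u2"]) };
```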