When I try to add some documents to a collection, exactly 1 out of 4 times I get an error.
for (var i = 0; i < 50; i = i + 1) {
    db.SampleOrder.insert({
        "SampleId": NumberInt(i),
        "PuckId": NumberInt(i)
    });
}
Error:
Picture of the Error
Does anybody know why this doesn't work?
I'm using Robo 3T 1.1.1 (Robomongo).
You can use insertMany instead of insert to insert multiple documents, like this:
var docs = [];
for (var i = 0; i < 50; i = i + 1) {
    docs.push({
        "SampleId": NumberInt(i),
        "PuckId": NumberInt(i)
    });
}
db.SampleOrder.insertMany(docs);
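If it helps to verify the result, insertMany (available in the shell since MongoDB 3.2) reports the inserted ids, so capturing the return value of the call above is a quick sanity check:
var result = db.SampleOrder.insertMany(docs); // same call as above, just keeping the result
print(result.insertedIds.length);             // expect 50 if every document was inserted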
I have a collection with a nested attribute that is an array of ObjectId references. These refer to documents in another collection.
I'd like to replace these references with the documents themselves, i.e. embed those documents where the references are now. I've tried with and without the .snapshot() option. The problem may be that I'm updating a document while looping over it, and .snapshot() isn't available at that level.
My mongo-fu is low and I'm stuck on a call stack error. How can I do this?
Example code:
db.CollWithReferences.find({}).snapshot().forEach(function(document) {
    var doc_id = document._id;
    document.GroupsOfStuff.forEach(function(Group) {
        var docsToEmbed = db.CollOfThingsToEmbed.find({ _id: { $in: Group.ArrayOfReferenceObjectIds }});
        db.CollWithReferences.update({"_id": ObjectId(doc_id)},
            {$set: {"Group.ArrayOfReferenceObjectIds ": docsToEmbed}});
    });
});
Gives this error:
{
"message" : "Maximum call stack size exceeded",
"stack" : "RangeError: Maximum call stack size exceeded" +
....}
I figure this is happening for one of two reasons. Either you are running out of memory by executing two queries in a for loop, or the update operation is being executed before the find operation has finished.
Either way, it is not a good idea to execute too many queries in a for loop as it can lead to this type of error.
I can't be sure this will fix your problem, as I don't know how many documents are in your collections, but it may work if you first fetch all documents from the CollWithReferences collection, then everything you need from the CollOfThingsToEmbed collection. Next, build a map from each _id in the CollOfThingsToEmbed collection to the document it belongs to. You can then loop through each document you got from CollWithReferences and mutate its GroupsOfStuff array by walking each ArrayOfReferenceObjectIds array and replacing each ObjectId with the whole document from the map you built. Finally, update that document by setting GroupsOfStuff to its mutated value.
The following JavaScript code will do this (it could be organised better to have no logic in the global scope etc.):
// Fetch the documents as arrays so they can be indexed and have a .length
var references = db.CollWithReferences.find({}).toArray();

function getReferenceIds(references) {
    var referenceIds = [];
    for (var i = 0; i < references.length; i++) {
        var groups = references[i].GroupsOfStuff;
        for (var j = 0; j < groups.length; j++) {
            var ids = groups[j].ArrayOfReferenceObjectIds;
            for (var k = 0; k < ids.length; k++) {
                referenceIds.push(ids[k]);
            }
        }
    }
    return referenceIds;
}

// Map each _id (as a string) to its full document
function buildIdMap(docs) {
    var map = {};
    for (var i = 0; i < docs.length; i++) {
        map[docs[i]._id.toString()] = docs[i];
    }
    return map;
}

var referenceIds = getReferenceIds(references);
var docsToEmbed = db.CollOfThingsToEmbed.find({_id: {$in: referenceIds}}).toArray();
var idMap = buildIdMap(docsToEmbed);

for (var i = 0; i < references.length; i++) {
    var groups = references[i].GroupsOfStuff;
    for (var j = 0; j < groups.length; j++) {
        // Replace each ObjectId reference with the full document from the map
        groups[j].ArrayOfReferenceObjectIds = groups[j].ArrayOfReferenceObjectIds.map(function(ref) {
            return idMap[ref.toString()];
        });
    }
    db.CollWithReferences.update(
        { _id: references[i]._id },
        { $set: { GroupsOfStuff: groups } }
    );
}
It would be better if it were possible to do just one bulk update, but since each document needs a different update, a single update call is not possible.
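That said, if the shell's Bulk API is available (an assumption about your environment; the script above doesn't rely on it), the per-document updates can at least be queued and sent in batches rather than issued one at a time. A minimal sketch that could replace the final update loop, reusing references and idMap from above:
var bulk = db.CollWithReferences.initializeUnorderedBulkOp();
references.forEach(function (doc) {
    doc.GroupsOfStuff.forEach(function (group) {
        // Swap each ObjectId for the full document, as in the loop above
        group.ArrayOfReferenceObjectIds = group.ArrayOfReferenceObjectIds.map(function (ref) {
            return idMap[ref.toString()];
        });
    });
    // Queue one update per document; execute() sends the queued updates in batches
    bulk.find({ _id: doc._id }).updateOne({ $set: { GroupsOfStuff: doc.GroupsOfStuff } });
});
bulk.execute();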
I want to iterate through the MongoDB collection to get the chart labels, but I get TypeError: undefined is not an object (evaluating 'teams[i].name'). Here is my code:
var teams = Teams.find();
var teamNames = [10];
for(i = 0; i < 10; i++)
{
teamNames.push(teams[i].name);
}
var chart = new Chart(canvas, {
type: 'bar',
data: {
labels: [teamNames]
....
Does anyone have any suggestions? I am running out of ideas.
Thank you in advance.
You can do this:
var teamNames = Teams.find().map(function(team) {
    return team.name;
});
teams must have fewer than 10 items. If teams is [{name: "first"}], then teams[1] will return undefined and you will get that error. You can use:
for (let i = 0; i < teams.length; i++)
to solve this problem.
You can also map over the array to get specific properties:
labels: teams.map(team => team.name),
In Meteor, the collection .find() function returns a cursor that you can then use to perform operations on collection items. In your case, you are treating the cursor as if it were an array, which is incorrect. There are a few different ways you can approach this.
1) Use .forEach() to iterate over the cursor.
var teamNames = [];
Teams.find().forEach(function (e) {
teamNames.push(e.name);
});
2) Use .fetch() to return all matching documents in an array, then iterate over that.
var teams = Teams.find().fetch();
var teamNames = [];
for (var i = 0; i < teams.length; i++) {
    teamNames.push(teams[i].name);
}
3) Use .map() to iterate over the collection calling the callback on all items and returning an array.
var teamNames = Teams.find().map(function (e) {
    return e.name;
});
I'm currently having an issue when querying for one of my documents in the database through Meteor.
With the following code I'm trying to retrieve the next sequence number from the DB, but it sometimes skips numbers seemingly at random.
var col = MyCounters.findOne(type);
MyCounters.update(col._id, {$inc: {seq: 1}});
return col.seq;
Not getting any kind of errors server side.
Does anybody know what the issue might be?
I'm on Meteor 1.4+
====================
Update
I also update another collection with the new value obtained from the MyCounters collection, so it would be something like this:
var col = MyCounters.findOne(type);
MyCounters.update(col._id, {$inc: {seq: 1}});
var barId = col.seq;
// declare barObject + onInsertError
barObject.barId = barId;
// ...
FooCollection.insert(barObject, onInsertError);
And FooCollection sometimes ends up with skipped sequence numbers, up to 5000 of them.
If you want to increment seq on that particular document, you can use:
var col = MyCounters.findOne(type);
var valueOne = 1;
var nameItem = 'seq';
var inc = {};
inc[ nameItem ] = valueOne;
MyCounters.update({ _id: col._id }, { '$inc': inc } )
But if you want to increment based on the maximum value across all documents in the MyCounters collection (max seq + 1), you can use:
var count = MyCounters.findOne({}, {sort:{seq:-1}}).seq;
count = count + 1;
MyCounters.update({_id:col._id}, {$set:{seq:count}})
I hope it works for you. Thanks.
Refer to: https://stackoverflow.com/a/33968766/4287229
My test collection has 56 documents in it. When the following script is executed, the resulting collection has fewer entries than the original collection, and the number varies with each run. What would cause this issue, and is there a workaround?
var collectionToUpdate = 'testcollection';
var temporaryCollectionName = collectionToUpdate + '_old'
db.getCollection(collectionToUpdate).renameCollection(temporaryCollectionName);
var oldCollection = db.getCollection(temporaryCollectionName);
db.createCollection(collectionToUpdate);
var newCollection = db.getCollection(collectionToUpdate);
var count = 0;
oldCollection.find().forEach(
function (element) {
count++;
newCollection.insert(element)
}
);
print(count);
Versions used:
MongoDB - 3.2.8
RoboMongo - 0.9.0-RC10
db.runCommand({cloneCollection : "newdb.rep", from:"localhost:27017"})
I am connected to the remote Mongo instance now. However, I understand this is mainly used to copy a collection from remote to local, not the other way round.
You could use this script, which connects to both the local and the remote database.
Some info here
It then iterates over the collection and inserts documents in bulks of 1000.
var localConnection = connect("localhost:27017/myDatabase");
var destinationConnection = connect("localhost:27020/myDatabase");
var documentLimit = 1000;
var docCount = localConnection.find({}).count();
var chunks = docCount / documentLimit;
for (var i = 0; i <= chunks; i++) {
var bulk = destinationConnection.initializeUnorderedBulkOp();
localConnection .find({}).snapshot()
.limit(documentLimit).forEach(function (doc) {
bulk.insert(doc);
});
bulk.execute()
}
#profesor79's script doesn't work with newer versions of MongoDB, or for collections containing more than 1000 documents, without some changes. It doesn't use skip(), so MongoDB throws an error when it tries to write the first 1000 docs again.
I have used the following script successfully on a collection containing around 20000 documents.
var localConnection = connect(
    "<db_user>:<db_password>@localhost:27017/<db_name>"
);
var destinationConnection = connect(
    "<db_user>:<db_password>@<remote_host>:<remote_port>/<remote_db>"
);
var documentLimit = 1000;
var docCount = localConnection.collection_name.count();
var chunks = docCount / documentLimit;
for (var i = 0; i <= chunks; i++) {
var bulk = destinationConnection.collection_name.initializeUnorderedBulkOp();
localConnection.collection_name
.find({})
// after each iteration skip 1000, 2000 and so on...
.skip(i*documentLimit)
.limit(documentLimit)
.forEach(function(doc) {
bulk.insert(doc);
});
bulk.execute();
}
Provide your own local and destination connection strings and documentLimit (i.e. the number of documents to be processed per chunk), and it should work.
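For reference, a script like this is usually saved to a file and run with the legacy mongo shell; the file name below is just a placeholder, and --nodb tells the shell not to open its own connection, since the script calls connect() itself:
mongo --nodb copyCollection.js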