With the MongoDB Node driver v4.13, how can I load a collection without creating it if it does not exist?
In earlier versions, db.collection could be called like this:
db.collection('not_existing', { strict: true }, (err, res) => {
  if (err) {
    console.log('Collection does not exist');
  }
});
But in v4.13 the callback version of this function no longer exists, and strict: true seems to be ignored:
const collection = await db.collection('not_existing', { strict: true });
console.log(await db.listCollections().toArray()); // lists the collection
You can use
db.getCollectionNames().filter(x => x == 'not_existing').length > 0
or
db.runCommand({ listCollections: 1, filter: { name: 'not_existing' } }).cursor.firstBatch.length > 0
to check whether a collection exists (both expressions work in the mongo shell).
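With the v4 Node driver itself, a roughly equivalent check can go through listCollections with a name filter (a minimal sketch, assuming db is an open Db instance and 'not_existing' is the name being tested):
// List only collections matching the given name; an empty array means it does not exist.
const matches = await db
  .listCollections({ name: 'not_existing' }, { nameOnly: true })
  .toArray()

if (matches.length === 0) {
  console.log('Collection does not exist')
} else {
  const collection = db.collection('not_existing') // safe: the collection already exists
}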
I'm trying to implement database population using a migration function. The code works perfectly and saves all the data into the database, but the test for the function is failing, and I would like to know why.
I'm getting the "Exceeded timeout of 5000 ms" error for this particular test. I've written 166 tests for this app and all of them are passing.
Here is the function I want to test:
const doMigration = async ({ model, data }) => {
  await model.collection.insertMany(data)
}
And here is the test:
const { Amodel } = require('../../../models/Amodel')
const { doMigration } = require('../../../database/migrations')

describe('Database Population', () => {
  it('Should populate the database using migrations', async () => {
    const data = [{ name: 'A' }, { name: 'B' }]
    const model = Amodel
    const migration = { name: 'Amodel', model, data }

    await doMigration(migration)

    const countAfter = await Amodel.count()
    expect(countAfter).toBe(2)
  })
})
In this test I simply import the function and the model, and create a migration object that is then passed to the function.
What did I try?
Tried using just countAfter without calling the doMigration function; it still produces the same timeout error.
Tried increasing the timeout for this test to 30000; it then failed with an error saying that the MongoDB time exceeded 10000 ms.
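For reference, the per-test timeout can be raised either with Jest's third argument to it or with jest.setTimeout (a sketch of what was tried; the exact placement in the repo is an assumption):
// Option 1: raise the timeout for a single test (third argument, in ms)
it('Should populate the database using migrations', async () => {
  // ...test body...
}, 30000)

// Option 2: raise it for every test in the file
jest.setTimeout(30000)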
Here is the github repository: https://github.com/Elvissamir/Fullrvmovies
What is happening, and how can I solve this error?
The problem was the way the MongoDB connection was handled. When testing, the app created a connection to the db on startup, and the Jest tests then used that connection, which caused some issues.
The solution was to connect to the database on startup only if the environment is not set to testing; when testing, the connection is handled by each set of tests.
In each test suite I added a beforeAll and an afterAll to open and close the connection to the database (see the sketch below).
Hope it helps anyone who runs into the same problem or similar issues.
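A minimal sketch of that beforeAll/afterAll pattern, assuming Mongoose is used and the test connection string comes from a TEST_DB_URI environment variable (both of those are assumptions, not taken from the repo):
const mongoose = require('mongoose')
const { Amodel } = require('../../../models/Amodel')
const { doMigration } = require('../../../database/migrations')

describe('Database Population', () => {
  // Each test suite opens its own connection...
  beforeAll(async () => {
    await mongoose.connect(process.env.TEST_DB_URI)
  })

  // ...and closes it so Jest can exit cleanly.
  afterAll(async () => {
    await mongoose.connection.close()
  })

  it('Should populate the database using migrations', async () => {
    const data = [{ name: 'A' }, { name: 'B' }]
    await doMigration({ name: 'Amodel', model: Amodel, data })
    expect(await Amodel.count()).toBe(2)
  })
})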
The guiding principle is that the error message should reflect the actual cause, so I recommend following these steps:
Use the following code to check the mongo connection state:
const { MongoMemoryServer } = require("mongodb-memory-server");
const mongoose = require("mongoose");

(async () => {
  const mongod = await MongoMemoryServer.create();
  const mongoUri = mongod.getUri();

  await mongoose
    .connect(mongoUri, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
    })
    .then((result) => {
      console.log(result.connection.readyState); // 1 means connected
      console.log(result.connection.host);
    })
    .catch((err) => {
      console.log(err); // surface connection errors instead of swallowing them
    });
})();
If you are using mongodb-memory-server, add a "testTimeout" attribute to your Jest config:
"jest": {
"preset": "ts-jest",
"testEnvironment": "node",
"setupFilesAfterEnv": [
"./src/test/setup.ts"
],
"testTimeout": 15000
},
If the problem still happens after all of the above, check the timeout of every inter-test operation.
We have a Loopback v3.8 application using MongoDB connector v3.1
It works fine in the environments running native MongoDB but now we would like to deploy to Azure and use Cosmos DB, which in theory should support all the native MongoDB commands.
The problem we're having is that PATCH operations (which I believe are mapped to Model.updateAttributes by Loopback) are not working.
This is the error we get:
Could not update Client. { Error: No Client found for id 592cc132a31109354c45d1d8 }
Loopback debug strings:
loopback:connector:mongodb updateAttributes +7ms Client 592cc132a31109354c45d1d8 { '$set': { loginDate:2017-06-02T12:30:18.201Z } }
loopback:connector:mongodb MongoDB: model=Client command=findAndModify +2ms [ { _id: 592cc132a31109354c45d1d8 },
[ [ '_id', 'asc' ] ],
{ '$set': { loginDate: 2017-06-02T12:30:18.201Z } },
{}, [Function] ]
loopback:connector:mongodb Result: +399ms { _t: 'FindAndModifyResponse', ok: 1, value: null, lastErrorObject: { n: 1, updatedExisting: false, value: null } }
loopback:connector:mongodb updateAttributes.callback +4ms Client 592cc132a31109354c45d1d8 null { _t: 'FindAndModifyResponse', ok: 1,
value: null,
lastErrorObject: { n: 1, updatedExisting: false, value: null } }
If we do a GET for that Client, using its Id, we get the correct response, so the Client document is there.
Can the Loopback MongoDB connector be used for Cosmos DB?
Are we missing something that requires Loopback to work correctly with Cosmos DB?
Thanks.
This is because the underlying base for Cosmos DB is not MongoDB; it simply allows you to access it through a familiar API. The MongoDB connector is not intended for use with Cosmos DB. I have personally been looking for a solution for my own use and came across the npm package loopback-connector-cosmosdb, which worked for some simple applications but is completely unsupported by its developer and by LoopBack.
UPDATE: I am using version 2.1 of the driver, against MongoDB 3.2.
I have a node application that uses MongoDB. The problem I have is that if the MongoDB server goes down for any reason, the application doesn't reconnect.
To get this right, I based my tests on the code in this official tutorial.
var MongoClient = require('mongodb').MongoClient
  , f = require('util').format;

MongoClient.connect('mongodb://localhost:27017/test',
  // Optional: uncomment if necessary
  // { db: { bufferMaxEntries: 3 } },
  function(err, db) {
    var col = db.collection('t');
    setInterval(function() {
      col.insert({a:1}, function(err, r) {
        console.log("insert")
        console.log(err)
        col.findOne({}, function(err, doc) {
          console.log("findOne")
          console.log(err)
        });
      })
    }, 1000)
  });
The idea is to run this script, and then stop mongod, and then restart it.
So, here we go:
TEST 1: stopping mongod for 10 seconds
Stopping MongoDB for 10 seconds gives the desired result: the queries stop running for those 10 seconds, and then all of them run once the server is back up.
TEST 2: stopping mongod for 30 seconds
After exactly 30 seconds, I start getting:
{ [MongoError: topology was destroyed] name: 'MongoError', message: 'topology was destroyed' }
insert
{ [MongoError: topology was destroyed] name: 'MongoError', message: 'topology was destroyed' }
The trouble is that from this point on, when I restart mongod, the connection is not re-established.
Solutions?
Does this problem have a solution? If so, do you know what it is?
Once my app starts puking "topology was destroyed", the only way to get everything to work again is by restarting the whole app...
There are two connection options that control how the Mongo Node.js driver reconnects after a connection fails:
reconnectTries: attempt to reconnect # times (default 30)
reconnectInterval: the server will wait # milliseconds between retries (default 1000 ms)
reference on mongo driver docs
This means that Mongo will keep trying to connect 30 times by default and wait 1 second before every retry, which is why you start seeing errors after 30 seconds.
You should tweak these two parameters based on your needs, as in this sample:
var MongoClient = require('mongodb').MongoClient,
  f = require('util').format;

MongoClient.connect('mongodb://localhost:27017/test',
  {
    // retry to connect for 60 times
    reconnectTries: 60,
    // wait 1 second before retrying
    reconnectInterval: 1000
  },
  function(err, db) {
    var col = db.collection('t');
    setInterval(function() {
      col.insert({
        a: 1
      }, function(err, r) {
        console.log("insert")
        console.log(err)
        col.findOne({}, function(err, doc) {
          console.log("findOne")
          console.log(err)
        });
      })
    }, 1000)
  });
This will try 60 times instead of the default 30, which means that you'll start seeing errors after 60 seconds when it stops trying to reconnect.
Sidenote: if you want to prevent the app/request from waiting until the expiration of the reconnection period you have to pass the option bufferMaxEntries: 0. The price for this is that requests are also aborted during short network interruptions.
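A sketch of how that option can be combined with the reconnect settings above (option placement varies between driver versions; in older 2.x drivers bufferMaxEntries sits under a nested db key, as in the question's commented-out snippet):
MongoClient.connect('mongodb://localhost:27017/test', {
  reconnectTries: 60,
  reconnectInterval: 1000,
  // Fail operations immediately while disconnected instead of queueing them
  // until the reconnection window expires.
  bufferMaxEntries: 0
}, function(err, db) {
  // ...
});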
package.json: "mongodb": "3.1.3"
Reconnect existing connections
To fine-tune the reconnect configuration for pre-established connections, you can modify the reconnectTries/reconnectInterval options (default values and further documentation here).
Reconnect initial connection
For the initial connection, the mongo client does not reconnect if it encounters an error (see below). I believe it should, but in the meantime, I've created the following workaround using the promise-retry library (which uses an exponential backoff strategy).
const promiseRetry = require('promise-retry')
const MongoClient = require('mongodb').MongoClient

const options = {
  useNewUrlParser: true,
  reconnectTries: 60,
  reconnectInterval: 1000,
  poolSize: 10,
  bufferMaxEntries: 0
}

const promiseRetryOptions = {
  retries: options.reconnectTries,
  factor: 1.5,
  minTimeout: options.reconnectInterval,
  maxTimeout: 5000
}

const connect = (url) => {
  return promiseRetry((retry, number) => {
    console.log(`MongoClient connecting to ${url} - retry number: ${number}`)
    return MongoClient.connect(url, options).catch(retry)
  }, promiseRetryOptions)
}

module.exports = { connect }
Mongo Initial Connect Error: failed to connect to server [db:27017] on first connect
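Usage of the exported helper could look like this (a sketch; the require path is assumed, and with a 3.x driver the resolved value is a MongoClient):
const { connect } = require('./mongo-connect') // path is an assumption

connect('mongodb://localhost:27017/test')
  .then((client) => {
    const db = client.db('test')
    // hand `db` (or the client) to the rest of the app here
  })
  .catch((err) => {
    console.error('Giving up on connecting to MongoDB', err)
    process.exit(1)
  })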
By default the Mongo driver will try to reconnect 30 times, one every second. After that it will not try to reconnect again.
You can set the number of retries to Number.MAX_VALUE to keep it reconnecting "almost forever":
var connection = "mongodb://127.0.0.1:27017/db";

MongoClient.connect(connection, {
  server: {
    reconnectTries: Number.MAX_VALUE,
    autoReconnect: true
  }
}, function (err, db) {
});
With MongoDB driver 3.1.10, you can set up your connection as follows:
MongoClient.connect(connectionUrl, {
  reconnectInterval: 10000, // wait for 10 seconds before retry
  reconnectTries: Number.MAX_VALUE, // retry forever
}, function(err, res) {
  console.log('connected')
})
You do not have to specify autoReconnect: true as that's the default.
It's happening because the driver has probably exceeded its connection retry limit. After that number of retries it destroys the TCP connection and becomes idle. To address this, increase the number of retries, and it would be better if you also increase the gap between connection retries.
Use the options below:
retryMiliSeconds {Number, default: 5000}, number of milliseconds between retries.
numberOfRetries {Number, default: 5}, number of connection retries.
For more details refer to this link https://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html
Solution:
MongoClient.connect("mongodb://localhost:27017/integration_test_?", {
db: {
native_parser: false,
retryMiliSeconds: 100000,
numberOfRetries: 100
},
server: {
socketOptions: {
connectTimeoutMS: 500
}
}
}, callback)
Behavior may differ with different versions of the driver, so you should mention your driver version.
driver version : 2.2.10 (latest)
mongo db version : 3.0.7
The code below will extend the time mongod can take to come back up.
var MongoClient = require('mongodb').MongoClient
  , f = require('util').format;

function connectCallback(err, db) {
  var col = db.collection('t');
  setInterval(function() {
    col.insert({a:1}, function(err, r) {
      console.log("insert")
      console.log(err)
      col.findOne({}, function(err, doc) {
        console.log("findOne")
        console.log(err)
      });
    })
  }, 1000)
}

var options = { server: { reconnectTries: 2000, reconnectInterval: 1000 } }

MongoClient.connect('mongodb://localhost:27017/test', options, connectCallback);
The 2nd argument can be used to pass server options.
If you are using Mongoose for your schemas, it is worth considering my option below, since Mongoose never implicitly retries the connection to MongoDB after the first attempt fails.
Kindly note that I am connecting to Azure Cosmos DB for MongoDB API; in your case it may be a local machine.
Below is my code.
const mongoose = require('mongoose');

// set the global useNewUrlParser option to turn on useNewUrlParser for every connection by default.
mongoose.set('useNewUrlParser', true);

// In order to use `findOneAndUpdate()` and `findOneAndDelete()`
mongoose.set('useFindAndModify', false);

async function mongoDbPool() {
  // Closure.
  return function connectWithRetry() {
    // All the variables and functions in here will persist in scope.
    const COSMODDBUSER = process.env.COSMODDBUSER;
    const COSMOSDBPASSWORD = process.env.COSMOSDBPASSWORD;
    const COSMOSDBCONNSTR = process.env.COSMOSDBCONNSTR;

    var dbAuth = {
      auth: {
        user: COSMODDBUSER,
        password: COSMOSDBPASSWORD
      }
    };

    const mongoUrl = COSMOSDBCONNSTR + '?ssl=true&replicaSet=globaldb';

    return mongoose.connect(mongoUrl, dbAuth, (err) => {
      if (err) {
        console.error('Failed to connect to mongo - retrying in 5 sec');
        console.error(err);
        setTimeout(connectWithRetry, 5000);
      } else {
        console.log(`Connected to Azure CosmosDB for MongoDB API.`);
      }
    });
  };
}
You may decide to export and reuse this module everywhere you need to connect to the db via dependency injection, but for now I will only show how to access the database connection.
(async () => {
  var dbPools = await Promise.all([mongoDbPool()]);
  var mongoDbInstance = await dbPools[0]();

  // Now use "mongoDbInstance" to do what you need.
})();
I have just pushed my web application to my prod server and discovered a surprising issue:
var username = 'foo';
var User = this.db.model('User', UserSchema);

User.findOne({ $or: [ { username: username }, { email: { value: username } } ] }, 'id', function(err, Doc) {
  if (err) {
    console.log(err);
  } else if (Doc) {
    console.log('OK');
  } else {
    console.log('Any result');
  }
});
This exact same code works on localhost but not on my prod server (it goes into the else branch on my production server and into the else if (Doc) branch on localhost).
I print the username variable just before the findOne call and I checked manually: this username does exist.
If I drop the $or operator and only apply the condition on username, it works! Which proves the $or operator is responsible for this failure.
I update my node modules the same way on both environments ("mongoose": ">=3.5.4").
My MongoDB version is the same on both environments:
db version v2.0.4, pdfile version 4.5
git version: nogitversion
My localhost server: ubuntu.
My prod server: debian.
How could we explain that?
I upgraded MongoDB to v2.2.3 on my prod server and it works...
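For anyone debugging a similar discrepancy, it can help to log the server version from the app itself rather than trusting the environment setup (a sketch using the admin helper the native driver exposes through the mongoose connection; callback style matches the old driver versions discussed here):
// Print the MongoDB server version this connection is actually talking to.
mongoose.connection.db.admin().serverInfo(function (err, info) {
  if (err) return console.error(err);
  console.log('MongoDB server version:', info.version);
});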