MongoDB requests very slow to return results - mongodb

So, I only have a few documents in my MongoDB database. For example, this basic find request (see below) takes 4 seconds to return 1.12 KB of JSON, before the component re-renders.
app.get('/mypath', (req, res) => {
  MongoClient.connect(urlDb, (err, db) => {
    let Mycoll = db.collection('Mycoll');
    Mycoll.find({}).toArray((err, data) => {
      if (err) throw err;
      else {
        res.status(200).json(data);
      }
    });
    db.close();
  });
});
Sometimes the same component, with the same request, takes 8 seconds to re-render (which equals an eternity for an Internet user).
Is it supposed to take this long? I can imagine a user of my app thinking "well, that doesn't work" and closing it right before the results show.
Is there anything you could point me to to optimize the performance? Any tool you would recommend for analyzing what exactly causes this bottleneck? Or did I do something wrong?
At this stage, I don't incriminate React/Redux, because my other components, which involve no DB requests, render fast.
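Two things in the snippet are likely contributing to the latency: a new MongoClient connection is opened on every request, and db.close() runs before toArray's async callback has fired. A minimal sketch of connecting once at startup and reusing the connection (urlDb, app, and the collection name are taken from the question; the port and error handling are illustrative):

```javascript
const { MongoClient } = require('mongodb');

let db; // shared connection, opened once at startup

MongoClient.connect(urlDb, (err, client) => {
  if (err) throw err;
  db = client; // with driver v3+, use client.db('mydbname') instead
  app.listen(3000);
});

app.get('/mypath', (req, res) => {
  // reuse the pooled connection instead of reconnecting per request
  db.collection('Mycoll').find({}).toArray((err, data) => {
    if (err) return res.status(500).json({ error: err.message });
    res.status(200).json(data); // no db.close() here: keep the pool alive
  });
});
```

Connection setup (TCP handshake, auth) easily costs seconds on a remote cluster, which would also explain why the time varies between 4 and 8 seconds.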

Related

Why use useMutation in React Query?

I am new to React Query.
I found out that using useQuery can reduce requests by caching.
But it's hard to understand why I should use useMutation.
axios.post('/user', { name, ... })
  .then(res => console.log(res))
  .catch(err => console.log(err));

const { mutate, isLoading, ... } = useMutation(fetcher, {
  onSuccess: (res) => {
    console.log(res);
  },
  onError: (err) => {
    console.log(err);
  }
});
Both snippets handle successful requests and errors.
Isn't queryClient.invalidateQueries('queryKey') also covered by axios's then() function?
What's the difference between the two?
It's a difference in what happens to the backend state. In a query the intention is that you're requesting a particular dataset from some source. The request does not change any backend state, and any re-requests for the data will also not cause a change to backend state.
In a mutation the intention is to create some change in the backend state. (e.g. creating a new record in a database, or updating an existing record).
It's the equivalent of a read (query) vs. a write (mutation) operation.
Also, useMutation provides a lot of useful things out of the box, like loading state, errors, etc.
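To make the read/write split concrete, here is a sketch of how the two usually work together (the endpoint, query key, and component are illustrative, not from the question): the mutation changes backend state, then invalidates the cached query so the next read refetches fresh data.

```javascript
import { useQuery, useMutation, useQueryClient } from 'react-query';
import axios from 'axios';

function Users() {
  const queryClient = useQueryClient();

  // read: cached and deduplicated across components
  const { data } = useQuery('users', () =>
    axios.get('/user').then(res => res.data)
  );

  // write: changes backend state, then marks the cached read stale
  const { mutate, isLoading } = useMutation(
    newUser => axios.post('/user', newUser),
    {
      onSuccess: () => {
        // the 'users' query refetches automatically
        queryClient.invalidateQueries('users');
      },
    }
  );

  // mutate({ name: 'Ada' }) triggers the POST and then the refetch
  // ...render using data / isLoading...
}
```

With plain axios.then() you would have to re-trigger the fetch and sync the UI state yourself; invalidateQueries does that for every component subscribed to the query key.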

Download large sets of documents from MongoDB using Meteor methods

I am trying to export all the documents from a collection (which is about 12 MB) using a Meteor method, but it almost always crashes the app or never returns the results.
I am considering uploading the documents to S3 and then sending a download link to the client, but that seems like an unnecessary extra network hop and will make the process even longer.
Is there a better way to get large sets of data from server to client?
Here is the code; it is very simple.
'downloadUserActions': () => {
  if (Roles.userIsInRole(Meteor.userId(), ['admin'])) {
    const userData = userActions.find({}).fetch();
    return userData;
  }
}
Thanks.
You can use an approach where you split the request into multiple ones:
- get the total document count
- until the document count is completely fetched:
  - get the current count of already-fetched docs
  - fetch the next bunch of docs, skipping the already-fetched ones
For this you need the skip option in the mongo query in order to skip the already fetched docs.
Code example
const limit = 250

Meteor.methods({
  // get the max amount of docs
  getCount () {
    return userActions.find().count()
  },
  // get the next block of docs
  // from: skip to: skip + limit
  // example: skip = 1000, limit = 500 is
  // from: 1000 to: 1500
  downloadUserActions (skip) {
    this.unblock()
    return userActions.find({}, { skip, limit }).fetch()
  }
})
Client:
// wrap the Meteor.call into a promise
const asyncCall = (name, args) => new Promise((resolve, reject) => {
  Meteor.call(name, args, (err, res) => {
    if (err) {
      return reject(err)
    }
    return resolve(res)
  })
})

const asyncTimeout = ms => new Promise(resolve => setTimeout(() => resolve(), ms))
const fetchAllDocs = async (destination) => {
  const maxDocs = await asyncCall('getCount')
  let loadedDocs = 0
  while (loadedDocs < maxDocs) {
    const docs = await asyncCall('downloadUserActions', loadedDocs)
    docs.forEach(doc => {
      // think about using upsert to fix multiple-docs issues
      destination.insert(doc)
    })
    // increase counter (skip value)
    loadedDocs = destination.find().count()
    // wait 10ms before the next request; increase if the server
    // needs more time
    await asyncTimeout(10)
  }
  return destination
}
Use it with a local Mongo Collection on the client:
await fetchAllDocs(new Mongo.Collection(null))
After the function completes, all docs are stored in this local collection.
Play with the limit and the timeout (milliseconds) values in order to find a sweet spot between user experience and server performance.
Additional improvements
The code does not authenticate or validate requests. This is up to you!
Also, you might think about adding a fail-safe mechanism in case the while loop never completes due to some unintended error.
Further readings
https://docs.meteor.com/api/methods.html#DDPCommon-MethodInvocation-unblock
https://docs.meteor.com/api/collections.html#Mongo-Collection
https://docs.meteor.com/api/collections.html#Mongo-Collection-find

NodeJs api with mongoose any request go to another collection efficient

I'm writing a Node.js API for an app with users, and I want each user to use a different Mongo database.
I recognize each user by the URL address in the params fields.
Everything works fine, but when I switch databases dynamically it's very slow.
Does anyone have an idea how to do this faster?
Thanks in advance ;)
app.js
This code handles the request in 2.5 seconds:
POST /login 200 2487.531 ms - 206
app.use("/:businessUrl", (req, res, next) => {
  console.log(req.params.businessUrl);
  mongoose
    .connect(process.env.MONGO_URI + `${req.params.businessUrl}?retryWrites=true&w=majority`)
    .then((result) => {
      console.log("DB Connected");
      next();
    })
    .catch((err) => {
      return next(err);
    });
});
And this code, with the database hard-coded, handles the same request in 0.5 seconds:
POST /login 200 461.829 ms - 206
mongoose
  .connect(process.env.MONGO_URI + `businessUrl?retryWrites=true&w=majority`)
  .then((result) => {
    console.log("DB Connected");
  })
  .catch((err) => {});
The latency comes from creating a new connection every time the API is hit.
As I can see from the implementation, it is the same server; you are just switching databases.
So you can use the useDb method provided by Mongoose. It also has an option to cache the connection object for each database.
Official docs: https://mongoosejs.com/docs/api/connection.html#connection_Connection-useDb
With this approach, a connection to the database is only created the first time the API is hit; after that it is resolved from the cache.
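A minimal sketch of that approach (the env variable and middleware shape follow the question; the default database name is illustrative, and useCache is the documented Mongoose option):

```javascript
const mongoose = require('mongoose');

// connect once at startup, to any default database
mongoose.connect(process.env.MONGO_URI + 'admin?retryWrites=true&w=majority');

app.use('/:businessUrl', (req, res, next) => {
  // switch databases on the existing connection instead of reconnecting;
  // useCache: true reuses the Connection object per database name
  req.db = mongoose.connection.useDb(req.params.businessUrl, { useCache: true });
  next();
});
```

Note that models are bound to a connection, so handlers would register their models on req.db rather than on the default mongoose instance.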

Get number of documents in a big collection in Cloud Firestore

I know this question has already been asked, but I'm being specific about my case: I have a large database (approximately 1 million documents inside the collection users).
I want to get the exact number of documents inside users. I'm trying this:
export const count_users = functions.https.onRequest((request, response) => {
  corsHandler(request, response, () => {
    db.collection('users').select().get()
      .then((snapshot) => response.json(snapshot.docs.length))
      .catch(function (error) {
        console.error("[count_users] Error counting users: ", error);
        response.json("Failed");
      });
  });
});
Although it seems right, it takes forever to give me a result. I'm not allowed to add or remove documents from the database.
Is there any possible approach for getting this quantity?
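One option, assuming a recent firebase-admin SDK (v11+): Firestore's count() aggregation computes the size server-side, so no documents are transferred and the per-document read cost of fetching a million docs is avoided. A sketch reusing the handler shape from the question:

```javascript
export const count_users = functions.https.onRequest((request, response) => {
  corsHandler(request, response, () => {
    // the count is evaluated on the server; only the number comes back
    db.collection('users').count().get()
      .then((snapshot) => response.json(snapshot.data().count))
      .catch((error) => {
        console.error("[count_users] Error counting users: ", error);
        response.json("Failed");
      });
  });
});
```

The original select().get() approach still downloads a snapshot entry for every document, which is why it takes so long at this collection size.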

concurrency issues while upserting and then reading the data from mongodb using mongoose

Hi, I am trying to build an application that upserts data and then fetches it from MongoDB based on the user id. This approach works fine for a single user, but when I try hitting it for multiple users, say 25, the fetched data is sometimes null. Below is my upsert code:
collection.update({ 'USER_ID': passVal.ID },
  { 'RESPONSE': Data }, { upsert: true }, function (err) {
    if (err) {
      console.log("Error in saving data");
    }
    var query = collection.findOne({ 'USER_ID': passVal.ID });
    query.select('RESPONSE');
    query.exec(function (err, data) {
      if (err) return handleError(err);
      console.log(data.RESPONSE);
    });
  })
I always get an error in some cases, as data is null. I have written the read code in the callback of the upsert only. I am stuck here; any help regarding this would be much appreciated.
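One way to avoid the separate read entirely, sketched here under the assumption that collection is a Mongoose model: findOneAndUpdate can perform the upsert and return the resulting document in a single atomic operation, so concurrent requests never observe a half-written state.

```javascript
collection.findOneAndUpdate(
  { USER_ID: passVal.ID },
  { $set: { RESPONSE: Data } },
  // new: true returns the document *after* the update;
  // upsert: true creates it if it does not exist yet
  { upsert: true, new: true },
  function (err, doc) {
    if (err) return handleError(err);
    console.log(doc.RESPONSE); // doc is populated even on first insert
  }
);
```

Using $set also avoids replacing the whole document (the original update passes { 'RESPONSE': Data } as a full replacement, which drops any other fields, including USER_ID in older driver semantics).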