Extremely high loading times - Requests not running async. Mongoose - mongodb

Overview
I've built an application with Vue, Express and MongoDB (Mongoose ODM).
On loading the landing page, a series of GET requests are made for various bits of data. The loading times are extremely high; I've recorded times as high as 22s for a particular route. It's led me to believe that my requests are running sequentially, despite my logic specifying that everything should run async.
I've tried reducing the size of the objects being returned from the requests, as well as using the .lean() method. These attempts shaved off a couple of seconds, but the overall issue is nowhere near solved; times are still absurdly high. To give an example:
From This:
// Method to find all users
var users = await User.find({});
To:
// Method to find all users
var users = await User.find({}, "username uid").lean(); // Mongoose projection strings are space-separated, not comma-separated
On the page in question, there are about 5 main components, each making a GET request. One of these is a Chat Column, and the code for it is as follows:
ChatCol.vue
beforeMount () {
  this.$store.dispatch('retrieve_chat')
}
Store.js (am using Vuex store)
retrieve_chat (context) {
  return new Promise((resolve, reject) => {
    axios({
      url: api.dev + 'api/v1/chat',
      method: 'GET',
    })
      .then(res => {
        context.commit('set_chat', res.data)
        resolve(res);
      })
      .catch(err => {
        // alert(err)
        reject(err);
      })
  })
},
Requests in this format are being made by all of the components on the page in question, about 5 of them in total.
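As an aside, axios already returns a promise, so the explicit new Promise wrapper above isn't needed, and returning the axios promise directly makes it easy to confirm from the page component that the requests really do start in parallel. A minimal sketch; the action names other than retrieve_chat are placeholders:

// Simplified action: return the axios promise directly.
retrieve_chat (context) {
  return axios.get(api.dev + 'api/v1/chat')
    .then(res => context.commit('set_chat', res.data))
},

// In the page component: kick off all the actions at once so they
// run concurrently rather than one after another.
async beforeMount () {
  await Promise.all([
    this.$store.dispatch('retrieve_chat'),
    this.$store.dispatch('retrieve_users'), // placeholder action name
    // ...the page's other actions
  ])
}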
Backend / Server Code
To give some context on the requests being made:
The client will hit the route 'http://localhost:3000/api/v1/chat',
and the code that handles the request on the server is the following:
var Chat = require("../models/ChatMessage");

module.exports = {
  // Limit chat to the 30 most recent messages
  async get_chat(req, res) {
    // sort/limit must be attached before the query executes, so chain
    // them and pass the callback to .exec() rather than to .find()
    Chat.find({})
      .sort({ "_id": -1 })
      .limit(30)
      .exec(function (err, messages) {
        if (err) {
          return res.status(500).send({
            message: "Internal Server Error",
            type: "MONGO_CHAT_DOCUMENT_QUERY",
            err: err,
          });
        }
        if (!messages) {
          return res.status(400).send({
            message: "Resource not found",
            type: "MONGO_CHAT_DOCUMENT_QUERY",
            details: "!messages - no messages found",
          });
        }
        messages.reverse();
        return res.status(200).json({
          messages,
        });
      });
  },
};
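One way to narrow down whether the query itself or the payload size is the bottleneck is to time the query on the server and log how big the response actually is. A rough sketch that could be dropped into the async handler; the label names are just illustrative:

// Time the query in isolation, separate from serialization/transfer.
console.time("chat-query");
const messages = await Chat.find({})
  .sort({ _id: -1 })
  .limit(30)
  .lean();
console.timeEnd("chat-query");

// Approximate size of the JSON payload going over the wire.
console.log("payload bytes:", JSON.stringify(messages).length);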
If I look at the network tab in the Chrome dev tools, this is how the requests appear. Apologies for the long-winded post; I literally have no idea what is causing this.
Important Note:
It was mentioned to me that MongoDB locks when mutating data, and I thought that might be the cause, but there are no mutations taking place here. It's just 3-4 GET requests happening in parallel; admittedly pretty big requests, but they shouldn't be taking as long as they are.
Screenshot of the network tab: [screenshot not included]
(ignore the failed request, and some of the duplicate-named requests)
StackOverflow senpais, please help. It's a very big application and I don't know what the issue is exactly, so if I've missed out any details, apologies; I'll clarify anything that needs clarity.

Resolution
A large amount of base64-encoded data from a previously abandoned and poorly implemented image upload feature was being stored in each chat message, as well as in other places, causing large amounts of data to be loaded and ultimately leading to the huge loading times.
Thank you Neil Lunn.
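For anyone hitting the same thing: one quick way to confirm oversized documents, and to keep the heavy field out of responses while the data gets cleaned up, is a projection. A sketch, assuming the leftover base64 field was called image_data (the field name here is hypothetical; $bsonSize needs MongoDB 4.4+):

// Exclude the hypothetical legacy field so it never leaves the database.
const messages = await Chat.find({}, { image_data: 0 })
  .sort({ _id: -1 })
  .limit(30)
  .lean();

// Find the largest documents in the collection to confirm the culprit.
const biggest = await Chat.aggregate([
  { $project: { bytes: { $bsonSize: "$$ROOT" } } },
  { $sort: { bytes: -1 } },
  { $limit: 5 },
]);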

Related

How to batch requests to the same URL without causing memory leaks

I have a system that processes images. Essentially, I provide an ID to it, and it fetches a source image, and then it begins performing transformations on it to resize and reformat it.
This system gets quite a bit of usage, and one of the things that I've noticed is that I tend to get many requests for the same ID simultaneously, but in different requests to the webserver.
What I'd like to do is "batch" these requests. For example, if there are 5 simultaneous requests for the image "user-upload.png", I'd like there to be only one HTTP request to fetch the source image.
I'm using NestJS with default scopes for my service, so the service is shared between requests. Requests to fetch the image are done with the HttpModule, which is using axios internally.
I only care about simultaneous requests. Once a request finishes, the result is cached, and that prevents new requests from hitting the HTTP URL.
I've thought about doing something like this (pseudocode):
@Injectable()
class ImageFetcher {
  // Store in-flight requests as a map between id:promise
  inFlightRequests = { }

  fetchImage(id: string) {
    if (this.inFlightRequests[id]) {
      return this.inFlightRequests[id]
    }
    this.inFlightRequests[id] = new Promise(async (resolve, reject) => {
      const { data } = await this.httpService.get('/images/' + id)
      // error handling omitted here
      resolve(data)
      delete this.inFlightRequests[id]
    })
    return this.inFlightRequests[id]
  }
}
The most obvious issue I see is the potential for a memory leak. This is solvable with more custom code, but I thought I'd see if anyone has any suggestions for doing this without writing more code.
In particular, I've also thought about using an axios interceptor, but I'm not entirely sure how to handle that properly. Any pointers here would be really appreciated.
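For reference, one minimal way to avoid the leak is to have each request remove its own map entry in a finally handler, so cleanup happens on success and failure alike. A sketch, assuming the default singleton provider scope and the HttpService from @nestjs/axios (in older Nest versions it lives in @nestjs/common):

import { Injectable } from '@nestjs/common';
import { HttpService } from '@nestjs/axios';
import { firstValueFrom } from 'rxjs';

@Injectable()
export class ImageFetcher {
  // In-flight fetches keyed by image ID.
  private readonly inFlight = new Map<string, Promise<unknown>>();

  constructor(private readonly httpService: HttpService) {}

  fetchImage(id: string): Promise<unknown> {
    const existing = this.inFlight.get(id);
    if (existing) return existing;

    const request = firstValueFrom(this.httpService.get('/images/' + id))
      .then((res) => res.data)
      .finally(() => {
        // Runs whether the fetch succeeded or failed, so the map
        // cannot accumulate stale entries.
        this.inFlight.delete(id);
      });

    this.inFlight.set(id, request);
    return request;
  }
}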

How to call multiple different APIs using the useEffect hook in React

I have a question about calling APIs with axios in a useEffect hook. For example, I have a React page that shows a list of consignments. A consignment can have a user from a user list, a carrier from a carrier list, an account from an account list, a status from a status list and a service from a service list. All of these lists are enumerated data, and on this page I have to fetch every list before rendering, because the component displays them as dropdowns that users can filter on. Getting each list means calling a separate API: for users I have to call the /users API, for customers the /customers API, and so on.
My concern is whether I should call them all with axios from a single useEffect hook, because I have to hit the server multiple times to get this enumerated data, and if the number of lists grows, the number of API requests grows with it. I don't know the best practice for this kind of situation: create a single API so the server is hit only once and all the enumerated data is returned together, or keep separate APIs and make a separate request for each list? And does hitting the server multiple times impact performance on the client side, e.g. some kind of memory leak? Just need some advice on that. Thanks in advance.
useEffect(() => {
  const loadData = async () => {
    try {
      dispatch(getLoad(true));
      const services = await Axios.get("/Services");
      const customers = await Axios.get("/Accounts/Customers");
      const resCarrier = await Axios.get("/Accounts/Carriers");
      const resStatuses = await Axios.get("/Status");
      setFilterData((prev) => ({
        ...prev,
        services: services.data,
        customers: customers.data,
        carriers: resCarrier.data,
        statuses: resStatuses.data,
      }));
      dispatch(getLoad(false));
    } catch (error) {
      dispatch(getLoad(false));
    }
  };
  loadData(); // without this call, loadData is defined but never runs
}, []);
You can use axios.all
axios.all([
  axios.get(`/Services`),
  axios.get(`/Accounts/Customers`),
  axios.get(`/Accounts/Carriers`),
  axios.get(`/Status`)
])
  .then(axios.spread((services, customers, carriers, status) => {
    setFilterData((prev) => ({
      ...prev,
      services: services.data,
      customers: customers.data,
      carriers: carriers.data,
      statuses: status.data,
    }));
  }));
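Note that axios.all and axios.spread are thin wrappers around Promise.all and are deprecated in recent axios versions, so the same thing can be written with the native API:

// Equivalent with native Promise.all and array destructuring.
Promise.all([
  Axios.get(`/Services`),
  Axios.get(`/Accounts/Customers`),
  Axios.get(`/Accounts/Carriers`),
  Axios.get(`/Status`)
]).then(([services, customers, carriers, statuses]) => {
  setFilterData((prev) => ({
    ...prev,
    services: services.data,
    customers: customers.data,
    carriers: carriers.data,
    statuses: statuses.data,
  }));
});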

self.addEventListener('fetch', function(e) { }) is not working

I have a question about PWAs and will be glad if someone can help me with it. In my PWA I don't have any problem with storing static files like HTML, JS and CSS, but I'm facing issues with dynamic data: my self.addEventListener('fetch', function(e) { }) is not getting called, while other functionality works fine, i.e. the 'install' and 'activate' events.
To be more specific, I am using @angular/service-worker, which worked fine, but I created another service worker file called sw.js. In sw.js I'm listening for the 'install', 'activate' and 'fetch' events. My sw.js fetch handler is not getting called, whereas the other two work well; only ngsw-worker.js's fetch handler gets called.
What I need is to make CRUD operations work in a PWA with Angular.
Thanks in advance!
You can do dynamic caching like below; the service worker will intercept every request and add it to the cache.
self.addEventListener("fetch", function (event) {
  event.respondWith(
    caches.open("dynamiccache").then(function (cache) {
      return fetch(event.request).then(function (res) {
        cache.put(event.request, res.clone());
        return res;
      });
    })
  );
});
Note: you can't cache POST requests (see: Can service workers cache POST requests?).
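Since the Cache Storage API only accepts GET requests, a common guard is to bail out of the fetch handler for anything else; a minimal sketch extending the handler above:

self.addEventListener("fetch", function (event) {
  // Cache Storage can only store GET requests, so let the browser
  // handle every other method normally.
  if (event.request.method !== "GET") {
    return;
  }
  event.respondWith(
    caches.open("dynamiccache").then(function (cache) {
      return fetch(event.request).then(function (res) {
        cache.put(event.request, res.clone());
        return res;
      });
    })
  );
});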

How to send final response from findOne() callback?

I have a User controller with a create method that checks the database for email and username uniqueness before creating the user (this is to work around a bug in the MongoDB adapter for SailsJS, version 0.10.5, that doesn't honour the unique attribute flag).
The code looks like the following:
User.find({ email: req.body.email }, function (err, user) {
  if (user) {
    return res.badRequest('Unique email constraint. Email is already used.');
  }
});

User.create(req.body).exec(function (err, user) {
  // Code to catch and manage err or new user
});
What I expect is that if the email already exists in the database (mongodb), to send a 400 using res.badRequest(), then execution to end.
What happens is that the response is sent, but then control moves to User.create(); execution doesn't end. I suspect that return res.badRequest is returning control back to the calling function (User.findOne), and execution continues from there.
I tried using res.badRequest().end(), but that leaves the client hanging (there is no response), and using res.end() after the return res.badRequest() generated 'headers already sent' errors.
How do I have execution of this request end if an existing email is found?
First of all, your findOne here is actually a find. That's not related to your problem, but it is slightly confusing, and you should ensure you are getting data in the format you expect.
As for finishing the request after marking it bad, I have not used sails, but I was able to end execution in the past by using res.send(). EDIT: after looking at the docs, it seems this is done for you by .badRequest(), so ignore that part.
That said, even THAT is not actually your problem. Your problem is that you start an asynchronous User.find(), and then you immediately start running User.create() (also asynchronously), and so your request doesn't get marked bad until after you have already attempted to create a new user.
What you need to do is one of two things:
Use promises (NOTE: this is how it works for Mongoose; Sails may be different) so that User.create() only runs after User.find() has completed, e.g.:
var userQuery = User.findOne({ email: req.body.email }).exec();
userQuery.addBack(function (err, user) {
  if (!!user) res.badRequest('...');
  else create_user();
});
Put your user creation logic inside the findOne callback, e.g.:
User.findOne({ email: req.body.email }, function (err, user) {
  if (err) {
    // handle error
  } else if (user) {
    // email is taken, so reject the request
    return res.badRequest('Unique email constraint. Email is already used.');
  } else {
    User.create(...);
  }
});
Personally, I would advise that you use promises (especially later, when you have long chains of requests happening one after another), but take your pick.
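For completeness, with a promise-returning ORM method (as in more recent Mongoose and Waterline versions) the whole flow reads naturally with async/await; a sketch under that assumption:

// Reject early if the email is taken; otherwise create the user.
async function createUser(req, res) {
  const existing = await User.findOne({ email: req.body.email });
  if (existing) {
    return res.badRequest('Unique email constraint. Email is already used.');
  }
  const user = await User.create(req.body);
  return res.json(user);
}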

node.js and socket.io: different connections for different "sessions"

I've got a node.js application that 'streams' tweets to users. At the moment, it just searches Twitter for a hard-coded string, but I'd like to allow users to configure this in the URL (eg. by visiting /?q=stackoverflow).
At the moment, my code looks a bit like this:
app.get('/', function (req, res) {
  // page rendering skipped
  io.sockets.on('connection', function (socket) {
    twit.stream('user', { track: 'stackoverflow' }, function (stream) {
      stream.on('data', function (data) {
        socket.volatile.emit('tweet', data);
      });
    });
  });
});
The question is, how do I make it so that each user can see a different stream of tweets simultaneously? At the moment, it works fine in a single browser tab, but it falls over as soon as a second one is opened - and the error is fairly deep down inside socket.io. Am I misusing it?
I haven't fully got my head around socket.io yet, so that could be the issue.
Thanks in advance!
Every time a new request comes in, you are redefining the connection callback with io.sockets.on. You should move that block of code outside of app.get, after the initialization statement of the io object.
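A sketch of that restructure, with the hard-coded term replaced by a per-socket value; here it's assumed the client passes its search term in the socket.io handshake query (the q name is illustrative):

// Register the connection handler once, at startup, not per request.
io.sockets.on('connection', function (socket) {
  var q = socket.handshake.query.q || 'stackoverflow';
  twit.stream('user', { track: q }, function (stream) {
    stream.on('data', function (data) {
      socket.volatile.emit('tweet', data);
    });
    // Tear down the Twitter stream when the socket disconnects
    // (assuming the stream object exposes destroy()).
    socket.on('disconnect', function () {
      stream.destroy();
    });
  });
});

app.get('/', function (req, res) {
  // page rendering only; no socket wiring here
});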