node.js and socket.io: different connections for different "sessions" - sockets

I've got a node.js application that 'streams' tweets to users. At the moment, it just searches Twitter for a hard-coded string, but I'd like to allow users to configure this in the URL (e.g. by visiting /?q=stackoverflow).
At the moment, my code looks a bit like this:
app.get('/', function (req, res) {
  // page rendering skipped
  io.sockets.on('connection', function (socket) {
    twit.stream('user', {track: 'stackoverflow'}, function (stream) {
      stream.on('data', function (data) {
        socket.volatile.emit('tweet', data);
      });
    });
  });
});
The question is, how do I make it so that each user can see a different stream of tweets simultaneously? At the moment, it works fine in a single browser tab, but it falls over as soon as a second one is opened - and the error is fairly deep down inside socket.io. Am I misusing it?
I haven't fully got my head around socket.io yet, so that could be the issue.
Thanks in advance!

Every time a new request comes in, you are registering another connection callback with io.sockets.on - you should move that block of code outside of app.get, after your initialization statement of the io object.
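As a minimal sketch of that change (assuming the client passes its search term in the socket.io connection query and that your socket.io version exposes it on socket.handshake.query - both are assumptions, not part of the original code):

io.sockets.on('connection', function (socket) {
  // Registered once, right after the io object is created - not inside app.get.
  // Hypothetical: read a per-user search term from the connection query string.
  var track = (socket.handshake.query && socket.handshake.query.q) || 'stackoverflow';

  twit.stream('user', { track: track }, function (stream) {
    stream.on('data', function (data) {
      socket.volatile.emit('tweet', data);
    });

    // Stop this user's stream when they disconnect (the exact method depends on the Twitter client).
    socket.on('disconnect', function () {
      stream.destroy();
    });
  });
});

app.get('/') then only renders the page; each socket gets its own stream, so multiple tabs no longer interfere with each other.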

Extremely high loading times - Requests not running async. Mongoose

Overview
I've built an application with Vue, Express and MongoDB (mongoose ORM).
On loading the landing page, a series of GET requests are made for various bits of data. The loading times are extremely high; I've recorded times as high as 22s for a particular route. It's led me to believe that my requests are running sequentially, despite my logic specifying that everything should run async.
I've tried reducing the size of the objects being returned from the requests as well as using the .lean() method. These attempts shaved off a couple of seconds, but the overall issue is not remotely sorted. Times are still stupid high. To give an example:
From This:
// Method to find all users
var users = await User.find({});
To:
// Method to find all users
var users = await User.find({}, "username uid").lean();
On the page in question, there are about 5 main components. Each component is making a GET request. One of these is a Chat Column and the code for it is as follows:
ChatCol.vue
beforeMount () {
  this.$store.dispatch('retrieve_chat')
}
Store.js (I am using the Vuex store)
retrieve_chat (context) {
  return new Promise((resolve, reject) => {
    axios({
      url: api.dev + 'api/v1/chat',
      method: 'GET',
    })
      .then(res => {
        context.commit('set_chat', res.data)
        resolve(res);
      })
      .catch(err => {
        // alert(err)
        reject(err);
      })
  })
},
Requests in this format are being made by all of the components, about 5 of them on the page in question.
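For what it's worth, one way to check that those requests really do start concurrently is to dispatch them together from the parent page and await the group; a rough sketch (all action names other than retrieve_chat are hypothetical):

// e.g. in the landing page component
async mounted () {
  // All dispatches start immediately; Promise.all only waits for them all to settle.
  await Promise.all([
    this.$store.dispatch('retrieve_chat'),
    this.$store.dispatch('retrieve_users'),         // hypothetical action names
    this.$store.dispatch('retrieve_posts'),
    this.$store.dispatch('retrieve_notifications'),
    this.$store.dispatch('retrieve_settings')
  ])
}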
Backend / Server Code
To give some context on the requests being made: the client hits the route 'http://localhost:3000/api/v1/chat', and the code that handles the request on the server is the following:
var Chat = require("../models/ChatMessage");

module.exports = {
  // LIMIT CHAT TO 30 MESSAGES
  async get_chat(req, res) {
    Chat.find({})
      .sort({ "_id": -1 })
      .limit(30)
      .exec(function(err, messages) {
        if (err) {
          return res.status(500).send({
            message: "Internal Server Error",
            type: "MONGO_CHAT_DOCUMENT_QUERY",
            err: err,
          });
        }
        if (!messages) {
          return res.status(400).send({
            message: "Resource not found",
            type: "MONGO_CHAT_DOCUMENT_QUERY",
            details: "!messages - no messages found",
          });
        }
        messages.reverse();
        return res.status(200).json({
          messages,
        });
      });
  },
};
If I look at the network tab in the Chrome dev tools, this is how the requests appear (screenshot below). Apologies for the long-winded post; I literally have no idea what is causing this.
Important Note:
It was mentioned to me that MongoDB locks when mutating data, and I thought that might be the case, but there are no mutations taking place. It's just 3-4 GET requests happening in parallel, albeit pretty big ones, but they shouldn't be taking as long as they are.
Screenshot of the network tab:
(ignore the failed req, and some of the duplicate named requests)
Stack Overflow senpais, please help. It's a very big application and I don't know exactly what the issue is, so if I've missed any details, apologies - I'll clarify anything that needs clarity.
A large amount of base64-encoded data from a previously abandoned and poorly implemented image upload feature was being stored in each chat message (as well as in other places), causing large amounts of data to be loaded and ultimately leading to the huge loading times.
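For anyone hitting the same thing, the practical fix is to project the heavy field out of the chat query, or to move the binary data out of the documents entirely. A sketch (the imageData field name is hypothetical; it stands in for whatever field holds the embedded base64 payload):

async function get_chat(req, res) {
  const messages = await Chat.find({})
    .select("-imageData")   // exclude the heavy (hypothetical) field from the result set
    .sort({ _id: -1 })
    .limit(30)
    .lean();

  res.status(200).json({ messages: messages.reverse() });
}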
Thank you Neil Lunn.

self.addEventListener('fetch', function(e) { }) is not working

I have a question about PWAs and would be glad if someone could help me with it. In my PWA I don't have any problem storing static files like HTML, JS and CSS, but I am facing issues with dynamic data: my self.addEventListener('fetch', function(e) { }) is not getting called, while the other functionality works fine, i.e. the 'install' and 'activate' events.
To be more particular, I am using @angular/service-worker, which worked fine, but I created another service worker file called sw.js. In sw.js I'm listening for events like 'install', 'activate' and 'fetch'. My sw.js 'fetch' handler is not getting called, whereas the other two work well; while fetching, only the fetch handler of ngsw-worker.js gets called.
The thing I need is to make CRUD Operations in PWA with angular.
Thanks in advance!
You can do dynamic caching like below; the service worker will intercept every request and add the response to the cache.
self.addEventListener("fetch", function (event) {
  event.respondWith(
    caches.open("dynamiccache").then(function (cache) {
      return fetch(event.request).then(function (res) {
        cache.put(event.request, res.clone());
        return res;
      });
    })
  );
});
Note: you can't cache POST requests.
Can service workers cache POST requests?
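Since cache.put() rejects anything other than GET, a small guard keeps the handler above from failing on other methods; a sketch building on the same snippet:

self.addEventListener("fetch", function (event) {
  // cache.put() throws for non-GET requests, so let those pass straight through to the network.
  if (event.request.method !== "GET") {
    return;
  }

  event.respondWith(
    caches.open("dynamiccache").then(function (cache) {
      return fetch(event.request).then(function (res) {
        cache.put(event.request, res.clone());
        return res;
      });
    })
  );
});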

How to render react on server-side with an api?

Just to make it clear, I'm using the MongoDB, Express, React and Node stack.
I'm trying to learn react.js right now. I've got the basics right and I am able to code a simple React app with a router. I've also tried server-side rendering a simple React app and it works perfectly. However, I'm kind of stuck now that I want to make a full app with a REST API and server-side rendering.
1) I don't know how I should go about separating the API and the React code in the server file. Would starting off by listing the API routes and then doing the server-side rendering work?
Like so:
app.get('/api/whatever', function(req, res) {
  // get whatever
});
app.get('*', function(req, res) {
  // match routes and renderToString React
});
2) Also, the reason I couldn't even test the above is that when I try to run the server with nodemon, it throws an error because it doesn't understand the React code. How should I go about this? Should I somehow configure nodemon to read ES6, ignore it, or configure webpack to run the Express server?
3) The final question, which could clear up this whole story quite easily: I've tried finding an answer but got many conflicting ones instead. Are Google's crawlers capable of crawling a React app? I'm learning server-side rendering for SEO; is that all really necessary?
Sorry for the long question, looking forward to reading your answers.
I do it the same way you do in your code example in the project I'm currently working on: I match * and then use React Router to render different pages. I wrote a blog article about this, with code examples.
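As a rough sketch of what that catch-all handler can look like (the App component and the use of StaticRouter from react-router-dom v4/v5 are assumptions, not taken from the question):

const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const { StaticRouter } = require('react-router-dom'); // v4/v5 API
const App = require('./App'); // hypothetical root component (adjust for default exports)

const app = express();

// API routes are declared first so the catch-all never swallows them.
app.get('/api/whatever', function (req, res) {
  res.json({ ok: true });
});

// Everything else is handed to React Router on the server.
app.get('*', function (req, res) {
  const context = {};
  const markup = renderToString(
    React.createElement(StaticRouter, { location: req.url, context: context },
      React.createElement(App)
    )
  );
  res.send('<!DOCTYPE html><html><body><div id="root">' + markup + '</div></body></html>');
});

app.listen(3000);

createElement is used instead of JSX here only so the sketch would run without a build step; in practice the server file is compiled, as described below.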
In the setup I have, I use webpack to compile my backend code, just like I do with the frontend code. I use the watch mechanism to listen for code changes and automatically restart the node server after recompiling. No need for nodemon.
#!/usr/bin/env node
const path = require('path');
const webpack = require('webpack');
const spawn = require('child_process').spawn;

const serverConfig = require('webpack.config.server');
const compiler = webpack(serverConfig);
const watchConfig = {
  aggregateTimeout: 300,
  poll: 1000,
  ignored: '**/*.scss'
};

let serverControl;

compiler.watch(watchConfig, (err, stats) => {
  if (err) {
    console.error(err.stack || err);
    if (err.details) {
      console.error(err.details);
    }
    return;
  }

  const info = stats.toJson();

  if (stats.hasErrors()) {
    info.errors.forEach(message => console.log(message));
    return;
  }

  if (stats.hasWarnings()) {
    info.warnings.forEach(message => console.log(message));
  }

  if (serverControl) {
    serverControl.kill();
  }

  serverControl = spawn('node', [path.resolve(__dirname, '../../dist/polly-server.js')]);

  serverControl.stdout.on('data', data => console.log(`${new Date().toISOString()} [LOG] ${data}`));
  serverControl.stderr.on('data', data => console.error(`${new Date().toISOString()} [ERROR] ${data}`));
});
Yes, Google crawls client-side React code, but server-side rendering is still a good idea, because crawl results may be inconsistent, especially if you load parts of the page dynamically after Ajax calls.

What are the options for offline registration and forms?

I have a project that caters for individuals with poor internet connections in predominantly rural areas. I need to allow users to download (or use any other applicable means) and fill out details offline; then, when they are ready and the internet connection is available, the data filled out offline should sync with the online database and produce a report.
The offline form also needs the same validation as online, to ensure no time wastage.
What are the options? I know that HTML5 has an offline application ability. I would prefer an open source option which will allow people with intermittent internet issues to continue filling out a form or series of forms even though the internet has dropped, with the data syncing when the internet reconnects.
So what are the best options? Requiring the user to download a large application is also not ideal; I would prefer a browser-based or small-download solution, maybe even a way of downloading a validatable form in some format for re-upload.
This is something I've been muddling through myself as some of the users of the site I am currently tasked with building have poor connections or would like to fill in forms away from a network for various reasons. Depending on your precise needs and your customer's browser compatibility, the solution I've decided to go with is to use the HTML5 cache capability you mention in your post.
The amount of data stored is not that great, and it will mean that the webpage you want them to fill in is available offline.
If you couple this with the localStorage interface you can keep all form submissions until they regain connection.
As an example of my current solution:
The cache.php file, to write the manifest
<?php
header("Content-Type: text/cache-manifest");
echo "CACHE MANIFEST\n";

$pages = array(
  //an array of the pages you want cached for later
);

foreach($pages as $page) {
  echo $page."\n";
}

$time = new DateTime("now");
//this makes sure that the cache is different when the browser checks it
//otherwise the cache will not be rebuilt even if you change a cached page
echo "#Last Build Time: ".$time->format("d m Y H:i:s T");
You can then have a simple Ajax script checking for connection:
setInterval( function() {
  $.ajax({
    url: 'testconnection.php',
    type: 'post',
    data: { 'test' : 'true' },
    success: function() {
      //connection is back, clear the flag
      noCon = false;
    },
    error: function(XHR, textStatus, errorThrown) {
      if(textStatus === 'timeout') {
        //update a global var saying connection is down
        noCon = true;
      }
    }
  });
  if(hasUnsavedData && !noCon) {
    //using the key/value pairs in localstorage, put together a data object and ajax it into the database
    //once complete, set hasUnsavedData back to false to prevent refiring this until we have new data
    //also use localStorage.removeItem(key) to clear out all localstorage info
  }
}, 20000 /*medium gap between calls, do whatever works best for you here*/);
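As a rough sketch of the sync step described in the comments above (the saveform.php endpoint is an assumption - use whatever script writes your form data to the database):

function syncSavedData() {
  var data = {};
  // Collect everything that was stashed in localStorage while offline.
  for (var i = 0; i < localStorage.length; i++) {
    var key = localStorage.key(i);
    data[key] = localStorage.getItem(key);
  }

  $.ajax({
    url: 'saveform.php', // hypothetical endpoint that writes the form data to the database
    type: 'post',
    data: data,
    success: function() {
      localStorage.clear();   // clear out the stored values
      hasUnsavedData = false; // prevent refiring until there is new data
    }
  });
}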
Then, for your form submission script, use localStorage if that noCon variable is set to true:
$(/*submit button*/).on("click", function(event) {
  event.preventDefault();
  if(noCon) {
    //go through all inputs in some way and put to localstorage, your method is up to you
    $("input").each( function() {
      var key = $(this).attr("name"), val = $(this).val();
      localStorage[key] = val;
    });
    //update a global variable to let the script above know to save information
    hasUnsavedData = true;
  } else {
    //or if there's connection, submit the form in some manner
    $("form").submit();
  }
});
I've not tested every script on this page, but they're written based on the skeleton of what my current solution is doing, minus a lot of error checking etc., so hopefully they will give you some ideas on how to approach this.
Suggestions for improvements are welcome.

node.js: expressjs with mongoose

I'm working on my first node.js / express / mongoose app and I'm facing a problem due to the asynchronous mechanism of node.js. It seems I'm not doing things correctly...
Here is the test route I defined using express:
app.get('/test', function(req, res){
  var mod = mongoose.model('MyModel');
  mod.find({}, function(err, records){
    records.forEach(function(record){
      console.log('Record found:' + record.id);
      // res.send('Thing retrieved:' + record.id);
    });
  });
});
When I issue a http://localhost/test, I'd like to get the list of records of type 'MyModel' in the response.
The code above works fine, but when it comes to returning the whole list to the client (the commented-out res.send line), it does not work and only returns the first record.
I'm very new to node.js, so I do not know whether embedding several callback functions within the first callback function of app.get is the right approach. How can I have the whole list returned?
Any ideas?
What you should be doing is:
mod.find({}, function(err, records){
  // the body length isn't known up front, so set a Content-Type rather than Content-Length
  res.writeHead(200, {'Content-Type': 'text/plain'});
  records.forEach(function(record){
    res.write('Thing retrieved:' + record.id);
  });
  res.end();
});
Please always check the documentation as well:
http://nodejs.org/docs/v0.3.8/api/http.html#response.write
I missed that you were using Express; the send function is part of Express and extends node's ServerResponse object (my bad).
But my answer still applies: Express's send function sends the data using ServerResponse.end(), so the socket gets closed and you cannot send any more data, whereas the write function uses the native function.
You may also want to call res.end() when the request is fully completed, as some items within Express may be affected.
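Alternatively, since you are already on Express, you can build the whole body first and send it in a single response; a sketch (res.status is Express 4+ syntax, adjust for older versions):

app.get('/test', function (req, res) {
  var mod = mongoose.model('MyModel');
  mod.find({}, function (err, records) {
    if (err) {
      return res.status(500).send('Query failed'); // Express 4+ style error response
    }
    // Build one string (or simply use res.json(records)) and send a single response.
    var body = records.map(function (record) {
      return 'Thing retrieved:' + record.id;
    }).join('\n');
    res.send(body);
  });
});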