Can anyone see why this would crash my app server?
// note_routes.js
module.exports = function(app, db) {
  app.post('/notes', (req, res) => {
    console.log(req.body);
    res.send('Hello');
  });
};
I am following a tutorial on Medium on how to build a simple Node server API: https://medium.freecodecamp.org/building-a-simple-node-js-api-in-under-30-minutes-a07ea9e390d2
Yet I can't figure out how to get past this error.
Any tips would be appreciated.
To close off this question, I'll summarize in an answer.
app needs to be passed as a reference to the Express app object, not as a string.
Change this:
require('./app/routes')('app',{})
to this:
require('./app/routes')(app,{})
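A minimal sketch (no Express needed) of why the string version crashes: the routes module immediately calls app.post, which exists on the real Express app object but not on the string 'app'. fakeApp is a hypothetical stand-in used here so the snippet runs on its own.

```javascript
// The routes module from the question: it calls app.post right away.
function routes(app, db) {
  app.post('/notes', function (req, res) {
    res.send('Hello');
  });
}

// A stand-in for an Express app, recording what gets registered:
const fakeApp = {
  registered: [],
  post(path, handler) { this.registered.push(path); }
};

routes(fakeApp, {});   // works: fakeApp.post is a function
try {
  routes('app', {});   // TypeError: app.post is not a function
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```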
I was attempting to make a post request with FormData to my local server using axios inside React. Here is the method that was posting to the server:
uploadHandler = () => {
  const formData = new FormData();
  formData.append(
    "myFile",
    this.state.selectedFile,
    this.state.selectedFile.name
  );
  axios({
    method: "post",
    url: "/",
    data: formData
  }).then(response => {
  });
};
For the last three hours, I kept getting this error in my Chrome console when firing the method:
net::ERR_CONNECTION_ABORTED. I could not figure out what was wrong and tried adding various headers to the request, but nothing worked.
Then, all of a sudden, it just WORKED. I had no idea why, so I hit Ctrl+Z until I could pinpoint the exact change that got it to work. The hunt turned up nothing, and now I can't reproduce the error no matter how hard I try. It's driving me crazy not knowing what issue ate three hours of my day!
Has anyone experienced something similar? Am I missing something here? Any input is greatly appreciated!
Overview
I've built an application with Vue, Express and MongoDB (mongoose ORM).
On loading the landing page, a series of GET requests are made for various bits of data. The loading times are extremely high; I've recorded times as high as 22s for a particular route. It's led me to believe that my requests are running sequentially, despite my logic specifying that everything should run async.
I've tried reducing the size of the objects being returned from the requests, as well as using the .lean() method. These attempts shaved off a couple of seconds, but the overall issue is not remotely solved; times are still stupidly high. To give an example:
From This:
// Method to find all users
var users = await User.find({});
To:
// Method to find all users
var users = await User.find({}, "username uid").lean();
On the page in question, there are about 5 main components, and each component makes a GET request. One of these is a Chat Column, and the code for it is as follows:
ChatCol.vue
beforeMount () {
  this.$store.dispatch('retrieve_chat')
}
Store.js (am using Vuex store)
retrieve_chat (context) {
  return new Promise((resolve, reject) => {
    axios({
      url: api.dev + 'api/v1/chat',
      method: 'GET',
    })
    .then(res => {
      context.commit('set_chat', res.data)
      resolve(res);
    }).catch(err => {
      // alert(err)
      reject(err);
    })
  })
},
Requests in this format are being made on all the components. About 5 of them in the page in question.
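As an aside, the explicit new Promise wrapper in retrieve_chat is unnecessary: axios() already returns a promise, so the action can return the chain directly. A sketch of that simplification; fakeAxios is a hypothetical stand-in for axios so the snippet runs on its own.

```javascript
// Hypothetical stand-in for axios, resolving with a pretend server response:
function fakeAxios(config) {
  return Promise.resolve({ data: ['hello'] });
}

// The same action without the explicit-Promise wrapper:
function retrieve_chat(context) {
  return fakeAxios({ url: '/api/v1/chat', method: 'GET' })
    .then(res => {
      context.commit('set_chat', res.data);
      return res; // callers can still .then() on the dispatch
    });
}
```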
Backend / Server Code
To give some context into the requests being made.
The client hits the route 'http://localhost:3000/api/v1/chat', and the handler on the server is the following:
var Chat = require("../models/ChatMessage");
module.exports = {
  // Limit chat to the latest 30 messages
  async get_chat(req, res) {
    Chat.find({}, function(err, messages) {
      if (err) {
        return res.status(500).send({
          message: "Internal Server Error",
          type: "MONGO_CHAT_DOCUMENT_QUERY",
          err: err,
        })
      }
      if (!messages) {
        return res.status(400).send({
          message: "Resource not found",
          type: "MONGO_CHAT_DOCUMENT_QUERY",
          details: "!messages - no messages found",
        })
      }
      messages.reverse();
      return res.status(200).json({
        messages,
      });
    }).sort({"_id": -1}).limit(30);
  },
}
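What the route computes - the newest 30 messages, reversed back into chronological order - can be sketched with plain arrays, no Mongoose needed:

```javascript
// Pure sketch of the sort/limit/reverse pipeline from get_chat:
function latestChronological(messages, n) {
  return messages
    .slice()                          // copy so the input isn't mutated
    .sort((a, b) => b._id - a._id)    // newest first, like .sort({_id: -1})
    .slice(0, n)                      // like .limit(n)
    .reverse();                       // oldest first, for display
}

console.log(latestChronological([{ _id: 1 }, { _id: 3 }, { _id: 2 }], 2));
// → [ { _id: 2 }, { _id: 3 } ]
```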
If I look at the network tab in the Chrome dev tools, this is how the requests appear. Apologies for the long-winded post; I literally have no idea what is causing this.
Important Note:
It was mentioned to me that MongoDB locks when mutating data, and I thought that might be the case, but there are no mutations taking place. It's just 3-4 GET requests happening in parallel; admittedly pretty big requests, but they shouldn't be taking as long as they are.
Screenshot of the network tab:
(ignore the failed req, and some of the duplicate named requests)
Stack Overflow senpais, please help. It's a very big application and I don't know what the issue is exactly, so if I've missed out any details, apologies; I'll clarify anything that needs clarity.
A large amount of base64-encoded data from a previously abandoned and poorly implemented image-upload feature was being stored in each chat message (as well as other places), causing large amounts of data to be loaded in and ultimately leading to the huge loading times.
Thank you Neil Lunn.
Just to make it clear, I'm using the MongoDB, Express, React and Node stack.
I'm trying to learn React right now. I've got the basics down and am able to code a simple React app with a router. I've also tried server-side rendering a simple React app, and it works perfectly. However, I'm kind of stuck now that I want to make a full app with a REST API and server-side rendering.
1) I don't know how I should go about separating the API and the React code in the server file. Would starting off by listing the API routes and then doing the server-side rendering work?
Like so:
app.get('/api/whatever', function(req, res) {
  // get whatever
});

app.get('*', function(req, res) {
  // match routes and renderToString React
});
2) Also, the reason I couldn't even test the above is that when I try to run the server with nodemon, it throws an error because it doesn't understand the React code. How should I go about this? Should I configure nodemon to read ES6, have it ignore that code, or configure webpack to run the Express server?
3) The final question, which could clear this whole story up quite easily: I've tried finding an answer but got many conflicting ones instead. Are the Google crawlers capable of crawling a React app? I'm learning server-side rendering for SEO; is all of that really necessary?
Sorry for the long question, looking forward to reading your answers.
I do it the same way you do in your code example in the project I'm currently working on – I match * and then use React Router to render different pages. I wrote a blog article about this, with code examples.
In the setup I have, I use webpack to compile my backend code, just like I do with the frontend code. I use the watch mechanism to listen for code changes and automatically restart the Node server after recompiling, so there's no need for nodemon.
#!/usr/bin/env node
const path = require('path');
const webpack = require('webpack');
const spawn = require('child_process').spawn;

const serverConfig = require('./webpack.config.server');
const compiler = webpack(serverConfig);

const watchConfig = {
  aggregateTimeout: 300,
  poll: 1000,
  ignored: '**/*.scss'
};

let serverControl;

compiler.watch(watchConfig, (err, stats) => {
  if (err) {
    console.error(err.stack || err);
    if (err.details) {
      console.error(err.details);
    }
    return;
  }

  const info = stats.toJson();

  if (stats.hasErrors()) {
    info.errors.forEach(message => console.log(message));
    return;
  }

  if (stats.hasWarnings()) {
    info.warnings.forEach(message => console.log(message));
  }

  // kill the previous server instance before starting the freshly compiled one
  if (serverControl) {
    serverControl.kill();
  }

  serverControl = spawn('node', [path.resolve(__dirname, '../../dist/polly-server.js')]);
  serverControl.stdout.on('data', data => console.log(`${new Date().toISOString()} [LOG] ${data}`));
  serverControl.stderr.on('data', data => console.error(`${new Date().toISOString()} [ERROR] ${data}`));
});
Yes, Google crawls client-side React code, but server-side rendering is still a good idea, because crawl results may be inconsistent, especially if parts of the page load dynamically after Ajax calls.
I've got a Node.js application that 'streams' tweets to users. At the moment, it just searches Twitter for a hard-coded string, but I'd like to allow users to configure this in the URL (e.g. by visiting /?q=stackoverflow).
At the moment, my code looks a bit like this:
app.get('/', function (req, res) {
  // page rendering skipped
  io.sockets.on('connection', function (socket) {
    twit.stream('user', {track: 'stackoverflow'}, function(stream) {
      stream.on('data', function (data) {
        socket.volatile.emit('tweet', data);
      });
    });
  });
});
The question is, how do I make it so that each user can see a different stream of tweets simultaneously? At the moment, it works fine in a single browser tab, but it falls over as soon as a second one is opened, and the error is fairly deep down inside socket.io. Am I misusing it?
I haven't fully got my head around socket.io yet, so that could be the issue.
Thanks in advance!
Every time a new request comes in, you are redefining the connection callback with io.sockets.on. You should move that block of code outside of app.get, after the initialization of the io object.
I'm working on my first node.js / express / mongoose app, and I'm facing a problem due to the asynchronous nature of node.js. It seems I'm not doing things correctly...
Here is the test route I defined using express:
app.get('/test', function(req, res){
  var mod = mongoose.model('MyModel');
  mod.find({}, function(err, records){
    records.forEach(function(record){
      console.log('Record found:' + record.id);
      // res.send('Thing retrieved:' + record.id);
    });
  });
});
When I issue a http://localhost/test, I'd like to get the list of records of type 'MyModel' in the response.
The code above works fine, but when it comes to returning the whole list to the client it does not work (the commented res.send line): only the first record gets returned.
I'm very new to node.js, so I don't know whether embedding several callback functions within the first callback of app.get is the right approach. How can I have the whole list returned?
Any idea ?
What you should be doing is:
mod.find({}, function(err, records){
  res.writeHead(200, {'Content-Type': 'text/plain'});
  records.forEach(function(record){
    res.write('Thing retrieved:' + record.id);
  });
  res.end();
});
Please always check the documentation as well:
http://nodejs.org/docs/v0.3.8/api/http.html#response.write
I missed that you were using Express; the send function is part of Express and extends Node's ServerResponse object (my bad).
But my answer still applies: Express's send function sends the data using ServerResponse.end(), so the socket gets closed and you cannot send any more data. Using the write function uses the native function directly.
You may also want to call res.end() when the request is fully completed, as some items within Express may be affected.
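The write-then-end flow can be sketched with a stand-in response object, so it runs without Express or Mongoose:

```javascript
// One chunk per record, then close the response explicitly:
function sendRecords(res, records) {
  records.forEach(function (record) {
    res.write('Thing retrieved:' + record.id + '\n');
  });
  res.end(); // close the response once every record has been written
}

// Stand-in response object that just records what it was given:
const chunks = [];
const fakeRes = {
  write: chunk => chunks.push(chunk),
  end: () => chunks.push('[end]')
};

sendRecords(fakeRes, [{ id: 'a' }, { id: 'b' }]);
console.log(chunks);
// [ 'Thing retrieved:a\n', 'Thing retrieved:b\n', '[end]' ]
```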