I am using sails.js v0.10.5.
I have a long-running controller action that takes more than 2 minutes to complete, which causes the sails server to time out. How can I override this timeout and set it to my own value, say 10 minutes? Please specify which file and what changes I need to make to increase the sails server timeout.
You can find out which action is delaying the startup process:
sails --verbose lift
Most of the time, the ORM causes this kind of issue.
UPDATE
I think the timeout is provided by Express; if you want to modify it, you can use connect-timeout.
1. Install it:
$ npm install connect-timeout --save
2. Add the middleware to config/http.js:
var timeout = require('connect-timeout');
......
order: [ 'timeout' ....]
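Putting those pieces together, a minimal sketch of what config/http.js could look like (the 'timeout' name and the 10-minute value are my own choices for illustration, and the rest of the order array mirrors the Sails defaults):

// config/http.js
var timeout = require('connect-timeout');

module.exports.http = {
  middleware: {
    // register connect-timeout under a name we can reference in `order`
    timeout: timeout('10m'), // 10 minutes; adjust to taste

    order: [
      'timeout', // run first so the whole request gets the longer limit
      'startRequestTimer',
      'cookieParser',
      'session',
      'bodyParser',
      'handleBodyParserError',
      'compress',
      'methodOverride',
      'poweredBy',
      '$custom',
      'router',
      'www',
      'favicon',
      '404',
      '500'
    ]
  }
};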
My answer is pretty similar to the one provided by Ryan, but I'll expand the example a bit more for clarity. Also, I don't see the need for the dependency installation there.
TL;DR: Add a custom middleware that amends req to increase the timeout.
In /config/http.js:
order: [
  'startRequestTimer',
  'cookieParser',
  'session',
  'myRequestLogger',
  'contentTypeCheck',
  'extendTimeout', // <--- Order matters somewhat; adding it before the body parser works without issue
  'bodyParser',
  'handleBodyParserError',
  'compress',
  'methodOverride',
  'poweredBy',
  '$custom',
  'router',
  'www',
  'favicon',
  '404',
  '500'
],

extendTimeout: function (req, res, next) {
  req.setTimeout(300 * 1000); // Increase the request timeout to 5 minutes
  return next();
},
// Other custom middleware
You can also set req.setTimeout(0) if you don't want any timeout at all, but that could be a bit risky.
Keep in mind that this sets the timeout for all routes. It can be overridden on a per-route basis by simply calling req.setTimeout(<some timeout>) as the first line of the controller action handling the route, as in the sketch below.
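For example, a minimal per-route override could look like this (the controller and action names are made up for illustration):

// api/controllers/ReportController.js (hypothetical)
module.exports = {
  generate: function (req, res) {
    // Override the global value just for this route: 10 minutes
    req.setTimeout(10 * 60 * 1000);

    // ...long-running work goes here...
    return res.ok();
  }
};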
yjimk, thanks for your idea... it wasn't working for me at first, but I found a solution.
The problem was in the extendTimeout function; here is the fixed version:
extendTimeout: (function () {
  console.log('Initializing `extendTimeout`...');
  return function (req, res, next) {
    sails.log.info('extendTimeout to 1h');
    req.setTimeout(3600000);
    return next();
  };
})(),
Related
Consider the following:
$router->group([
'prefix' => 'api/v1/group',
'middleware' => 'auth'
], function () use ($router) {
$router->get('/', [
'as' => 'group.list',
'uses' => 'Api\V1\GroupController#list'
]);
$router->post('/', [
'as' => 'group.create',
'uses' =>'Api\V1\GroupController#create'
]);
$router->get('/{groupUUID}', [
'as' => 'group.retrieve',
'uses' =>'Api\V1\GroupController#retrieve'
]);
$router->put('/{groupUUID}', [
'as' => 'group.update',
'uses' => 'Api\V1\GroupController#update'
]);
});
As you can see, a pretty typical route setup. However, I'm seeing some incredibly odd behaviour: in short, the POST route seems to be interpreted by the app as a GET route. When I make a POST request to api/v1/group (via Postman) I don't see the result of Api\V1\GroupController#create, but the result of Api\V1\GroupController#list.
I wondered if perhaps this had something to do with both routes having the same endpoint (it shouldn't matter, but maybe it's different in Lumen? I usually work in full-on Laravel). So I commented out the GET route. That just gave me a 404.
I then wondered if perhaps this entire route group was somehow broken. So I made two catchall endpoints:
$router->get('/{any:.*}', function () use ($router) {
    return 'I am a get route';
});
$router->post('/{any:.*}', function () use ($router) {
    return 'I am a post route';
});
And placed them at the top of the routes file, and commented out all other routes. Regardless of the route I hit or the method used, I always saw the same thing: I am a get route.
What's going on? What could cause my app to understand all POST requests as GET requests?
PS: It's also worth noting that these routes were working, until recently, without any real associated changes. Could something have been updated in a Lumen package that caused this?
PPS: I also tried using Insomnia instead of Postman, just in case it was a problem with Postman. Same result.
$router->get('/api/item/{table}/{id}', "ItemController#itemHandler");
$router->post('/api/item/{table}', "ItemController#itemHandler");
$router->put('/api/item/{table}/{id}', "ItemController#itemHandler");
$router->delete('/api/item/{table}/{id}', "ItemController#itemHandler");
I had pretty much the same issue. In my case, since I use Laravel Valet as my development environment, I was able to make POST requests again after serving the API locally over HTTP by executing valet unsecure my-project. On my production server I can still use HTTPS, but for my local development environment this solved the issue. Hope this helps some future readers.
Try api/v1/group/ (with a trailing slash). If the server issues a redirect to add or strip the slash, most clients follow that redirect with a GET, which would explain POST requests appearing to hit the GET route.
I'm in the middle of upgrading our API from Sails v0.12 -> v1, which was prompted by the use of self-validating machines for controller actions. After finally getting through a ton of headache replacing deprecated code, I've landed in a rough spot...
With v0.12 (rather, with the older "req, res" controller style), one could use custom response handlers across the board. I've taken advantage of this and have request logging at the end of all our response types (with some additional sugaring of the data). This was done to log all requests in the database, so we can get insights into what our production servers are doing (because they are load-balanced, having a central place to view this is a must, and this was an easy route to take).
So now, my problem is moving forward with "Actions2" machine-style actions. How does one use these custom response types in them? Are we being forced to repeat ourselves in our exits? I can't find any good documentation to help guide this process, nor can I find a consistent way to "hook" into the end of a response when using machines as actions. I can't find any documentation on what kind of options machines can give to Sails.
@Nelson, yes, I understand that, but at the time that wasn't what I wanted at all. I wanted all of the benefits of Actions2.
EDIT: While the original, crossed-out comment below does still work, the preferred way to use Actions2 with the custom responses folder paradigm is to do something similar to the following in an Actions2 file:
module.exports = {
  friendlyName: 'Human-friendly name of function',
  description: 'Long description of function and what it does.',

  inputs: {
    userCommand: {
      type: 'string',
      required: true,
      description: 'Long, human-readable description of the input'
    }
  },

  exits: {
    success: {
      responseType: 'chatbotResponse'
    }
  },

  fn: async function (inputs, exits) {
    // do some crazy stuff with the inputs, which have already been validated
    return exits.success('Woot');
  }
};
This ultimately routes through responses/chatbotResponse.js, which looks something like this:
module.exports = async function chatbotResponse(data) {
  let res = this.res,
      req = this.req;

  if (!data) {
    data = 'Something didn\'t go as planned...';
  }

  // how to call a Node Machine style helper with named inputs
  await sails.helpers.finalizeRequestLog.with({ req: req, res: res, body: { plainString: data } });

  return res.json(data);
};
ORIGINAL:
As it turns out, in the Actions2 function you just need to add the env param: async function(inputs, exits, env). The env will give you access to the req and res. So, if you have custom responses that perform special tasks (like request logging), you can just use return await env.res.customResponse('Hurray, you made a successful call!');
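As a rough sketch of that original approach (the action body is a placeholder, and it assumes a custom response file exists at api/responses/customResponse.js):

module.exports = {
  friendlyName: 'Do something',

  fn: async function (inputs, exits, env) {
    // env exposes the underlying req/res, so custom responses stay reachable
    return await env.res.customResponse('Hurray, you made a successful call!');
  }
};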
I am trying to connect to my socket using socket.io-client
I'm able to make a GET request using
this.socket.emit('get', {url: '/socket'}, (res) => console.log(res.body) );
But when I'm making a POST request I don't know how to pass data into the request:
this.socket.emit('post', {url: '/socket'},{message:"Hola"}, (res) => console.log(res.body) );
Try this.socket.post('<yourBaseURL>' + '/socket',{message: "Hola"});
This is a method that worked for me; see if it does for you too. Make sure that you use the complete POST URL and not just the relative path like in your example.
So I finally got it working after messing around with it for a while. The solution was that the options object needed a data field to tell it to put the payload in the body of the request:
socket.emit('post', {url: '/socket', data: {message: "Sending Successfully"}}, (res) => console.log(res.body) );
On the server you can now simply read req.body.message and the message will be right there.
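For completeness, a minimal sketch of the receiving side (the controller and action names are hypothetical; it only shows where the payload ends up):

// api/controllers/SocketController.js (hypothetical)
module.exports = {
  receive: function (req, res) {
    // with the `data` field on the client, the payload arrives in req.body
    sails.log.info('Got message over socket:', req.body.message);
    return res.json({ received: req.body.message });
  }
};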
I'm currently trying out the Sails framework and so far I'm quite impressed. One of the odd things I noticed, however, is that the server returns a 200 OK rather than a 304 Not Modified status code for all records (even when unchanged).
Is there a way to make Sails return a 304 for unmodified records? The reason I'm asking is that this seems to be the best practice and used by some big players like Google and Facebook.
The short answer is yes, you simply have to set the Last-Modified header on your responses.
"Sails is built on Express", which uses fresh (npmjs.org/package/fresh) to compare the request and response headers.
Simple example (based on Sails 0.10.0-rc5):
sails new test304response
cd test304response
sails generate api user -> generates User.js and UserController.js
edit api/models/User.js
module.exports = {
  schema: true,

  attributes: {
    name: {
      type: 'string',
      required: true
    }
  }
};
edit api/controllers/UserController.js
module.exports = {
  find: function (req, res, next) {
    console.log('find:', req.fresh);
    User.findOne(req.param('id'), function foundUser(err, user) {
      if (err) return next(err);
      // set the Last-Modified header to the updatedAt time stamp
      // from the model
      res.set('Last-Modified', user.updatedAt);
      res.json(user);
    });
  }
};
sails lift
go to localhost:1337/user/create?name=Joe -> creates a new User
go to localhost:1337/user/1 -> queries the User with id 1
refresh localhost:1337/user/1 -> queries the same User, Last-Modified wasn't changed
The response has a 304 Not Modified status (even in the Chrome DevTools, which do cache responses as long as you don't explicitly disable caching in the settings).
Disclaimer: I just started to learn Sails and also Node, so I might have missed a simpler/cleaner solution. I'm also not completely certain that setting Last-Modified is enough in all cases. However, I had the feeling that you were more interested in knowing whether it is possible than in getting a best-practice implementation.
Hope this helps. :)
I'm working on my first node.js / express / mongoose app and I'm facing a problem due to the asynchronous nature of node.js. It seems I'm not doing things correctly...
Here is the test route I defined using express:
app.get('/test', function (req, res) {
  var mod = mongoose.model('MyModel');
  mod.find({}, function (err, records) {
    records.forEach(function (record) {
      console.log('Record found:' + record.id);
      // res.send('Thing retrieved:' + record.id);
    });
  });
});
When I issue a http://localhost/test, I'd like to get the list of records of type 'MyModel' in the response.
The code above works fine, but when it comes to returning the whole list to the client, the commented res.send line does not work: it only returns the first record.
I'm very new to node.js, so I don't know whether embedding several callback functions within the first callback of app.get is the right approach. How can I have the whole list returned?
Any idea ?
What you should be doing is:
mod.find({}, function (err, records) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  records.forEach(function (record) {
    res.write('Thing retrieved:' + record.id);
  });
  res.end();
});
Please always check the documentation as well:
http://nodejs.org/docs/v0.3.8/api/http.html#response.write
I missed that you were using Express; the send function is part of Express and extends the ServerResponse object of node (my bad).
But my answer still applies: Express's send function sends the data using ServerResponse.end(), so the socket gets closed and you cannot send any more data, whereas the write function uses the native method.
You may also want to call res.end() when the request is fully completed, as some items within Express may be affected.
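If you prefer to stay with Express's own helpers, a sketch along these lines (assuming the same model and route as above) collects the records first and sends them in a single response instead of streaming writes:

app.get('/test', function (req, res) {
  var mod = mongoose.model('MyModel');
  mod.find({}, function (err, records) {
    if (err) {
      res.statusCode = 500;
      return res.end('Error while fetching records');
    }
    // build the whole body first, then send it once
    var body = records.map(function (record) {
      return 'Thing retrieved:' + record.id;
    }).join('\n');
    res.send(body);
  });
});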