There have been more than a few posts on this, but I still can't wrap my head around this particular problem.
If I have to call three external sites, Facebook, Twitter, and Instagram, and I launch all three with async.parallel, and the Facebook call hangs for 20 seconds waiting for a response, I'm unsure what happens. Do the calls to Twitter and Instagram start, run, and possibly complete before the Facebook call returns? Or does that entire user's request pause until the next round of the event loop, letting another user's call go through? It looks like the latter to us. Or does the entire thread grind to a halt and wait?
Per the request, here is a cut-down version of the code. service_requests is an array of functions that call Facebook, Instagram, and Twitter. There was a bunch of other stuff going on in here as well, but the meat of it is pretty simple.
async.parallel(service_requests, function (err, results) {
  if (err) { next(err); return; }
  var articles = [];
  for (var i = 0; i < results.length; i++) {
    articles = articles.concat(results[i]);
  }
  next(null, articles);
});
When you make parallel requests to services, they all happen 'at the same time'. So if Facebook took 20 seconds to return, Twitter took 15 seconds, and Instagram took 25 seconds, you will have all of your data 25 seconds after you initiate the requests.
Moreover, while Node waits for responses from these services it is free to service other requests.
The event loop does not grind to a halt and wait for the responses; it continues to do other work if other work exists.
When all 3 services respond with data, the async lib will call your final callback with the results.
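For instance, here is a minimal, self-contained sketch of that timing behaviour, with setTimeout standing in for the three external services (the delays are made up, not real measurements):
var async = require('async');

// Build a task that "responds" after delayMs, the way a slow external service would.
function fakeService(name, delayMs) {
  return function (callback) {
    setTimeout(function () {
      callback(null, name + ' responded after ' + delayMs + 'ms');
    }, delayMs);
  };
}

console.time('all three');
async.parallel([
  fakeService('facebook', 20000),
  fakeService('twitter', 15000),
  fakeService('instagram', 25000)
], function (err, results) {
  console.log(results);
  console.timeEnd('all three'); // roughly 25 seconds, not 60
});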
I'm using the actions-on-google GitHub Node.js app with Dialogflow. I'm trying to figure out how to make an async API call that takes longer than 5 seconds and return the response to the user when it is ready, considering that Actions on Google returns a malformed response error if the intent doesn't respond within 5 seconds.
Here is a simplified snippet of my code:
app.intent('first-intent', async (conv: any) => {
  conv.ask('Please wait while we make our long api call...');
  await myPrivateFunction();
})

// I have put API_RESPONSE_RECEIVED as Events in DialogFlow
app.intent('second-intent', (conv: any) => {
  console.log('This is second-intent');
  var response = conv.data.apiResponse;
  conv.ask(response);
})

function myPrivateFunction(): Promise<void> {
  utils.apiCall().then(apiResponse => {
    console.log('api response received');
    conv.data.apiResponse = apiResponse;
    conv.followup('API_RESPONSE_RECEIVED');
  });
}
In my Firebase logs I can see "Please wait while we make our long api call..." and "api response received", but not "This is second-intent". If I put conv.followup('API_RESPONSE_RECEIVED') outside the API call, right after conv.ask('Please wait while we make our long api call...'), I do see "This is second-intent". So conv.followup looks OK, and apparently the problem is how I'm handling the promise, but I don't know how to fix it.
I'm using TypeScript targeting ES5 in my development environment, so I can use async/await on my API call. However, using await causes that malformed response error, since the API call takes longer than 5 seconds.
We don't really have a good way to handle this right now, but there are a couple of approaches that sort of work, depending on your needs.
Notifications are currently available for the Assistant on smartphones, and they're coming for speakers. In some cases, it might make sense to say that you're working on the problem and will send a notification when you have the answer, and then resume the conversation from that notification.
Another approach is to use the Media Response to play a bit of "hold music". Under this scheme, you start the async call but also immediately send back the "hold music" response. When the long async call completes, it saves the result in a local cache. At the end of the segment of music, your webhook gets a notice that the music has finished; if the result is available in the local cache, you can report it then, otherwise you play more hold music and repeat the process.
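A minimal sketch of that hold-music approach with the actions-on-google v2 Node.js library might look like the following. The 'media-status' intent (wired to the actions_intent_MEDIA_STATUS event in Dialogflow), myLongApiCall, the in-memory cache, and the audio URL are all assumptions for illustration, not code from the question:
const { dialogflow, MediaObject } = require('actions-on-google');

const app = dialogflow();
const resultCache = {}; // hypothetical per-process cache, keyed by conversation id

// Stand-in for the real slow API call; returns a Promise after ~20 seconds.
const myLongApiCall = () =>
  new Promise((resolve) => setTimeout(() => resolve('Here is your result.'), 20000));

app.intent('first-intent', (conv) => {
  // Start the long call without awaiting it; stash the result when it arrives.
  myLongApiCall().then((apiResponse) => {
    resultCache[conv.id] = apiResponse;
  });
  conv.ask('Working on it, here is some music while you wait.');
  conv.ask(new MediaObject({
    name: 'Hold music',
    url: 'https://example.com/hold-music.mp3', // placeholder audio URL
  }));
});

// Intent bound to the actions_intent_MEDIA_STATUS event, fired when a track ends.
app.intent('media-status', (conv) => {
  const result = resultCache[conv.id];
  if (result) {
    conv.ask(result);
  } else {
    // Not ready yet: play another round of hold music and repeat.
    conv.ask('Still working on it.');
    conv.ask(new MediaObject({
      name: 'Hold music',
      url: 'https://example.com/hold-music.mp3',
    }));
  }
});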
I have written an app that talks to Facebook Messenger, and I kept getting the same message from Facebook over and over.
(It's an Express app)
app.post('/webhook23', (req, res) => {
(This is the value that was the same each time I got messaged)
req.body.entry[0].messaging.events[0]
(Here's what I got)
{
  "sender": {
    "id": "1075583395843601"
  },
  "recipient": {
    "id": "1408746912473828"
  },
  "timestamp": 1472824205937,
  "message": {
    "mid": "mid.1472824205931:3f118e895201854e72",
    "seq": 122,
    "text": "Yes."
  }
}
I eventually broke out of this by writing a little program that accepted whatever Facebook said and returned a 200 status. That drained the pipe, and then I was able to work.
Am I understanding this correctly, though? Will it just keep retrying forever? Do further messages wait until the one in front is successfully processed? Is this a good design?
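A minimal sketch of that kind of drain endpoint, assuming Express with JSON body parsing (the route path here is just a placeholder), looks roughly like this:
const express = require('express');
const app = express();
app.use(express.json());

app.post('/webhook23', (req, res) => {
  // Log whatever Facebook sent, then acknowledge immediately so it stops retrying.
  console.log(JSON.stringify(req.body));
  res.sendStatus(200);
});

app.listen(3000);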
Regards, Rick
Basically this is what I want to do
- Get all my likes (page likes)
- Get all my friends
- for each of my friends, get their likes
- for each of their likes, do something
Now, the way I'm doing this is with the JS SDK, because I've tried the PHP SDK and it's really slow (so slow that PHP's maximum execution time error kicks in). Is the PHP SDK always slower than the JS one? My script is something like this:
var newArray = [];
FB.api('me/likes', function (response) {
  FB.api('me/friends', function (friends) {
    $(friends).each(function () {
      FB.api(this.uid + '/likes', function (fr_likes) {
        $(fr_likes).each(function () {
          // save this friend's likes to newArray
          newArray.push(this);
        });
      });
    });
  });
});

// calling newArray outside the FB scope doesn't work at first
console.log(newArray); // returns [] / empty
But if I use the Chrome console to inspect newArray after a while, it is slowly populated with FB data.
So my questions are:
Can I wait for all the FB.api calls to complete before doing something outside the FB scope?
What is the best practice for doing something like this (recursive FB API calls)?
Thanks for the answer.
You should read up on the Batch API. It allows you to make multiple requests in one single round-trip to Facebook's servers. Should speed things up considerably.
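A minimal sketch of a batched request with the JS SDK might look like this, assuming the SDK encodes the batch parameter for you as in Facebook's Batch API examples (the relative URLs are just the ones from the question):
FB.api('/', 'POST', {
  batch: [
    { method: 'GET', relative_url: 'me/likes' },
    { method: 'GET', relative_url: 'me/friends' }
  ]
}, function (responses) {
  // Each entry holds the status code and a JSON-encoded body for one batched request.
  responses.forEach(function (response) {
    console.log(response.code, JSON.parse(response.body));
  });
});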
What is the best practice for creating a notification service like the one on the Facebook website? I've seen that it is not good to make an HTTP request periodically to check whether there are updates on the server.
This is called long polling (a type of AJAX technique).
I'll try to describe a situation where you use PHP and JS, as Facebook does.
You send an AJAX request to the server.
An infinite loop starts on the server side:
<?php
$seconds = 0;
$TIME_TO_WAIT_BEFORE_NEXT_CHECK = 1; // seconds between checks; base this on your performance

while (true) {
    if ($seconds >= 55) {
        die("no_notifications");
    }
    if (false !== ($notifications_json = getNotifications())) {
        echo $notifications_json;
        die();
    }
    sleep($TIME_TO_WAIT_BEFORE_NEXT_CHECK);
    $seconds += $TIME_TO_WAIT_BEFORE_NEXT_CHECK;
}
?>
When there's a new notification, the script die()s and the response is handled by JavaScript.
A new request is then sent to the server, again waiting for new notifications.
With JavaScript (I'll show an example with jQuery), you can use something like:
$(function () {
  pollForNotifications();
});

function pollForNotifications() {
  $.get("/notifications", function (response) {
    if (response != "no_notifications") {
      alert("You've got one new notification!");
      // more processing here
    }
    pollForNotifications();
  });
}
Remember that specific browsers have time limits for completing a request! You SHOULD die() after some amount of time (here, 55 seconds) even if you don't have any notifications, to prevent trouble (this safeguard is included in the script above)!
You could use the WebSocket API for a real push service (not AJAX polling), but it is part of HTML5 and not supported by all browsers and web servers at the moment.
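A quick sketch of the browser side of such a push channel (the endpoint URL and message format are placeholders):
var socket = new WebSocket('wss://example.com/notifications'); // placeholder endpoint
socket.onmessage = function (event) {
  var notification = JSON.parse(event.data); // assumes the server pushes JSON
  alert('New notification: ' + notification.text);
};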
You might want to check out Pusher. It'll handle pushing notifications and supports many browsers.
I want to use node.js to boost my Facebook application's performance. Imagine an application that tries to compute who your best friend on Facebook is by fetching a lot of data from API calls: determining how many times your friends have posted to your wall, how many photos you are tagged in together, and so on.
So instead of running those calls one after another, as I do using PHP, the idea is to send them all to Facebook together, using the non-blocking, asynchronous nature of Node.js.
So the overall execution time should be the time of the most time-consuming API call, not the sum of the execution times of every call, right?
I'm trying to use node-facebook-sdk (https://github.com/tenorviol/node-facebook-sdk) to make the Facebook API calls, and it seems to be blocking, isn't it?
A quick and dirty modification of the example code, requesting 3 user profiles, suggests that the calls are not asynchronous: each is sent to Facebook only after the previous one has completed. Is there any way to avoid that?
Thanks in advance!
var fbsdk = require('facebook-sdk');

var facebook = new fbsdk.Facebook({
  appId  : '_APPID_',
  secret : '_SECRET_'
});

// Ticker to show whether the event loop stays free while the API calls run
var i = 0;
setInterval(function() {
  i++;
  console.log("TICK " + i);
}, 500);

facebook.api('/100000997949108', function(data) {
  console.log(data);
});

facebook.api('/1609464095', function(data) {
  console.log(data);
});

facebook.api('/100000560820400', function(data) {
  console.log(data);
});
This library will help you out with all things async. I've linked directly (via the hash) to the particular method you would want to use for your problem, but the library as a whole is excellent at abstracting away some of the more tedious (and ugly!) async patterns. It's a great transitioning tool for those coming from procedural code, and you can take a peek under the covers if you want to learn some async patterns.
https://github.com/caolan/async/#parallel
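For example, a minimal sketch of wrapping the three facebook.api calls from the question in async.parallel (note: if the underlying SDK really serializes its HTTP requests, as the question suspects, async.parallel alone won't change that):
var async = require('async');

async.parallel([
  function (done) {
    facebook.api('/100000997949108', function (data) { done(null, data); });
  },
  function (done) {
    facebook.api('/1609464095', function (data) { done(null, data); });
  },
  function (done) {
    facebook.api('/100000560820400', function (data) { done(null, data); });
  }
], function (err, profiles) {
  // profiles holds the three results in the same order as the tasks above
  console.log(profiles);
});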