Fastest way to query a user's news feed on Facebook - facebook

As part of an application I am building, we need to retrieve a user's Facebook news feed (preferably around 150 posts).
var client = new FacebookClient(accessToken);
var result = client.Get(connection);
The call above takes roughly 2 seconds. However, when I increase the query to 150 posts:
var client = new FacebookClient(accessToken);
var result = client.Get(connection, new { fields = "name,from,story,message,picture,comments", limit = count });
this now takes 6-8 seconds. This is not a nested query, so am I right in thinking FQL would give me no performance increase? Is this sort of response time about the best I can hope for?

Doing multiple queries is probably better. You can fire off each request asynchronously and progressively load the data: the first call loads posts in ~2 seconds, after ~4 seconds you get the second batch from Facebook, and so on until you have the desired number of posts.
This means the user gets to see data sooner, while your app processes smaller chunks of data.
Take care when coding the loop and account for failure, i.e. if a call fails, retry it or fail gracefully.
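A minimal sketch of that loop, written in Python for illustration (the question uses the C# FacebookClient); fetch_page is a hypothetical callable standing in for one Graph API page request, returning (posts, next_cursor):

```python
import time

def fetch_feed_progressively(fetch_page, target=150, retries=2):
    """Collect posts in small pages, retrying each page on failure.

    fetch_page(cursor) is a hypothetical stand-in for one Graph API
    call; it returns (posts, next_cursor) and raises IOError on failure.
    """
    posts, cursor = [], None
    while len(posts) < target:
        for attempt in range(retries + 1):
            try:
                batch, cursor = fetch_page(cursor)
                break
            except IOError:
                if attempt == retries:
                    return posts          # fail gracefully with what we have
                time.sleep(2 ** attempt)  # simple backoff before retrying
        if not batch:
            break                         # no more posts available
        posts.extend(batch)
        if cursor is None:
            break                         # no next page
    return posts[:target]
```

Each page can be rendered as it arrives, so the user sees the first ~2-second batch immediately instead of waiting 6-8 seconds for everything.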

Related

Having a static list/seq of users with 10K records, is this thread-safe?

Say I have a static reference to a list/seq of a collection of users:
val users = List(User(..), User(...))
In my controllers, depending on the querystring filters passed in I will filter the users collection.
/users/find?locationId=1&age=30
The action would look something like:
def findUsers(...) = Action {
val filteredUsers = users.filter(.....)
Ok(filteredUsers)
}
So if this endpoint is getting 10K requests per second, the fact that the users reference is a val and I am simply filtering the results in a read-only manner means this endpoint should be blazing fast, correct?
The second part to this question is: since I cannot hard-code 10K users in a collection, what would be the best way to mimic this behaviour, or am I forced to make this a var if I load the data from a db?
var users = userService.getAll()
I would need to reload these users periodically, maybe every 3-4 hours.
So if this endpoint is getting 10K requests per second, the fact that
the users reference is a val and I am simply filtering the results
in a read-only manner, this endpoint should be blazing fast, correct?
Yes, there are no thread-safety concerns here. If you use something that refreshes this list, you might get varying responses if two clients hit the same URL while the cache is being refreshed. It's possible to remediate this if it's an issue; in most cases it's not a problem.
You could use a var if you want to implement the refresh yourself. There are other ways, such as using an actor that holds this state. However, the best option, I think, is already provided by the Play framework: ScalaCache
https://www.playframework.com/documentation/2.8.x/ScalaCache
It has cache refresh and expiry.
If you want further speedups, you can cache the results of your filtering if that makes sense for you. It could be a double cache (all results plus filtered results) or just the filtered results, depending on your needs.
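The "var with periodic refresh" idea can be made safe by building the new snapshot fully first and only then swapping a single reference. A sketch in Python rather than Scala, with load standing in for the question's userService.getAll():

```python
class UserCache:
    """Read-mostly cache: readers always see a complete, immutable snapshot.

    `load` is a hypothetical stand-in for userService.getAll().
    """

    def __init__(self, load):
        self._load = load
        self._users = tuple(load())   # initial immutable snapshot

    def refresh(self):
        # Build the new snapshot fully before publishing it. The single
        # reference assignment is the only write concurrent readers race
        # with, so each reader sees either the old list or the new one,
        # never a half-built mix.
        self._users = tuple(self._load())

    def users(self):
        return self._users
```

Schedule refresh() from a timer (or, in Play, an actor or the cache API's expiry) every few hours; filtering then remains a pure read over an immutable value, exactly like the val case. In Scala the same pattern would be a @volatile var or an AtomicReference swap.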

Facebook API - reduce the amount of data you're asking for, then retry your request for 1 row

I have the following logic for my ad insights request:
If Facebook asks me to reduce the amount of data I'm requesting, I halve the date range. If the date range cannot be halved any further, I halve the limit.
It gets to the point where I send this request:
https://graph.facebook.com/v3.2/{account}/insights?level=ad&time_increment=1&limit=1&time_range={"since":"2019-03-29","until":"2019-03-29"}&breakdowns=country&after=MjMwNwZDZD
But I still get that error:
Please reduce the amount of data you're asking for, then retry your request
There is no more reducing I can do.
Note that this only happens sometimes.
One way to avoid the error, once you are already down to requesting a single item (limit=1), is to start splitting the fields and request half of them in each of two requests.
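That field-splitting fallback might look like this sketch (Python for illustration; the merge key here is a hypothetical choice, on the assumption that each insights row is identified by its date and ad):

```python
def halve_fields(fields):
    """Split the requested fields so each half can go in its own request."""
    mid = (len(fields) + 1) // 2
    return fields[:mid], fields[mid:]

def merge_rows(rows_a, rows_b, key=("date_start", "ad_id")):
    """Join the two partial result sets back together on the row key.

    The key fields are an assumption; use whatever uniquely identifies
    a row in your breakdown.
    """
    merged = {tuple(r[k] for k in key): dict(r) for r in rows_a}
    for r in rows_b:
        merged.setdefault(tuple(r[k] for k in key), {}).update(r)
    return list(merged.values())
```

Two half-sized requests each carry less data, so they are less likely to trip the "reduce the amount of data" error, at the cost of one extra call per row.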
Another way is to run an async report, which should not have such a low time limit.
Official Facebook API team response:
It looks like you are requesting a lot of fields, this is likely the
cause of this error. This will cause the request to time-out.
Could you try using asynchronous requests as described here:
https://developers.facebook.com/docs/marketing-api/insights/best-practices/#asynchronous?
Async requests have a much longer time limit, this will likely resolve
your issue.
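The async flow the Facebook team describes boils down to: start a report run, poll its status, then fetch the results. A hedged sketch (Python; start_job, get_status, and fetch_results are hypothetical wrappers around the POST /{account}/insights, GET /{report_run_id}, and GET /{report_run_id}/insights requests, and the status strings assume the Marketing API's async_status values):

```python
import time

def run_async_insights(start_job, get_status, fetch_results,
                       timeout=600, poll_interval=5):
    """Generic start-poll-fetch loop for an asynchronous insights report.

    The three callables are hypothetical HTTP wrappers; only the control
    flow is shown here.
    """
    report_run_id = start_job()
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status(report_run_id)
        if status == "Job Completed":
            return fetch_results(report_run_id)
        if status == "Job Failed":
            raise RuntimeError("async insights job failed")
        time.sleep(poll_interval)     # wait before polling again
    raise TimeoutError("report %s not ready in time" % report_run_id)
```

Because the heavy work happens server-side between the start and fetch calls, the per-request time limit that triggers the "reduce the amount of data" error no longer applies to your query size.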

Unexpected/unpredictable results with batch requests to Facebook Marketing Insights API

I have a Google Apps Script that sends batch requests to the Facebook Marketing API (Insights). I'm fetching mobile installs and ad spend per day over several campaigns. However, the results are unexpected for large amounts of data.
Each relative URL in the batch is requesting a day-wise breakdown for a single Facebook campaign as follows:
{"method":"GET",
"relative_url":"<CAMPAIGN_ID>/insights?fields=actions,spend&time_range={'since':'yyyy-mm-dd','until':'yyyy-mm-dd'}&time_increment=1"}
For a given date range, I'm creating a batch request URL for n such campaigns as follows:
var fbCampaigns = [ C1, C2, C3 ... ];
var batchRequests = [];
for(var i=0; i<fbCampaigns.length; i++) {
// URL encoded version of the relative URL above
batchRequests.push("%7B%22method%22%3A%22GET%22%2C%22relative_url%22%3A%22"
+ fbCampaigns[i]+"%2Finsights%3Ffields%3Dactions%2Ccampaign_id%2Cspend%26"
+ "time_range%3D%7B%27since%27%3A%27"+start+"%27%2C%27until%27%3A%27"
+ end+"%27%7D%26time_increment%3D1%22%7D");
}
var url = "https://graph.facebook.com/v2.11/?batch=["
+ batchRequests.join(",")
+ "]&access_token="+fbToken;
Since the URL was getting too long, I split the campaign array into chunks of five campaigns and ran the above for each chunk separately.
This works great for a single date or a short date range. However, for much larger date ranges (100+ days), it would begin by fetching correct data and then suddenly start to retrieve data for either all, only some, or none of the campaigns, quite unpredictably.
I didn't get any error codes or warnings about throttling. My question is: is there a limit somewhere that I'm missing, in either the number of allowed dates or the number of batch requests? It's rather odd, because I'm only making three batch requests for my entire data.
Found the issue! After looking for patterns, I realised that for each campaign, a maximum of 51 dates' worth of data was being fetched.
I absolutely couldn't find details about this hidden limit on the web - if anyone has any further information, please add to this.
Edit: I have since realised there's a pagination system that I had missed in the documentation.
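That pagination applies per relative_url inside a batch too: each campaign's day-wise result carries a paging.next link that must be followed to get the rows beyond the first page. A sketch of the follow-up loop (Python for illustration; fetch_url is a hypothetical helper that GETs a URL and returns the decoded JSON):

```python
def collect_all_pages(first_page, fetch_url):
    """Follow paging.next links until the result set is exhausted.

    first_page is one decoded batch-item body; fetch_url is a
    hypothetical HTTP helper returning decoded JSON.
    """
    data = list(first_page.get("data", []))
    next_url = first_page.get("paging", {}).get("next")
    while next_url:
        page = fetch_url(next_url)
        data.extend(page.get("data", []))
        next_url = page.get("paging", {}).get("next")
    return data
```

Without this loop, a long date range silently truncates at whatever the server's page size happens to be, which matches the "first ~51 dates only" symptom above.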

Facebook Request(s): what counts as 1 request?

I am currently creating an application that polls Facebook for data. First I request a page in this fashion...
pageID/posts?fields=id,message,created_time,type&limit=250
This returns the top 250 posts from a page. I then check if paging.next is set and, if it is, make another request for the next 250 posts. I continue this recursively until there are no more posts.
With each post that is returned, I also fetch the post details from the Graph API.
My question is: if I had 500 posts on a page, would that equate to 502 requests (500 requests, one for each post, plus 2 for paging through the page data to get the posts)? Or am I incorrect in my understanding of a "request"? I know that when batching calls, each query included in the batch actually counts as 1 request. The goal is to avoid the 600 calls / 600 seconds rate limiting. Thanks!
Every API call is...well, 1 request. So every time you use the /posts endpoint with whatever limit, it will be 1 request. For example, if you do that call you posted, it will be one request that returns 250 elements.
Batch requests are just faster, but each call in the batch counts as a request. So if you combine 10 calls in a batch, it will be 10 requests. The benefit of batch calls is really just that they are a lot faster: as fast as the slowest call in the batch.
If you want to get 500 posts with that example of yours, you would only need 2 calls: the first one with 250 returned elements, the second one by using the API call given in the "next" value to get another 250. Just keep in mind that the default is usually 25 elements, and you can't use any limit you want. There is a max limit for calls and it gets changed from time to time afaik, so don't count on getting the same result every time.
Btw, don't be too fixated on that 600 calls / 600 seconds limit; it's just a general limit. The real limit is dynamic and depends on many factors. It's not public, of course. But if you really hit the limit, you are doing something wrong anyway.
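The request arithmetic from the question, written out (Python; this simply counts paging calls plus one per-post detail call, which is what the rate limiter sees):

```python
import math

def calls_needed(total_posts, page_limit=250, detail_call_per_post=True):
    """Paging calls to list the posts, plus one Graph call per post.

    Models the questioner's scenario; batching the detail calls would
    change the latency but not this count.
    """
    paging_calls = math.ceil(total_posts / page_limit)
    detail_calls = total_posts if detail_call_per_post else 0
    return paging_calls + detail_calls
```

So 500 posts is indeed 502 requests with a detail call per post, but only 2 requests if the listing call already returns every field you need.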

FQL/Graph API logic

I have a logic problem I can't seem to solve (it might not even be possible).
Example:
I am in 100 Facebook groups.
I need the 10 latest posts of EACH group I am in.
That's pretty much it, but I can't seem to find a way to do this without a foreach loop calling the API over and over again; if I had a couple hundred more groups, it would be impossible.
PS: I'm using FQL at the moment but am able to use Graph; I've coded this in about 3 different ways with no success.
This is the farthest I could get:
SELECT actor_id,source_id FROM stream WHERE source_id IN (select gid from group_member where uid = me())
It only returns one page of results; maybe there's no way to return all of this without a foreach asking for each group's 10 latest messages.
There's no need to use FQL or batching. This can be done with a simple Graph API request, IMHO:
GET /me/groups?fields=id,name,feed.fields(id,message).limit(10)
This will return 10 posts for each of your groups. In case there is too much data to be returned, try setting the limit parameter on the base query as well:
GET /me/groups?fields=id,name,feed.fields(id,message).limit(10)&limit=20
Then, you'll get a next field in the result JSON. By calling the URL contained in this field, you'll get your next results. Do this until the result is empty; then you have reached the end.
You can use batch calls, described here: https://developers.facebook.com/docs/graph-api/making-multiple-requests/
Using batch requests, you can send up to 50 calls in one go. Note that batch requests don't increase the rate limits, so if you make 50 requests in a batch, they will be counted as 50 calls, not one. However, you will get the response in a shorter time.
If you think you're making too many calls, you should put some delay between calls to avoid rate limiting.
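Since a batch accepts at most 50 operations, the 100-group case needs its requests split into chunks first. A trivial sketch (Python for illustration):

```python
def chunk_for_batch(requests, max_per_batch=50):
    """Split a list of batch operations into Graph-API-sized batches.

    The Graph API allows at most 50 operations per batch request, so a
    list of 100 per-group calls becomes two batch requests.
    """
    return [requests[i:i + max_per_batch]
            for i in range(0, len(requests), max_per_batch)]
```

Each resulting chunk becomes one batch HTTP request, but remember it still counts as one rate-limited call per operation inside it.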