Hi, what is the maximum rate at which FQL requests can be made? E.g. 1,000 FQL requests per day?
I'm building an app where I will be making a lot of FQL requests and I need to know what the limit is.
Limit on number of Graph API calls
I've found 600 calls per 600 seconds, per token & per IP to be about
where they stop you.
https://developers.facebook.com/policy/
If you exceed, or plan to exceed, any of the following thresholds
please contact us as you may be subject to additional terms: (>5M MAU)
or (>100M API calls per day) or (>50M impressions per day).
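Since the practical ceiling quoted above is roughly 600 calls per 600 seconds per token and per IP, a simple client-side throttle can keep a busy app under it. A minimal sliding-window sketch in Python; the endpoint, post id and token are placeholders, not anything confirmed in this thread:

import time
from collections import deque

import requests  # third-party: pip install requests

MAX_CALLS, WINDOW = 600, 600.0  # calls allowed per rolling window (seconds)
call_times = deque()            # timestamps of recent calls

def throttled_get(url, **kwargs):
    while True:
        now = time.monotonic()
        # Discard timestamps that have fallen out of the 600-second window.
        while call_times and now - call_times[0] > WINDOW:
            call_times.popleft()
        if len(call_times) < MAX_CALLS:
            break
        # Wait until the oldest call in the window ages out.
        time.sleep(WINDOW - (now - call_times[0]) + 0.01)
    call_times.append(time.monotonic())
    return requests.get(url, **kwargs)

# Example (placeholder post id and token):
# resp = throttled_get("https://graph.facebook.com/POST_ID",
#                      params={"access_token": "TOKEN"})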
I am looking at http://stocktwits.com/developers/docs/parameters and am wondering if anyone has used pagination before.
The doc says there is a limit of 800 messages; how does that interact with the request limit? Could I, in theory, query 200 different stock tickers every hour and get back (up to) 800 messages?
If so that sounds like a great way to get around the 30 message limit.
The documentation is unclear on this and we are rolling out new documentation that explains this more clearly.
Every stream request has a default and maximum limit of 30 messages per response, regardless of whether the cursor params are present or not. So you could query 200 different stock streams every hour and get up to 6,000 messages, or 12,000 if you send your access token along with the request: 200 requests per hour for unauthenticated requests and 400 for authenticated requests.
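As a rough illustration of that budget, here is a hedged Python sketch of polling one symbol stream per request; the URL and the "max" cursor parameter name follow the parameters doc linked above and should be treated as assumptions:

import requests  # pip install requests

def fetch_stream(symbol, access_token=None, max_cursor=None):
    # One request returns at most 30 messages, authenticated or not.
    params = {}
    if access_token:
        params["access_token"] = access_token  # 400 requests/hour instead of 200
    if max_cursor:
        params["max"] = max_cursor  # cursor: page backwards through older messages
    url = "https://api.stocktwits.com/api/2/streams/symbol/%s.json" % symbol
    resp = requests.get(url, params=params)
    resp.raise_for_status()
    data = resp.json()
    return data["messages"], data["cursor"]

# Querying 200 tickers once per hour stays within the unauthenticated
# 200-requests/hour budget and yields up to 200 * 30 = 6,000 messages.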
In the Reporting API V4 you can do a batchGet and send up to 5 requests at once.
How does this relate to the quota? Does it count as one request even if I put multiple ones in the batchGet?
Limits and quotas
It depends on which limits and quotas you are talking about. Note that you can always check the API-specific quotas in the Developer Console.
Quota group for the Analytics Reporting API V4:
Each batchGet request counts as one request against these quotas:
Requests per day per project: 50,000
Requests per 100 seconds per project: 2,000
Requests per 100 seconds per user per project: 100
Meaning you can put up to 5 requests into each batchGet for a total of 250,000 requests per day.
General reporting quotas
There are also general reporting quotas, against which each individual request within a batchGet counts individually:
10,000 requests per view (profile) per day.
10 concurrent requests per view (profile).
This means that if you put 5 requests in a single batchGet and make 2 batchGet calls at the same time, you will be at the 10-concurrent-requests-per-view limit; and if you keep putting 5 requests in each batchGet throughout the day, you will only be able to make 2,000 batchGet calls against a single view.
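If you drive those batchGet calls from multiple threads, the figures above mean at most 2 in-flight calls when each one carries 5 ReportRequests. A minimal sketch, assuming a hypothetical run_batch_get helper that wraps the batchGet call sketched further below:

from concurrent.futures import ThreadPoolExecutor

def run_all(batch_bodies, run_batch_get):
    # 2 workers * 5 ReportRequests each = 10 concurrent requests per view,
    # which is exactly the concurrency quota described above.
    with ThreadPoolExecutor(max_workers=2) as pool:
        return list(pool.map(run_batch_get, batch_bodies))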
Analytics Reporting API V4 batchGet considerations
A note about the ReportRequest objects within a batchGet method.
Every ReportRequest within a batchGet method must contain the same:
viewId
dateRanges
samplingLevel
segments
cohortGroup
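A minimal sketch of such a batchGet using the google-api-python-client library; the view id and key file are placeholders, and every ReportRequest shares the same viewId and dateRanges as required:

from google.oauth2 import service_account      # pip install google-auth
from googleapiclient.discovery import build    # pip install google-api-python-client

creds = service_account.Credentials.from_service_account_file(
    "key.json",  # placeholder service-account key
    scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analyticsreporting", "v4", credentials=creds)

shared = {
    "viewId": "VIEW_ID",  # placeholder
    "dateRanges": [{"startDate": "7daysAgo", "endDate": "today"}],
}
body = {"reportRequests": [
    {**shared, "metrics": [{"expression": "ga:sessions"}]},
    {**shared, "metrics": [{"expression": "ga:users"}],
               "dimensions": [{"name": "ga:country"}]},
    # ...up to 5 ReportRequests. The whole call counts as one request against
    # the 50,000/day project quota, but each ReportRequest counts individually
    # against the 10,000/day per-view quota.
]}
response = analytics.reports().batchGet(body=body).execute()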
I have a registration REST API which I want to test as follows:
Register 15,000 users and pound the server with repeated incident reports (varying traffic, with the max being 100 per minute, the min being 1 per 24 hrs and the average being one per minute) over a period of 48 hours.
Which tool can I use to test stability of my REST API?
For pounding the server with incidents over a period of time, you can use http://runscope.com/. It is helpful for testing APIs. You can trigger events in Runscope over a period of time or schedule it to hit the server as required.
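If you would rather drive it yourself, here is a bare-bones sketch of the traffic shape described in the question; the URL and payload are placeholders, not part of the original setup:

import random
import time

import requests  # pip install requests

API = "https://example.com/api/incidents"  # placeholder endpoint

def pound(duration_hours=48):
    end = time.time() + duration_hours * 3600
    while time.time() < end:
        requests.post(API, json={"report": "synthetic incident"})
        # Exponential inter-arrival times with a 60 s mean average out to about
        # one request per minute; the clamp keeps bursts at or below 100/minute
        # (0.6 s gap) and lulls at or above 1 per 24 hrs (86,400 s gap).
        gap = min(24 * 3600, max(0.6, random.expovariate(1 / 60.0)))
        time.sleep(gap)

pound()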
We are using the Graph API to get the number of shares for all posts on each of our client's pages, running once per day. We use graph.facebook.com/post_id, but we often get
(#613) Calls to stream have exceeded the rate of 600 calls per 600 seconds
I tried using batch requests, but it seems each request in the batch is counted against the limit. Any suggestions?
Here are our findings so far:
FQL stream table doesn't have a field for "shares".
Post insights have no metric matching the "#shares" as shown on the page wall.
Graph API calls per post reach the limit quickly.
Make fewer calls - that's the only real answer here, assuming you've already applied other optimisations, like asking for multiple posts' details in a single call (via the ?ids=X,Y,Z syntax mentioned on the homepage of the Graph API documentation).
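For reference, a hedged sketch of that ?ids= form in Python; the post ids and token are placeholders, and the "shares" field layout is as it appeared in the Graph API at the time:

import requests  # pip install requests

post_ids = ["PAGEID_POSTID1", "PAGEID_POSTID2", "PAGEID_POSTID3"]  # placeholders
resp = requests.get(
    "https://graph.facebook.com/",
    params={"ids": ",".join(post_ids), "access_token": "TOKEN"},
)
# The response maps each id to its post object; one call covers many posts.
for post_id, post in resp.json().items():
    print(post_id, post.get("shares", {}).get("count", 0))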
Why does it need to be done 'once per day'? Why not spread the calls out over a few hours?
It doesn't matter if you request by batch; each item will still be counted as one hit and you will reach the same limit. It's indicated in the FB docs:
https://developers.facebook.com/docs/graph-api/advanced/rate-limiting
You can try distributing your load with a timeout or delay in your cron job or something. Executing the first batch now and the next batch an hour later is probably the safest.
Does anyone know what the limit is for batch requests when using FB's Graph API?
From their documentation:
Limit
We currently limit the number of batch requests to 20.
Source
That's not clear. Is that 20 per 600 seconds? Per day? Total for one app ever?
It is 50 now: 50 requests per batch.
It means that 20 individual requests are allowed to be batched together into a single batched request, which saves you from sending 20 individual HTTP requests over at the same time.
If you have more than twenty (20), you can build an array, break it into groups of 20 or fewer, and loop them through your PHP in one "session". We have one app that will run 600-700 OG requests, but it is sloooooooow - up to 300 seconds sometimes, depending on FB.
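The original answer loops in PHP; here is a minimal sketch of the same chunking idea in Python, with placeholder ids and token, using the batch endpoint described above:

import json

import requests  # pip install requests

def chunks(items, size):
    # Split a large list of requests into batch-sized groups.
    for i in range(0, len(items), size):
        yield items[i:i + size]

post_ids = ["POST_ID_%d" % i for i in range(700)]  # e.g. 600-700 OG requests
for group in chunks(post_ids, 50):  # 20 per batch originally, 50 now
    batch = [{"method": "GET", "relative_url": pid} for pid in group]
    resp = requests.post(
        "https://graph.facebook.com/",
        data={"access_token": "TOKEN", "batch": json.dumps(batch)},
    )
    # Note: each item in the batch still counts individually toward rate limits.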