GitHub's REST API for events not reporting all 300 public events although they are within 90 days

According to GitHub's REST API events documentation (https://docs.github.com/en/rest/activity/events), I should be able to get the events made by a user in the past 90 days (max 300 events). But for some usernames, I am not able to get all 300 events even though they are within the 90-day window.
A minimum working example is as follows:
https://api.github.com/users/github-actions[bot]/events?per_page=100&page=1 - gives 100 events
https://api.github.com/users/github-actions[bot]/events?per_page=100&page=2 - gives 100 events
https://api.github.com/users/github-actions[bot]/events?per_page=100&page=3 - gives 85 to 95 events (rarely 100)
The time difference between the first event on page 1 and the last event on page 3 is less than 5 minutes. At this rate of activity, I should be able to get the latest 300 events, but I am not getting them.
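For reference, the three requests above can be scripted. This is a minimal sketch using only the standard library; it sends no auth token, so it is subject to the low unauthenticated rate limit, and it simply stops when a page comes back short, which is exactly the behavior being asked about:

```python
import json
import urllib.request

API = "https://api.github.com/users/{user}/events?per_page={per_page}&page={page}"

def event_page_urls(user, pages=3, per_page=100):
    """Build the page URLs for a user's public events (max 3 pages of 100)."""
    return [API.format(user=user, per_page=per_page, page=p)
            for p in range(1, pages + 1)]

def fetch_all_events(user):
    """Fetch up to 300 public events; stops early when a page comes back short."""
    events = []
    for url in event_page_urls(user):
        with urllib.request.urlopen(url) as resp:
            page = json.load(resp)
        events.extend(page)
        if len(page) < 100:  # short page: the API has nothing more to give
            break
    return events
```

Running `fetch_all_events("github-actions[bot]")` reproduces the shortfall: page 3 usually returns fewer than 100 events.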
Kindly let me know if anyone knows a reason for this and/or a workaround to get all the events.
Thank you.

Related

Is it possible to get more than 30 messages from Stocktwits?

I saw in the Stocktwits API documentation that they only allow getting the most recent 30 messages from a stream. However, I need to extract messages by searching for a certain user or certain symbols on Stocktwits within a specified period, e.g. Jan 2017 to Jan 2019. For example, I want to extract all the messages sent by user A from 2017-2019, or all the messages with the AAPL tag from 2017-2019. Is it possible?

How does pagination affect rate limit

I am looking at http://stocktwits.com/developers/docs/parameters and am wondering if anyone has used pagination before.
The doc says there is a limit of 800 messages, how does that interact with the request limit? Could I in theory query 200 different stock tickers every hour and get back (up to) 800 messages?
If so that sounds like a great way to get around the 30 message limit.
The documentation is unclear on this and we are rolling out new documentation that explains this more clearly.
Every stream request will have a default and max limit of 30 messages per response, regardless of whether the cursor params are present or not. So you could query 200 different stock streams every hour and get up to 6,000 messages, or 12,000 if sending your access token along with the request: 200 requests per hour for non-authenticated requests and 400 for authenticated requests.
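The arithmetic in that answer can be checked directly; the numbers below are taken from the limits quoted above:

```python
PER_RESPONSE = 30   # max messages per stream response
UNAUTH_LIMIT = 200  # requests per hour without an access token
AUTH_LIMIT = 400    # requests per hour with an access token

def max_messages_per_hour(authenticated: bool) -> int:
    """Upper bound on messages fetchable per hour, per the quoted rate limits."""
    limit = AUTH_LIMIT if authenticated else UNAUTH_LIMIT
    return limit * PER_RESPONSE

# 200 * 30 = 6,000 unauthenticated; 400 * 30 = 12,000 authenticated
```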

SoundCloud API Add to Group 429 Error

Is there a limit on how often a user can remove and re-add their song to a group (or on the number of connections in general), say per minute/hour/day etc.? I ask as I have created a script which automatically removes and re-adds all 5 of my songs within the same 75 groups; however, before 1 cycle completes I get the 429 error and seem to be blocked for the day.
Yes there is a limit. The HTTP 429 status code indicates:
The user has sent too many requests in a given amount of time.
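A common workaround for HTTP 429 is to back off and retry, honoring the `Retry-After` header when the server sends one. A minimal sketch follows; SoundCloud does not publish the exact limits, so the fallback delays here are guesses:

```python
import time
import urllib.error

def call_with_backoff(request_fn, max_tries=5, base_delay=1.0, sleep=time.sleep):
    """Retry request_fn on HTTP 429, backing off exponentially.

    Honors a Retry-After header when the server provides one; the
    exponential fallback (1s, 2s, 4s, ...) is an assumption, not a
    documented SoundCloud limit.
    """
    for attempt in range(max_tries):
        try:
            return request_fn()
        except urllib.error.HTTPError as err:
            if err.code != 429 or attempt == max_tries - 1:
                raise  # not a rate-limit error, or out of retries
            retry_after = err.headers.get("Retry-After") if err.headers else None
            delay = float(retry_after) if retry_after else base_delay * (2 ** attempt)
            sleep(delay)
```

For a daily block like the one described, though, no amount of retrying within the day will help; the script needs to stay under the limit in the first place, e.g. by spacing out the remove/re-add cycles.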

GraphAPIError: (#613) Calls to mailbox_fql have exceeded the rate of 300 calls per 600 seconds

I'm writing an application that access Facebook's inbox every 30 seconds.
The first few calls work - but after that I keep getting the "GraphAPIError: (#613) Calls to mailbox_fql have exceeded the rate of 300 calls per 600 seconds." error.
There's no way I'm accessing the inbox 300 times in under 10 minutes.
Why is this happening?
Are you accessing multiple accounts? That rate limit is per application, not per user.

What's the batch request limit for Facebook's Graph API?

Does anyone know what the limit for batch requests is when using FB's Graph API?
From their documentation:
Limit
We currently limit the number of batch requests to 20.
Source
That's not clear. Is that 20 per 600 seconds? Per day? Total for one app ever?
It is 50 now: 50 requests per batch.
It means that 20 individual requests are allowed to be batched together into a single batched request, which saves you from sending 20 separate HTTP requests at the same time.
If you have more than 20, you can build an array, break it into groups of 20 or fewer, and loop them through your PHP in one "session". We have one app that will run 600-700 OG requests, but it is slow: up to 300 seconds sometimes, depending on FB.
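The grouping described above can be sketched as follows (shown in Python; the PHP version would follow the same shape). The batch size is a parameter, since the limit was 20 when the question was asked and is 50 now:

```python
def chunk_requests(requests, batch_size=50):
    """Split a list of requests into batches of at most batch_size,
    so each batch fits within the Graph API's per-batch limit."""
    return [requests[i:i + batch_size]
            for i in range(0, len(requests), batch_size)]

# Each batch in chunk_requests(all_requests) would then be sent as
# one batched Graph API call, in a loop.
```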