The documentation states that the API is limited to one email per user, and that we should create threads and process multiple users at once.
Does anyone know if there is any other type of limitation? How many GB per hour?
I have to plan a migration of tens of thousands of accounts. Hardware resources are practically unlimited; will I raise a flag somewhere or get blocked if I start migrating over 1,000 users at a time?
Thanks
The limits for the API are posted at https://developers.google.com/google-apps/email-migration/limits. There is a rate limit in place of one request per second per user. If you exceed this, you will start seeing 503 errors returned. The best way to deal with them is to implement an exponential backoff algorithm that handles the errors and retries the request.
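For example, a minimal backoff loop might look like this (a Python sketch using requests; the URL, payload, and headers are placeholders, not the real migration API call):

import random
import time

import requests

# A sketch only: retry on 503 with exponentially growing waits plus jitter.
def post_with_backoff(url, payload, headers, max_retries=5):
    for attempt in range(max_retries):
        response = requests.post(url, json=payload, headers=headers)
        if response.status_code != 503:
            return response
        # Wait 1s, 2s, 4s, ... plus random jitter, then retry.
        time.sleep((2 ** attempt) + random.random())
    raise RuntimeError("still throttled after %d retries" % max_retries)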
When using GoogleCloudStorageComposeOperator in Google's Cloud Composer we've started hitting TooManyRequests, HTTP 429.
The rate of change requests to the object path/file.csv exceeds the rate limit. Please reduce the rate of create, update, and delete requests.
What limit are we hitting? I think it's this limit but I'm not sure:
There is a write limit to the same object name of once per second, so rapid writes to the same object name won't scale.
Does anyone have a sane way around this issue? It usually works on retry, but it would be neat not to have to rely on the retry succeeding.
It's hard to say without details, but this is a Cloud Storage issue rather than a Composer issue. It is described in the Troubleshooting guide for Cloud Storage.
There you can find more references to dig into it. On the Quotas and Limits page I found:
When a project's bandwidth exceeds quota in a location, requests to affected buckets can be rejected with a retryable 429 error or can be throttled. See Bandwidth usage for information about monitoring your bandwidth.
It seems that this error is intended to be retried, so implementing a try/catch retry mechanism might be a solution, for example as sketched below.
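A minimal sketch of that retry, using the plain google-cloud-storage client rather than the Composer operator itself (bucket and object names are placeholders):

import time

from google.api_core.exceptions import TooManyRequests
from google.cloud import storage

def compose_with_retry(bucket_name, source_names, dest_name, max_retries=5):
    bucket = storage.Client().bucket(bucket_name)
    destination = bucket.blob(dest_name)
    sources = [bucket.blob(name) for name in source_names]
    for attempt in range(max_retries):
        try:
            destination.compose(sources)
            return
        except TooManyRequests:
            # 429 is retryable: back off exponentially before trying again.
            time.sleep(2 ** attempt)
    raise RuntimeError("compose still throttled after %d retries" % max_retries)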
Is there any way to get all contributors of an organisation on GitHub using the GitHub API or any external service?
I am trying to get all contributors from the angular organisation using the GitHub API.
I've found only one solution:
Get all repos from the angular organisation using this request:
GET https://api.github.com/orgs/angular/repos
For each repo, get all its contributors by this request:
GET https://api.github.com/repos/angular/:repo/contributors
Merge all derived data to one array.
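In code, this is roughly what I'm doing (a Python sketch; pagination of the repo list itself is omitted for brevity):

import requests

API = "https://api.github.com"

def org_contributors(org):
    # Step 1: all repos of the organisation (first page only here).
    repos = requests.get("%s/orgs/%s/repos" % (API, org),
                         params={"per_page": 100}).json()
    merged = {}
    for repo in repos:
        # Step 2: contributors of each repo.
        contributors = requests.get(
            "%s/repos/%s/%s/contributors" % (API, org, repo["name"]),
            params={"per_page": 100}).json()
        # Step 3: merge by login so each contributor appears only once.
        for c in contributors:
            merged[c["login"]] = c
    return list(merged.values())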
It seems to work, but I think this solution is very cumbersome. I'm sending around 300 requests this way, and they take around 20 seconds to process (the app is frozen until all the requests have finished).
Questions:
Are there any alternatives to this approach?
Is it OK for a registered GitHub app to make this many requests? Note that these 300 requests are sent each time the application starts.
Are there any alternatives to this approach?
No, not really -- I can't think of a better approach for this.
Is it OK for a registered GitHub app to make this many requests? Note that these 300 requests are sent each time the application starts.
You should be fine as long as you respect the primary and secondary GitHub API rate limits.
https://developer.github.com/v3/#rate-limiting
https://developer.github.com/guides/best-practices-for-integrators/#dealing-with-abuse-rate-limits
The primary limits allow you to make 5,000 authenticated requests per hour per user. The secondary limits will be triggered if you start making lots of concurrent requests (e.g. hundreds of requests per second for more than several seconds). So you should be fine making 300 requests; just make sure you dial down the concurrency, for example as sketched below.
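One way to dial it down is to go sequentially and watch the rate-limit headers the API returns (a sketch; the token is a placeholder):

import time

import requests

def polite_get(url, token):
    # One request at a time; pause if the hourly quota is exhausted.
    resp = requests.get(url, headers={"Authorization": "token %s" % token})
    if resp.headers.get("X-RateLimit-Remaining") == "0":
        reset = int(resp.headers["X-RateLimit-Reset"])  # epoch seconds
        time.sleep(max(reset - time.time(), 0))
    return resp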
It would be even better if the application cached some of this information so that it can make conditional requests:
https://developer.github.com/v3/#conditional-requests
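A sketch of what that caching could look like (in-memory here for simplicity; a 304 Not Modified response does not count against the rate limit):

import requests

cache = {}  # url -> (etag, parsed JSON body)

def get_conditional(url, token):
    headers = {"Authorization": "token %s" % token}
    if url in cache:
        headers["If-None-Match"] = cache[url][0]  # replay the stored ETag
    resp = requests.get(url, headers=headers)
    if resp.status_code == 304:  # unchanged: serve the cached copy for free
        return cache[url][1]
    resp.raise_for_status()
    cache[url] = (resp.headers.get("ETag"), resp.json())
    return cache[url][1]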
I am using the beta endpoint of the Office365 Outlook REST API to synchronize a large Office365 Outlook folder; see the doc here.
The response is paginated, and after many calls during the first synchronization of this big folder, I received this error:
{"error":{"code":"LocalTime","message":"This operation exceeds the throttling budget for policy part 'LocalTime', policy value '0', Budget type: 'Ews'. Suggested backoff time 299499 ms."}}
It looks like I have been calling the API too much. What is the best way to handle this? Should I implement some kind of retry policy?
Yes, this is our current throttling mechanism, which is a temporary measure while our "real" throttling implementation is being deployed. To handle this, you'll need to do a retry after about 5 minutes.
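For example, a retry policy that honors the suggested backoff embedded in the error body could look like this (a sketch; it assumes the message keeps the format shown above):

import re
import time

import requests

def get_with_throttle_retry(url, headers, max_retries=3):
    for _ in range(max_retries):
        resp = requests.get(url, headers=headers)
        if resp.ok:
            return resp.json()
        # Honor the "Suggested backoff time ... ms" hint if present,
        # otherwise fall back to the ~5 minutes mentioned above.
        match = re.search(r"Suggested backoff time (\d+) ms", resp.text)
        time.sleep(int(match.group(1)) / 1000.0 if match else 300)
    raise RuntimeError("still throttled after %d retries" % max_retries)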
I am the manager of an iOS application and it uses Google Places API. Right now I am limited to 100,000 requests and during our testing, one or two users could use up to 2000 requests per day (without autocomplete). This means that only about 50 to 200 people will be able to use the app per day before I run out of quota. I know I will need to fill out the uplift request form when the app launches to get more quota but I still feel that I will need a very large quota based on these test results. Can anyone help me with this issue?
Note: I do not want to launch the app until I know I will be able to get a larger quota.
First up, put your review request in sooner rather than later so I have time to review it and make sure it complies with our Terms of Service.
Secondly, how are your users burning 2k requests per day? Would caching results help you lower your request count?
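Even a small client-side cache keyed on the query can cut repeated lookups. A sketch (fetch_places here is a hypothetical wrapper around the actual Places API call):

import time

_cache = {}        # query -> (timestamp, result)
TTL_SECONDS = 600  # how long a cached result stays fresh

def cached_places(query):
    hit = _cache.get(query)
    if hit and time.time() - hit[0] < TTL_SECONDS:
        return hit[1]             # served from memory, no quota used
    result = fetch_places(query)  # hypothetical: the real API call
    _cache[query] = (time.time(), result)
    return result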
I'm facing the same problem!
Is it possible to use the Places library of the Google Maps JavaScript API, which applies the quota to each end user instead of to an API key, so that the quota grows as the user base grows? See here
Theoretically I think it's possible, since it just needs a WebView or JavaScript runtime to use the library, but I haven't seen anyone use this approach.
I'd like to get every status update for every friend. Given I have, say, 500 friends, each with 200 statuses, this could be 100,000 statuses. How would you approach this from the query point of view?
What query would you write? Would Facebook allow this much data to come through in a single go? If not, is there a best-practice paging or offsetting solution?
Would Facebook allow this much data to come through in a single go?
No. Facebook will throw an exception for too much data. There is also an automated system in place that will block time-consuming requests, and it will block your app if it makes too many queries too frequently on a single table; see API Throttling Warnings.
If not, is there a best-practice paging or offsetting solution?
You can do paging in FQL and when querying connections in the Graph API. It is the best practice.
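On the Graph side, paging means following the ready-made links each response carries (a sketch; friend_id and access_token are placeholders):

import requests

def all_statuses(friend_id, access_token):
    url = "https://graph.facebook.com/%s/statuses" % friend_id
    params = {"access_token": access_token, "limit": 100}
    items = []
    while url:
        page = requests.get(url, params=params).json()
        items.extend(page.get("data", []))
        # Each page includes a complete URL for the next page, if any.
        url = page.get("paging", {}).get("next")
        params = None  # the "next" URL already embeds the parameters
    return items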
From their policy:
If you exceed, or plan to exceed, any of the following thresholds please contact us as you may be subject to additional terms: (>5M MAU) or (>100M API calls per day) or (>50M impressions per day).
http://developers.facebook.com/policy/
It means that 100k is not such a big deal. However, it depends. You may have to consider:
Do you REALLY need every status?
Can't they be downloaded later?
Do you need these posts/stories from every friend?