Is there a way to notify the client that they are reaching their limit on their usage plan?
I looked at CloudWatch; unfortunately, it doesn't provide usage down to the API-key level.
One approach is to write a scheduled Lambda function that notifies consumers by retrieving their usage against the quota via the AWS SDK for API Gateway, invoking the getUsagePlan and getUsage methods.
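A hedged sketch of that scheduled Lambda, using boto3's get_usage_plan and get_usage calls: the usage plan ID, SNS topic ARN, and the 80% threshold are all placeholders I chose for illustration, not values from the original answer. The threshold helper is pure Python; the AWS calls only run inside the handler.

```python
import datetime

NOTIFY_THRESHOLD = 0.8  # warn consumers at 80% of their quota (arbitrary choice)

def usage_fraction(used, limit):
    """Fraction of the quota consumed; 0.0 when the plan has no limit."""
    return used / limit if limit else 0.0

def lambda_handler(event, context):
    # boto3 is available in the Lambda runtime; imported here so the
    # pure helper above can be tested without AWS credentials.
    import boto3
    apigw = boto3.client("apigateway")
    sns = boto3.client("sns")

    plan_id = "abc123"                          # placeholder usage plan ID
    topic_arn = "arn:aws:sns:placeholder"       # placeholder notification topic

    plan = apigw.get_usage_plan(usagePlanId=plan_id)
    limit = plan["quota"]["limit"]

    today = datetime.date.today().isoformat()
    usage = apigw.get_usage(usagePlanId=plan_id, startDate=today, endDate=today)

    # usage["items"] maps each API key ID to [[used, remaining], ...] per day
    for key_id, days in usage.get("items", {}).items():
        used = sum(day[0] for day in days)
        if usage_fraction(used, limit) >= NOTIFY_THRESHOLD:
            sns.publish(TopicArn=topic_arn,
                        Message=f"API key {key_id} used {used}/{limit} requests")
```

Scheduling it with an EventBridge (CloudWatch Events) rule, e.g. once per hour, keeps consumers informed well before they hit the hard limit.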
Based on this link, there are four policies for denying requests in the WSO2 API Manager.
Block calls to specific APIs
Block all calls from a given application
Block requests coming from a specific IP address
Block a specific user from accessing APIs
Based on this link, the following keys can be used to create custom rate-limiting policies (with the Siddhi query language):
resourceKey
userId
apiContext
apiVersion
appTenant
apiTenant
appId
clientIp
I need to deny requests within specific time limits. Maybe particular hours or some specific days. Is there a way to do that?
You can look into Siddhi query functions to develop a custom rate-limiting policy in the WSO2 API Manager that performs the limiting within a specified timeframe.
This requirement can also be met by developing a global Synapse Handler and engaging it with the API Manager server. Synapse handlers are implemented in Java and are invoked on every API call.
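Whichever mechanism you choose (a Siddhi query or a Synapse handler), the core of a time-based policy is just a comparison of the request timestamp against the blocked window. A minimal sketch of that decision logic in Python, with made-up window values:

```python
import datetime

BLOCKED_HOURS = range(0, 6)   # example: deny between 00:00 and 05:59
BLOCKED_WEEKDAYS = {5, 6}     # example: deny on Saturday (5) and Sunday (6)

def should_deny(ts: datetime.datetime) -> bool:
    """Return True when the request falls inside a blocked time window."""
    return ts.hour in BLOCKED_HOURS or ts.weekday() in BLOCKED_WEEKDAYS
```

In a real Synapse handler the same comparison would live in the Java handleRequestInFlow method; in a Siddhi policy it would be expressed as a filter condition on the event timestamp.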
Siddhi Cron
Synapse Handlers in WSO2 API Manager
I'm working with a service that will forward data to a URL of your choosing via HTTP POST requests.
Is there a simple way to publish to a Pubsub topic with a POST? The service I'm using (Hologram.io's Advanced Webhook Builder) can't store any files, so I can't upload a Google Cloud service account JSON key file.
Thanks,
Ryan
You have two challenges in your use case:
Format
Authentication
Format
You need to customize the webhook to comply with the Pub/Sub message format. Some webhooks are customizable enough for that, but not all. If you can't customize the webhook call to match what Pub/Sub expects, you need an intermediary layer (Cloud Functions or Cloud Run, for example).
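For reference, the Pub/Sub REST publish endpoint expects the JSON wrapped in a messages array with base64-encoded data. A sketch of the reshaping an intermediary Cloud Function would do; the project and topic names are placeholders, and note that the client library encodes the bytes itself, while the helper shows the raw REST shape:

```python
import base64
import json

def to_pubsub_publish_body(payload: dict) -> dict:
    """Wrap an arbitrary webhook payload into the Pub/Sub REST publish format."""
    data = base64.b64encode(json.dumps(payload).encode("utf-8")).decode("ascii")
    return {"messages": [{"data": data}]}

def webhook_entrypoint(request):
    # Sketch of an HTTP Cloud Function republishing the webhook body to Pub/Sub.
    # Assumes google-cloud-pubsub is listed in requirements.txt.
    from google.cloud import pubsub_v1
    publisher = pubsub_v1.PublisherClient()
    topic = publisher.topic_path("my-project", "my-topic")  # placeholders
    publisher.publish(topic, json.dumps(request.get_json()).encode("utf-8"))
    return "ok", 200
```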
Authentication
Whether you go directly to Pub/Sub or through an intermediary layer, the situation is the same: the requester (the webhook) needs to be authenticated and authorized to access the Google Cloud service.
One bad, but possible, practice is to grant allUsers access to your resource. Here is an example with a Pub/Sub topic:
Don't do that. Even if you improve "your" process security by defining a schema (and thus rejecting all messages that don't comply with it), leaving a resource publicly accessible on the open internet, without authentication, is criminal!
In the webhook context (I had this case previously at my company), I recommend static authentication (a long-lived authentication header, not a short-lived (1h) Google OAuth2 token); an API key, for example. It's not perfect: if the API key leaks, bad actors can exploit the breach for a long time (rotate your API keys as soon as you can!), but it's safer than nothing.
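A sketch of what that static check could look like inside an intermediary Cloud Function or Cloud Run service. The header name and key value are my assumptions, not from the answer above; the comparison uses hmac.compare_digest to avoid timing leaks:

```python
import hmac

# Long-lived shared secret; in practice load it from Secret Manager, not source.
EXPECTED_API_KEY = "replace-with-a-long-random-value"

def is_authorized(headers: dict) -> bool:
    """Constant-time comparison of the caller's API key header."""
    supplied = headers.get("x-api-key", "")
    return hmac.compare_digest(supplied, EXPECTED_API_KEY)
```

The webhook side then only needs to be configured to send that one static header with every POST, which even simple webhook builders usually support.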
I wrote a fairly old article on this use case (with ESPv2 and Cloud Run), but the principle, and the configuration, are almost the same on API Gateway, a Google Cloud managed service. In the article, I create a proxy for Cloud Run, Cloud Functions, and App Engine, but you can do the same thing with Pub/Sub by setting the correct target URL.
I am using google cloud pubsub and I want to know about how to get number of outstanding, delivered and undelivered messages in pubsub. Is there any available api provided by google pubsub for this?
You'll want to look at Stackdriver Monitoring. In particular, there are metrics for Google Cloud Pub/Sub, including subscription/num_undelivered_messages and subscription/num_outstanding_messages. You can also access graphs of these properties in Stackdriver.
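Programmatically, those same metrics can be read through the Cloud Monitoring API. A hedged sketch with the google-cloud-monitoring client (project and subscription IDs are placeholders, and the exact client call is my assumption); the filter string is the part worth getting right:

```python
import time

def undelivered_filter(subscription_id: str) -> str:
    """Monitoring filter selecting num_undelivered_messages for one subscription."""
    return (
        'metric.type="pubsub.googleapis.com/subscription/num_undelivered_messages"'
        f' AND resource.labels.subscription_id="{subscription_id}"'
    )

def fetch_undelivered(project_id: str, subscription_id: str):
    # Sketch only; assumes google-cloud-monitoring is installed.
    from google.cloud import monitoring_v3
    client = monitoring_v3.MetricServiceClient()
    now = int(time.time())
    interval = monitoring_v3.TimeInterval(
        {"end_time": {"seconds": now}, "start_time": {"seconds": now - 300}}
    )
    return list(client.list_time_series(
        name=f"projects/{project_id}",
        filter=undelivered_filter(subscription_id),
        interval=interval,
        view=monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    ))
```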
I am developing a messaging app using Google App Engine (GAE) and Google Cloud Messaging (GCM). GCM has no usage limits, but calling its API from my backend on GAE uses the URLFetch API, which has a daily limit of 172,800,000 calls. I like thinking big: if I had 200 million users, not all of them would even be able to send a single message.
I have tried using Apache HttpClient instead of the GCM-provided sender class, but it uses sockets internally, consuming the Sockets API quota instead, which has a daily limit of 663,552,000. My imaginary 200 million users could send just 3 messages each...
So, is there any way to call the GCM API from GAE that doesn't consume any hard-capped quota?
Thanks in advance.
Does anyone know if there is a way to increase the quota limit of 10 queries when batching calls to the core reporting API?
This question/answer mentions the limit of 10: How can I combine/speed up multiple API calls to improve performance?
If I try to add more than 10 queries to the batch only the first ten are processed, each one after that contains a 403 quota exceeded error.
Is there a pay option? Would love to speed up the process of reporting on GA data for a bunch of URLs. I looked in my Google Developer's Console under the Analytics API where there is an option to increase the per-user limit and a link to request additional quota but I don't need total quota to increase, only allowed batch requests.
Thanks!
Quota is the number of requests you are allowed to make to a Google API without requesting permission for more. Most of the Google APIs have a free quota, a number of requests Google lets you make without asking for permission to make more. There are project-based quotas and user-based quotas.
Unless stated otherwise, API quotas are project-based, not user-based.
User quota example
Per-user limit 10 requests/second/user
Some quotas are user-based, where a user is normally the person who authenticated the request. Every request sent to Google contains information about who is making it, in the form of the IP address the request came from. If your code runs on a server, the IP address is the same every time, so Google sees it all as the same user. You can get around this by adding a quotaUser value to your requests, which identifies them as coming from different users.
If you send too many requests too fast from the same user, you will see the following error:
userRateLimitExceeded The request failed because a per-user rate limit
has been reached.
The best way to get around this is to use quotaUser in all of your requests, identifying different users to Google. Just sending a random value each time should also work.
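A sketch of deriving a stable quotaUser value per end user; the hashing choice is mine, not from the answer, but a SHA-1 hex digest is opaque and exactly 40 characters, the documented maximum length for quotaUser:

```python
import hashlib

def quota_user_for(user_id: str) -> str:
    """Stable, opaque per-end-user identifier for the quotaUser parameter."""
    return hashlib.sha1(user_id.encode("utf-8")).hexdigest()

# With google-api-python-client, pass it on each request, e.g. (sketch):
#   service.data().ga().get(ids=..., start_date=..., end_date=..., metrics=...,
#                           quotaUser=quota_user_for(current_user_id)).execute()
```

Hashing rather than sending the raw user ID also avoids leaking user identifiers into request logs.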
Answer: You can't apply for an extension of the flood-protection user rate limit, but you can get around it by using quotaUser.
More info on quotas can be found in the Google Developers Console APIs documentation.