Would I get charged for clients' Cloud Storage JSON API calls? - google-cloud-storage

I built a package in R which basically wraps the Cloud Storage JSON API. I included a default OAuth app (that is, a client ID and client secret; see the documentation) in the package. The client ID and secret are created and hosted in my own Cloud Platform project with my billing details. The R package uses the OAuth app to ask for the end user's authentication before any API calls and stores the token for the end user. Any subsequent API calls are sent with the retrieved token.
I noticed that the stats about the end users' API calls show up in my own project because it hosts the OAuth app. In this case, am I charged for those end users' API calls?

All calls to GCS are always billed to the bucket that they reference. Calls that don't reference a particular bucket, like "list buckets in a project", are billed to the project in question.


Related

Pubsub HTTP POST?

I'm working with a service that will forward data to a URL of your choosing via HTTP POST requests.
Is there a simple way to publish to a Pubsub topic with a POST? The service I'm using (Hologram.io's Advanced Webhook Builder) can't store any files, so I can't upload a Google Cloud service account JSON key file.
Thanks,
Ryan
You have two challenges in your use case:
Format
Authentication
Format
You need to customize the webhook to comply with the Pub/Sub message format. Some webhooks are customizable enough for that, but not all of them are. If you can't customize the webhook call to match what Pub/Sub expects, you need an intermediary layer (Cloud Functions or Cloud Run, for example).
Authentication
Directly to Pub/Sub or through an intermediary layer, the situation is the same: the requester (the webhook) needs to be authenticated and authorized to access the Google Cloud service.
One bad, though possible, practice is to grant allUsers access to your resources. Here is an example with a Pub/Sub topic:
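A minimal Python sketch of what that looks like, using the google-cloud-pubsub client (the project and topic IDs are placeholders):

```python
# Sketch of the anti-pattern described above: granting allUsers publish
# rights on a topic. Project and topic IDs are placeholders.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")

policy = publisher.get_iam_policy(request={"resource": topic_path})
policy.bindings.add(role="roles/pubsub.publisher", members=["allUsers"])
publisher.set_iam_policy(request={"resource": topic_path, "policy": policy})
```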
Don't do that. Even if you increase "your" process security by defining a schema (and thus rejecting all the messages that aren't compliant with it), leaving a resource publicly accessible, without authentication, on the wild internet is criminal!
In the webhook context (I had this case previously at my company), I recommend using static authentication (a long-lived authentication header, not a short-lived one like a 1-hour Google OAuth2 token); an API key, for example. It's not perfect, because in case of an API key leak the bad actors will be able to exploit the breach for a long time (rotate your API keys as often as you can!), but it's safer than nothing!
I wrote a fairly old article on this use case (with ESPv2 and Cloud Run), but the principle, and the configuration, is almost the same with API Gateway, a Google Cloud managed service. In the article, I create a proxy for Cloud Run, Cloud Functions and App Engine, but you can do the same thing with Pub/Sub by setting the correct target URL.
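If you go the intermediary-layer route, a minimal sketch of it might look like the following Python HTTP handler (Cloud Functions style). The project, topic, header name, and environment variable are assumptions for illustration:

```python
# Sketch of an intermediary layer: check a static API key, then republish
# the webhook body to Pub/Sub. Project, topic, and key names are placeholders.
import os

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "webhook-topic")
EXPECTED_KEY = os.environ["WEBHOOK_API_KEY"]  # long-lived static key

def relay_webhook(request):
    if request.headers.get("X-Api-Key") != EXPECTED_KEY:
        return ("Unauthorized", 401)
    # Wrap the raw webhook body into a Pub/Sub message.
    future = publisher.publish(topic_path, request.get_data())
    future.result()  # block until the publish is acknowledged
    return ("OK", 204)
```

With this setup, only the relay's service account needs publish rights on the topic, so the topic itself never has to be opened to allUsers.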

Making API requests to a 3rd party that requires authentication

Here is my scenario. Imagine there is a Yoga studio that uses a professional booking and reservation system that exposes an API. Through this API an application can make a reservation for a client. The API takes the client's userid and password to make the reservation. The booking API doesn't use OAuth or any social media sign-ins.
My desire is to create an Assistant Action that would retrieve the list of classes and allow the client to make a booking.
My puzzle is what design/architecture to use to supply the userid/password pair required by the booking API.
How have others solved this puzzle?
Should I store the userid/password as "user state" associated with the action?
First, you should have a conversation with the API provider about why they don't provide an OAuth-based solution. This is a security vulnerability waiting to happen, if it hasn't already.
Second, you need to think very carefully about your own risk profile in this case:
Google does not allow you to collect credential information (ie - passwords) through your Action.
Because of this, you must use Account Linking to authenticate them.
This means that you will need something (ie - a database or data store) to manage their account on your side.
This database would be a good place to keep the username/password you need to use for them for the API...
...but it now means that you need to take extreme care about protecting this database.
You don't really say how this API allows for accounts to be created and managed. If these accounts are just used for you (ie - the user doesn't necessarily see them), then you can mitigate some of that risk by treating the username/password as an opaque token that you manage and generate and that the user never sees.
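As an illustration of that mitigation, here is a hedged Python sketch that generates a credential the user never sees and keeps it encrypted at rest. It assumes the cryptography package, and the key handling is deliberately simplified; in practice the encryption key would live in a KMS or secret manager:

```python
# Sketch: treat the booking-API username/password as an opaque token that
# you generate and encrypt; the user never sees it. Key management is
# simplified for illustration.
import secrets

from cryptography.fernet import Fernet

# In practice, load this key from a KMS or secret manager.
fernet = Fernet(Fernet.generate_key())

def create_linked_credential():
    # Random password used only against the booking API.
    password = secrets.token_urlsafe(32)
    return fernet.encrypt(password.encode())  # store this ciphertext

def booking_api_password(ciphertext):
    return fernet.decrypt(ciphertext).decode()
```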
If this is something that the user is aware of, then you'll need to approach the account linking in one of two ways:
Have them log into your service via an app or webapp using this credential info that you will need to save (ack!) and then link to the Assistant using OAuth.
Have them log into your service via an app or webapp using Google Sign-In, which will carry over to your Action. Then have them provide the credential info for the API, which you will need to save (ack!).

Handling User Preferences/States in REST API

We're starting to migrate our Website to a REST Service based system and are in the process of developing the core right now.
In our current setup a user has one or more "accounts" assigned which define what data he can see on the website. Only one account can be active for a given user at any time. Right now we store the selected account in the database and use it to filter all queries.
Now I'm not sure how to handle this properly in a REST environment. Possible solutions I found are:
Sending the requested account with every request
Storing the current account in the auth token. (We're using JWT for that)
Having the current account stored on the server and calling a specific resource to change it
Each of these has its pros and cons for our setup. Currently we're using the third approach on our website. But what would be the correct way to handle such a thing in a REST environment?
Yeah, the design you are dealing with is fairly bad, and what you really want to do is remove this state from the system completely.
For that reason the first option is by far superior:
Sending the requested account with every request
If this is simply an ID, there's a very simple way to do this: just prefix all your (relevant) routes/URIs with the account ID. For example:
http://api.example.org/accounts/{id}/...
This way the 'state' is maintained by virtue of which url you are accessing, and the server can be unaware of the state.
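For illustration, a minimal Flask sketch of such account-scoped routes (the fetch_orders_for helper is hypothetical):

```python
# Sketch: the account ID lives in the URI, so the server holds no
# per-user "current account" state.
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_orders_for(account_id):
    # Hypothetical data-access helper; in a real app this would run the
    # usual queries, filtered by the account taken from the URI.
    return []

@app.route("/accounts/<account_id>/orders")
def list_orders(account_id):
    # An auth layer only needs to verify that the caller's token is
    # allowed to access this account_id.
    return jsonify(fetch_orders_for(account_id))
```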

Creating an API Layer on top of Firebase Real-Time Database

I have some data stored in my Firebase Realtime Database, and I would like to expose some of it via a REST API to my B2B customers.
I know that Firebase itself exposes a REST API, but its authentication mechanisms don't fit my needs. I want my customers to access the API with a simple API key passed in the HTTP request headers.
To summarize, I need an API layer sitting on top of my Firebase real-time database with the following properties:
Basic Authentication via an API key passed in the HTTP request headers
Some custom logic that makes sure customers respect the API limits (maximum requests per day for example)
The only thing I can think of is implementing this layer in AWS Lambda, but that also sounds a bit off. From the Lambda, I would have to access my Firebase database and serve that data. That seems like too many network hops; something native to Firebase would be great.
Thanks,
Guven.
Why not have a simple API which provides them an OAuth token for the original Firebase REST API if they present the correct API key?
It'll be more secure, as only you can mint the tokens, since only you hold the service account private key. It also saves you the headache of building a whole REST API. And OAuth tokens expire relatively quickly, so a leaked token is less of a risk than a long-lived key that you hand out.
I personally have created my own servlets where a user posts their data if they are authenticated with an ID/password combo.
In the servlets I use the default REST API provided by Firebase with the OAuth token generated in my servlet. This way, I can set the DB security rules to false for all writes from any client API; the REST API calls made with the OAuth token, and the Admin SDK on my server, ignore the security rules by default.
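A minimal sketch of such a token-minting endpoint, assuming the google-auth and Flask packages, a service-account JSON file, and an illustrative in-memory key store:

```python
# Sketch: exchange a customer's API key for a short-lived OAuth token that
# can call the Firebase Realtime Database REST API. The key store, file
# path, and header name are placeholders.
from flask import Flask, abort, jsonify, request
from google.auth.transport.requests import Request
from google.oauth2 import service_account

SCOPES = [
    "https://www.googleapis.com/auth/userinfo.email",
    "https://www.googleapis.com/auth/firebase.database",
]
VALID_API_KEYS = {"customer-key-123"}  # hypothetical key store

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
app = Flask(__name__)

@app.route("/token")
def issue_token():
    if request.headers.get("X-Api-Key") not in VALID_API_KEYS:
        abort(401)
    credentials.refresh(Request())  # mint/refresh a short-lived token
    return jsonify({"access_token": credentials.token})
```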
After some research, I have decided that AWS is the best platform for such API-related features.
API Gateway lets you set up your API interface in a matter of seconds
DynamoDB stores your API data; you can easily populate the data here
AWS Lambda lets you write the integration code between API Gateway and DynamoDB
On top of these, the platform offers these features out of the box:
Creation, handling, and verification of API keys for authentication
Usage plans to make sure that API consumers don't exceed your API usage limits
Most of what I was looking for is offered in these AWS services.
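For illustration, the Lambda integration between API Gateway and DynamoDB can be as small as the following sketch (the table name and key schema are placeholders):

```python
# Sketch: API Gateway invokes this handler, which reads one item from
# DynamoDB and returns it as JSON. Table name and key are placeholders.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("ApiData")  # hypothetical table

def lambda_handler(event, context):
    # API Gateway passes path parameters through the event object.
    item_id = event["pathParameters"]["id"]
    response = table.get_item(Key={"id": item_id})
    if "Item" not in response:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(response["Item"])}
```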

Limits on Dropbox API when calling from a single account

I'm currently trying to make a website that allows users to host files, so I intend to buy a business Dropbox account for this purpose, generate an access token so the app doesn't have to go through the OAuth authentication flow, and internally serve and upload files to this single account.
Could it be done using a single Dropbox account? What are the limits on calling from a single account? All the access token logic would be hardcoded.
The Dropbox API does have a rate limiting system, but we don't have any specific numbers documented. It is only designed to prevent abuse though, and is accordingly very generous. Further, the limits operate on a per-user basis. That being the case, you generally don't need to worry about hitting it in normal use. The Dropbox API rate limiting system operates the same regardless of account type.
Also note that not all 429s or 503s indicate rate limiting, but in any case where you get a 429 or 503, the best practice is to retry the request, respecting the Retry-After header if given in the response, or using an exponential back-off if not.
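A minimal sketch of that retry pattern in Python (the URL and retry budget are placeholders):

```python
# Sketch: retry 429/503 responses, honoring Retry-After when present and
# falling back to exponential back-off otherwise.
import time

import requests

def post_with_retry(url, max_retries=5, **kwargs):
    for attempt in range(max_retries):
        response = requests.post(url, **kwargs)
        if response.status_code not in (429, 503):
            return response
        retry_after = response.headers.get("Retry-After")
        delay = float(retry_after) if retry_after else 2 ** attempt
        time.sleep(delay)
    return response
```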
The API was designed with the intention that each user would link their own Dropbox account, in order to interact with their own files. However, it is technically possible to connect to just one account. The SDKs don't offer explicit support for it and we don't recommend doing so, for various technical and security reasons. Most of these concerns are allayed for server-side apps though.
So, if you did want to go this route, instead of kicking off the authorization flow, you would manually use an existing access token for your account and app, as you mentioned. (Just be careful not to revoke it, e.g. via https://www.dropbox.com/account/security .)