Google Cloud Storage authentication: restrict permissions without creating additional Google Accounts - google-cloud-storage

I want to back up multiple servers using Google Cloud Storage. Each server needs access to the bucket containing its backups, and no server should have access to any other server's bucket.
I have used Amazon S3 before and simply created one user per server in IAM and assigned a policy to the user that allows accessing a specific bucket.
On Google Cloud Storage it seems like authentication is based on Google Accounts and OAuth 2.0, so every server uses the same Google Account (mine), and the result is that every server has full access to all buckets.
How can I give each server its own access credentials (with access to its own bucket only) without having to create a new Google Account for each server?

You can create a service account for each bucket and use it to authenticate access to that storage bucket from your Google Compute Engine instance. You can limit access to a bucket by changing the bucket ACLs to allow access to that service account only.
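A minimal sketch of that setup with the gcloud and gsutil CLIs, assuming one service account and one bucket per server (the project, account, and bucket names below are placeholders; bucket-level IAM is used here in place of ACLs):

    # Create a dedicated service account for one server
    gcloud iam service-accounts create backup-server-1 \
        --display-name="Backups for server 1"

    # Grant that service account read/write access on its own bucket only
    gsutil iam ch \
        serviceAccount:backup-server-1@my-project.iam.gserviceaccount.com:roles/storage.objectAdmin \
        gs://server-1-backups

    # Create a JSON key that the server uses to authenticate
    gcloud iam service-accounts keys create server-1-key.json \
        --iam-account=backup-server-1@my-project.iam.gserviceaccount.com

On the server itself, gsutil can then be pointed at that key with gcloud auth activate-service-account --key-file=server-1-key.json, and it will only be able to reach gs://server-1-backups.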

Related

Access specific folder in GCS bucket according to user, using Workload Identity Federation

I have an external identity provider that supports OpenID Connect (OIDC) and want to access Google Cloud Storage (GCS) directly, using a short-lived access token. So I'm using workload identity federation in order to present a credential from my external identity provider and get a federated token in exchange.
I have created the workload identity pool and provider and connected a service account to it, which has write access to a certain bucket in GCS.
How can I differentiate access to a specific folder in the bucket according to the token provided by my external identity provider? For example, userA should have access only to folderA in the bucket. Can I do this using one service account?
Any help would be highly appreciated.
Folders don't exist in Cloud Storage: it is blob storage, and all objects are stored at the bucket level. For human readability and representation, the / character is used as a folder separator by convention.
Therefore, because directories don't exist, you can't grant any permission on them. The finest granularity is the bucket.
In your use case, you can't grant write access at the folder level, but you can create one bucket per user and grant the impersonated service account access on that bucket, as sketched below.
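A rough sketch of that layout, assuming one bucket and one impersonated service account per user (all names are placeholders):

    # One bucket per user
    gsutil mb -l EU gs://my-app-user-a

    # Grant the service account impersonated by userA write access on that bucket only
    gsutil iam ch \
        serviceAccount:wif-user-a@my-project.iam.gserviceaccount.com:roles/storage.objectAdmin \
        gs://my-app-user-a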

How to access S3 bucket using Flutter Amplify without authentication from AWS cognito

I've created an S3 bucket and set its access level to public. I don't have AWS Cognito configured with the project. I need to use amplify_storage_s3 to get and put files to the bucket without a Cognito user pool. Is this possible?
As per the official response here, it is not supported:
currently it's not possible to use S3 plugin without Cognito as it is used to sign the S3 requests. You can enable guest access while configuring Auth through CLI such that your app's users wouldn't have to sign in to use S3 resources (by using the guest access)
But I found a possible workaround here.

Access google cloud data without GCP account

I have created a bucket and files in it with Google Cloud Storage. I have also edited the permissions of the bucket to allow access to people within a Google Group.
Now, if they need to access the data, do they need to "sign up" at the Google Cloud Platform?
Is there any way they can copy all the files in the bucket using gsutil without a GCP account?
No, there's no way to allow access to a bucket like that, but Google accounts and GCP accounts are the same, so anyone with a Gmail account could access it.
The closest thing to your use case is Signed URLs, which can grant access to individual objects.
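For example, a time-limited URL for a single object can be generated with gsutil and a service account key (the key file, bucket, and object names are placeholders):

    # Grant anyone holding this URL read access to one object for 1 hour
    gsutil signurl -d 1h service-account-key.json gs://my-bucket/backups/2023-01-01.tar.gz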

Hiding Buckets in Google Cloud Storage

We've just moved files off of a 10-year-old FTP server and are now using Google Cloud Storage for our company's files. This was set up to use the web-hosting feature of GCS, and the access-logging capability was also enabled.
The access logs are dumped into a bucket. I would like to deny access to this logging bucket, but allow our users to continue using the main GCS bucket (we use Cyberduck for this).
We are presently allowing anybody with our company's email address to read/write to the buckets in this project, by giving the "Domain" the "Storage Admin" and "Storage Object Admin" roles. This was granted through the project's IAM permissions.
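For reference, a project-wide grant like the one described above roughly corresponds to bindings such as these (the domain and project ID are placeholders), which is why it applies to every bucket in the project, including the logging bucket:

    gcloud projects add-iam-policy-binding my-project \
        --member="domain:example.com" --role="roles/storage.admin"
    gcloud projects add-iam-policy-binding my-project \
        --member="domain:example.com" --role="roles/storage.objectAdmin"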

Authorizing GCE to Access GCS

I have a Django app running on Google Compute Engine, and it needs to upload video files to my bucket in Google Cloud Storage. When searching for authentication methods, I found this doc. Under the "Setting the scope of service account access for instances" section, it says I need to enable Cloud Platform access in the settings when creating the VM. I wonder if that is a must, and whether there's any other way I can access my Cloud Storage bucket from my apps on the Compute Engine instance, because creating a new VM and setting up the environment is very time-consuming. Any input would be greatly appreciated. Thanks in advance.
As documented on the page you linked to, to authenticate from Google Compute Engine to Google Cloud Storage, you have several options:
Use VM scopes: these must be set before creating the VM, because scopes are immutable once the VM is created. If you want read-only access, add the scope devstorage.read_only (short form) or https://www.googleapis.com/auth/devstorage.read_only (full path). If you want read-write access, use the scope devstorage.read_write (short form) or https://www.googleapis.com/auth/devstorage.read_write (full path).
Note: there's also a gcloud beta compute instances set-scopes command to update the scopes of an existing GCE VM.
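For example (the instance name and zone below are placeholders):

    # Set the read-write storage scope at VM creation time
    gcloud compute instances create my-django-vm \
        --zone=us-central1-a \
        --scopes=https://www.googleapis.com/auth/devstorage.read_write

    # Or change the scopes of an existing VM (it must be stopped first)
    gcloud beta compute instances set-scopes my-django-vm \
        --zone=us-central1-a \
        --scopes=https://www.googleapis.com/auth/devstorage.read_write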
An alternative to using scopes is to use JSON authentication tokens, such as service account keys, which the Google API client libraries can use to connect to Google Cloud Storage.
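A rough sketch of that route, assuming an existing service account that already has access to the bucket (the account, key path, and bucket names are placeholders):

    # Download a JSON key for the service account
    gcloud iam service-accounts keys create /secure/path/key.json \
        --iam-account=my-uploader@my-project.iam.gserviceaccount.com

    # Google Cloud client libraries (e.g. in the Django app) pick this up automatically
    export GOOGLE_APPLICATION_CREDENTIALS=/secure/path/key.json

    # gcloud/gsutil can also authenticate with the same key
    gcloud auth activate-service-account --key-file=/secure/path/key.json
    gsutil cp video.mp4 gs://my-bucket/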