Access specific folder in GCS bucket according to user, using Workload Identity Federation - google-cloud-storage

I have an external identity provider that supports OpenID Connect (OIDC) and want to access Google Cloud Storage (GCS) directly, using a short-lived access token. So I'm using Workload Identity Federation to exchange a credential from my external identity provider for a federated token.
I have created the workload identity pool and provider and connected a service account to it, which has write access to a certain bucket in GCS.
How can I differentiate access to specific folders in the bucket according to the token provided by my external identity provider? For example, userA should have access only to folderA in the bucket. Can I do this using one service account?
Any help would be highly appreciated.

Folders don't exist in Cloud Storage: it's a blob store, and all objects are stored at the bucket level. The / is used as a folder separator by convention, purely for human readability.
Because directories don't exist, you can't grant any permission on them. The finest granularity is the bucket.
So in your use case you can't grant write access at the folder level, but you can create one bucket per user and grant the impersonated service account access on that bucket.
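As a minimal sketch of that bucket-per-user approach (the bucket and service account names below are placeholders), you can add an IAM binding on each user's bucket with the google-cloud-storage Python client:

```python
from google.cloud import storage

# Placeholder names; substitute your own bucket and service account.
BUCKET_NAME = "backups-user-a"
SERVICE_ACCOUNT = "user-a@my-project.iam.gserviceaccount.com"

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

# Fetch the bucket's current IAM policy and append a binding that
# lets the impersonated service account manage objects in this bucket.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectAdmin",
    "members": {f"serviceAccount:{SERVICE_ACCOUNT}"},
})
bucket.set_iam_policy(policy)
```

If you need per-user separation within a single pool, you'd typically pair this with one service account per user as well, so each federated identity can only impersonate the account that is bound to its own bucket.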

Related

Data Transfer between Google Storage different Service Accounts

I have two Google service credentials and a bucket on each account. I have to transfer files from one bucket to another. How can I do this programmatically?
Can I achieve this with two Storage objects, or by using the Cloud Storage Transfer Service?
Yes, with Storage Transfer Service you can create a transfer job and send the data to a destination bucket (in another project). Keep in mind that it is documented that:
To access the data source and the data sink, this service account must
have source permissions and sink permissions.
Meaning that you can't use two different service accounts; you will need to grant access to just one of the two service accounts you have.
If you want to transfer files from one bucket to another programmatically, you must first grant permission to the service account associated with the Storage Transfer Service so it can access the data sink (destination bucket); please follow these steps.
Please note that if you are not creating the transfer job in the same project where the source bucket is located, then you must also grant permissions to access it.
With Storage Transfer Service you can create a transfer job programmatically with Java and Python; the examples cover creating the transfer job and checking the status of the transfer operation. Full code examples can be found for Java and Python.
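As a rough sketch of the Python path (assuming the google-cloud-storage-transfer client library, with placeholder project and bucket names), creating and running a one-off transfer job looks roughly like this:

```python
from google.cloud import storage_transfer

# Placeholder identifiers; replace with your own project and buckets.
PROJECT_ID = "my-project"
SOURCE_BUCKET = "source-bucket"
SINK_BUCKET = "destination-bucket"

client = storage_transfer.StorageTransferServiceClient()

# Describe a job that copies every object from the source to the sink.
transfer_job = {
    "project_id": PROJECT_ID,
    "status": storage_transfer.TransferJob.Status.ENABLED,
    "transfer_spec": {
        "gcs_data_source": {"bucket_name": SOURCE_BUCKET},
        "gcs_data_sink": {"bucket_name": SINK_BUCKET},
    },
}

job = client.create_transfer_job({"transfer_job": transfer_job})

# Kick the job off immediately rather than waiting for a schedule.
client.run_transfer_job({"job_name": job.name, "project_id": PROJECT_ID})
print(f"Started transfer job: {job.name}")
```

Remember that the job runs as the Storage Transfer Service's own service account, which is why that single account needs both the source and sink permissions quoted above.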

Access google cloud data without GCP account

I have created a bucket and files in it with Google Cloud Storage. I have also edited the permissions of the bucket to allow access to persons within a Google Group.
Now, if they need to access the data, do they need to "sign up" for the Google Cloud Platform?
Is there any way they can copy all the files in the bucket using gsutil without a GCP account?
No, there's no way to allow access to a bucket like that, but Google accounts and GCP accounts are the same thing, so anyone with a Gmail account could access it.
The closest thing to your use case is Signed URLs, which can grant access to individual objects.
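For reference, a signed URL can be generated with the Python client along these lines (a sketch; the key file, bucket, and object names are placeholders):

```python
from datetime import timedelta
from google.cloud import storage

# Signing requires a credential that carries a private key,
# e.g. a service account key file (placeholder path below).
client = storage.Client.from_service_account_json("sa-key.json")
blob = client.bucket("my-bucket").blob("data/report.csv")

# Anyone holding this URL can GET the object for the next hour,
# without needing any Google account at all.
url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(hours=1),
    method="GET",
)
print(url)
```

The trade-off is that signed URLs are per-object and expire, so they're awkward for "copy the whole bucket with gsutil"; they fit best for handing out individual files.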

Google Cloud storage: Grant permission to OAuth 2.0 client

I'm trying to download a file from a Google Cloud Storage bucket via the REST API. But if I use the access_token of the OAuth 2.0 client which I have created, I get an "Insufficient Permission" error (it works with the access token of my Google account).
So, where in the Cloud Platform can I grant the OAuth 2.0 client access to the bucket from which I want to download the file?
Thanks
TL;DR - You're most likely missing the step where you request the right scopes when requesting your OAuth 2.0 access token. Please look at the supported scopes for the Google Cloud Storage APIs. Access tokens typically expire in 60 minutes, and you will need to use a refresh token to get a new access token when one expires.
Please read the Google Cloud Storage Authentication page for detailed information.
Scopes
Authorization is the process of determining what permissions an
authenticated identity has on a set of specified resources. OAuth uses
scopes to determine if an authenticated identity is authorized.
Applications use a credential (obtained from a user-centric or
server-centric authentication flow) together with one or more scopes
to request an access token from a Google authorization server to
access protected resources.
For example, application A with an access
token with read-only scope can only read, while application B with an
access token with read-write scope can read and modify data. Neither
application can read or modify access control lists on objects and
buckets; only an application with full-control scope can do so.
Authentication in Google Cloud
Google Cloud services generally provide 3 main modes of authentication:
End User Account credentials - here you authenticate as the end user directly, using their Google account or an OAuth 2.0 access token. When requesting an access token, you will need to provide the scopes, which determine which APIs are accessible to the client using that access token. These OAuth 2.0 credentials, if granted the right scopes, can access the user's private data. In addition, Cloud IAM lets you control fine-grained permissions by granting roles to this user account.
Service Accounts - here you create a service account which is associated with a specific GCP project (and billed to that project thereby). These are mainly used for automated access from your code or from Google Cloud services like Compute Engine, App Engine, Cloud Functions, etc. You can create service accounts using Google Cloud IAM.
Each service account has an associated email address (which you specify when creating the service account), and you will need to grant the appropriate roles to this email address on your Cloud Storage buckets/objects. These credentials, if granted the right roles, can access the user's private data.
API keys - here you get an opaque string which is associated with a GCP project. It is supported by only a few Google Cloud APIs, and it is not possible to restrict the scope of API keys (unlike service accounts or OAuth 2.0 access tokens).
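To make the scopes point concrete, here is a small sketch that requests an access token restricted to read-only Storage access with the google-auth Python library (the key file name is a placeholder):

```python
import google.auth.transport.requests
from google.oauth2 import service_account

# read_only scope: the resulting token can read objects but not modify them.
SCOPES = ["https://www.googleapis.com/auth/devstorage.read_only"]

# "key.json" is a placeholder for your service account key file.
credentials = service_account.Credentials.from_service_account_file(
    "key.json", scopes=SCOPES
)

# Exchange the credential for a short-lived access token (~60 minutes),
# which can then be sent as a Bearer token to the Storage REST API.
credentials.refresh(google.auth.transport.requests.Request())
print(credentials.token)
```

A token minted with devstorage.read_write instead would also allow writes, and only devstorage.full_control can touch ACLs, matching the scope behavior quoted above.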

Hiding Buckets in Google Cloud Storage

We've just moved files off of a 10-year-old FTP server and are now using Google Cloud Storage for our company's files. This was set up to use the web-hosting feature of GCS, and the access-logging capability was also enabled.
The access logs are dumped into a bucket. I would like to deny access to this bucket, but allow everyone to continue using the main GCS bucket (we use Cyberduck for this).
We are presently allowing anybody with our company's email address to read/write to the buckets in this project, by giving the "Domain" the "Storage Admin" and "Storage Object Admin" roles. This was granted through the IAM permissions.

Google Cloud Storage authentication: restrict permissions without creating additional Google Accounts

I want to back up multiple servers using Google Cloud Storage. Each server needs access to the bucket containing its backups. A server should not have access to the bucket of any other server.
I have used Amazon S3 before and simply created one user per server in IAM and assigned a policy to the user that allows access to a specific bucket.
On Google Cloud Storage it seems like authentication is based on Google accounts and OAuth 2.0, so every server uses the same Google account (mine) and the result is that every server has full access to all buckets.
How can I give each server its own access credentials (with access to its own bucket only) without the need to create a new Google account for each server?
You can create a service account for each bucket and use it to authenticate access to the storage bucket from your Google Compute Engine instance. You can limit access to your bucket by changing the bucket ACLs to allow access to that service account only.
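As a minimal sketch of the per-server layout (the key file path and bucket name are placeholders), each server authenticates with only its own service account key and therefore only sees its own bucket:

```python
from google.cloud import storage

# Each server ships with only its own service account key file (placeholder path),
# so the credentials on server A cannot reach any other server's bucket.
client = storage.Client.from_service_account_json("/etc/backup/server-a-key.json")

# Upload a backup to the bucket this service account was granted on.
bucket = client.bucket("backups-server-a")
blob = bucket.blob("2021-01-01.tar.gz")
blob.upload_from_filename("/var/backups/2021-01-01.tar.gz")

# The same call against another server's bucket would fail with 403 Forbidden.
```

This mirrors the one-IAM-user-per-server pattern from S3: one service account per server, each granted on exactly one bucket.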