I use a GCS bucket to upload content, and I am distributing a script to my users that helps them download the content from the GCS bucket to their local directory. Each of the users is also a GCP project owner.
How do I set permissions on the GCS bucket so that only selected GCP projects can access its contents?
Thanks
To give users in a different project access to a Cloud Storage bucket, you can edit the bucket permissions and add the "Owners", "Editors", or "Viewers" of that project to the bucket ACLs. For example, if you want the owners of project A to have full access to a bucket in project B, you can change the bucket's permissions and add the project owners of project A with the Owner permission. By doing that, all owners of project A will have full control over the bucket in project B.
Note: If you change the bucket's default object ACL, the change only applies to new objects uploaded to the bucket. Objects already in the bucket keep their old ACLs.
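For example (a rough sketch with placeholder values; 123456789012 stands for project A's project number and your-bucket for the bucket in project B), you could do this with gsutil:
# Grant project A's owners full control of the bucket
gsutil acl ch -p owners-123456789012:O gs://your-bucket
# Update the default object ACL so that newly uploaded objects inherit the same access
gsutil defacl ch -p owners-123456789012:O gs://your-bucket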
You can read more about bucket and object ACLs at this link [1]:
[1] https://cloud.google.com/storage/docs/accesscontrol
Related
How to transfer GCS bucket from one account to another account without downloading data
Is Transfer Service for Cloud Data Chargeable?
You don't transfer a GCS bucket from one account to another; a GCS bucket belongs to a project.
You can grant new users access on the project, or only on the bucket, to allow them to use it. You can also create another bucket, in another project, with another name (project IDs and bucket names are global resources; no two can have the same name anywhere in the world) and use the Storage Transfer Service to duplicate the data. The service is free of charge if the data stays in the same region (if not, egress costs will apply).
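As a rough sketch (bucket names, project ID, and the user address below are placeholders, and the gcloud transfer command assumes a reasonably recent gcloud version):
# Grant a user access on only the bucket, not the whole project
gsutil iam ch user:alice@example.com:objectViewer gs://source-bucket
# Or create a new bucket with a different name in the other project
gsutil mb -p other-project-id -l us-central1 gs://new-bucket-name
# and duplicate the data with the Storage Transfer Service
gcloud transfer jobs create gs://source-bucket gs://new-bucket-name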
I am trying to create a transfer job in Data Transfer to copy all files from a bucket belonging to one account to an existing bucket belonging to another account.
I have access to both source and destination buckets and I get a "green light" in the wizard, but when I try to run the transfer job I get the following error message:
To complete this transfer, you need the 'storage.buckets.setIamPolicy'
permission for the source bucket. Ask the bucket's administrator to
grant you the required permission and try again.
I have tried applying various roles to the user running the transfer job, but I can't figure out how to overcome this problem.
Can anyone help me on this?
The storage.buckets.setIamPolicy permission can be granted with either the roles/storage.legacyBucketOwner or the roles/iam.securityAdmin role. It may be needed so that the transfer can keep the permissions applied to the source objects.
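For example, if the transfer runs under the Storage Transfer Service agent of the destination project, the grant on the source bucket could look like this (a sketch only; the project number and the exact principal are assumptions, so check the transfer job details for the account actually being used):
# Grant the legacy bucket owner role on the source bucket to the account running the transfer
gsutil iam ch serviceAccount:project-123456789012@storage-transfer-service.iam.gserviceaccount.com:roles/storage.legacyBucketOwner gs://source-bucket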
Permissions for copying an object:
storage.objects.create (for the destination bucket)
storage.objects.delete (for the destination bucket)
storage.objects.get (for the source object)
storage.objects.getIamPolicy (for the source object)
storage.objects.setIamPolicy (for the destination bucket)
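If none of the predefined roles fits, one option is a custom role containing exactly these permissions (a sketch; the role ID, title, and project ID are placeholders):
gcloud iam roles create objectCopier --project=your-project-id \
  --title="Object copier" \
  --permissions=storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.getIamPolicy,storage.objects.setIamPolicy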
Please see:
Cloud IAM > Documentation > Understanding roles
Cloud Storage > Documentation > Reference > Cloud IAM roles
I am trying to create a GCS bucket using this link: https://cloud.google.com/storage/docs/creating-buckets, but when I click on "Open The Cloud Storage Browser", it asks me to select a project. I want the bucket content to be publicly available. Any idea how I can do that? I don't want my bucket to be associated with a project; instead, it should be in a global namespace. TIA.
Creating a bucket requires specifying a project.
Making the bucket content publicly accessible is a separate matter - you can set ACLs or IAM policies on the bucket that grant public access.
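For example (a sketch with placeholder project and bucket names), you could create the bucket in a project and then open it up to everyone:
# Create the bucket (a project is required)
gsutil mb -p your-project-id gs://your-unique-bucket-name
# Make all objects in the bucket publicly readable
gsutil iam ch allUsers:objectViewer gs://your-unique-bucket-name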
I have a local Active Directory server and a few shared folders with permissions. The users created in Active Directory have accounts, and those files are saved in Google Cloud Storage. I would like to know whether there is a way to use the files from Google Cloud so that the admin can manage the permissions and users from Active Directory. If not, what other solutions can I use?
Thanks!
A project and a Google Group have been set up for controlling data access, following the DCM guide: https://support.google.com/dcm/partner/answer/3370481?hl=en-GB&ref_topic=6107456
The project does not contain the bucket I want to access (under Storage -> Cloud Storage), since it's a Google-owned bucket to which I only have read-only access. I can see the bucket in my browser, since my Google account is allowed to access it (I am a member of the ACL).
I used the gsutil tool to configure the service account of the project that was linked with the private bucket using
gsutil config -e
but when I try to access that private bucket with
gsutil ls gs://<bucket_name>
I always get 403 errors, and I don't know why that is. Has anyone tried this before? Any ideas are welcome.
Since the bucket is private and in project A, service accounts in your project (project B) will not have access. The service account for your project (project B) would need to be added to the ACL for that bucket.
Note that since you have read access to this bucket as a user, you can run gsutil config to authenticate gsutil with your user credentials and use those to read the bucket.
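As a sketch (the service account address is a placeholder, and the bucket owner has to run the first command, since only they can change the bucket's ACL):
# Bucket owner: add the service account of project B as a reader on the bucket
gsutil acl ch -u your-service-account@your-project.iam.gserviceaccount.com:R gs://<bucket_name>
# Or, on your side, authenticate gsutil with your own user credentials instead
gsutil config
gsutil ls gs://<bucket_name>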