How to share a bucket in Google Cloud Storage - google-cloud-storage

Tried sharing a bucket with a colleague
Initially I added the "Storage Object Viewer" role, and sent the link https://console.cloud.google.com/storage/browser/bucket_name/
However, on opening the link, the following error was received:
You need the storage.objects.list permission to list objects in this
bucket. Ask a project or bucket owner to give you this permission and
try again.
I added more roles, and finally gave admin rights, but kept getting the same error.
How can I share a bucket with all its files? Specifically, I would like to share it with read-only permissions.

Although a solution has been discovered for this issue, I'm going to summarise some relevant information which may be useful to someone who stumbles on a similar issue.
Project information isn't required in requests to storage buckets, because bucket names must be globally unique across Google Cloud Platform. If you specify a bucket name in any request, the request reaches the correct bucket no matter which project it resides in, provided permissions for the requesting user have been set up on that bucket.
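Because only the bucket name addresses the request, a request URL can be built from the bucket name alone. A minimal sketch (the endpoint form is the JSON API v1 one; the bucket name is a placeholder):

```python
# Bucket names are globally unique, so a request URL is built from the
# bucket name alone -- no project identifier appears anywhere in it.
def object_list_url(bucket_name: str) -> str:
    """JSON API v1 URL for listing the objects in a bucket."""
    return f"https://storage.googleapis.com/storage/v1/b/{bucket_name}/o"

print(object_list_url("my-bucket"))
# https://storage.googleapis.com/storage/v1/b/my-bucket/o
```

Whether such a request succeeds then depends only on the caller's permissions on that bucket, not on which project the caller belongs to.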
To allow users to list objects in a bucket, they must be assigned a role with the storage.objects.list permission. Minimal roles that allow listing objects in buckets include:
Storage Object Viewer
Which allows users to view objects and their metadata, except for ACLs. They can also list the objects in a bucket.
Project Viewer
This role also grants users permission to view other resources in the project. In terms of Cloud Storage, users can list buckets, and can view bucket metadata (excluding ACLs) when listing. This role can only be applied to a project.
There are other storage-specific roles that allow users to list objects in buckets while also granting further permissions, for example to edit, create, or delete objects. They include:
Storage Object Admin
Users have full control over objects, including listing, creating, viewing, and deleting objects.
Storage Admin
Users have full control of buckets and objects.
For more information, see the Cloud Storage IAM Roles documentation.
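As a rough sketch, the roles above can be reduced to a toy lookup that answers only the question in this thread: does a set of granted roles include storage.objects.list? The role IDs are the standard identifiers for the named roles; the real role definitions carry many more permissions than this.

```python
# Toy mapping for the single permission discussed above; the authoritative
# permission lists live in the Cloud Storage IAM documentation.
ROLES_WITH_OBJECT_LIST = {
    "roles/storage.objectViewer",  # Storage Object Viewer
    "roles/storage.objectAdmin",   # Storage Object Admin
    "roles/storage.admin",         # Storage Admin
}

def can_list_objects(granted_roles: set) -> bool:
    """True if any granted role carries storage.objects.list."""
    return bool(granted_roles & ROLES_WITH_OBJECT_LIST)

print(can_list_objects({"roles/storage.objectViewer"}))  # True
```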

Assuming the Google account your colleague used to access the URL is the one you granted permissions to, you also need to grant the "Viewer" role at the project level; otherwise they won't be able to log in to the GCP console and access the bucket.

Related

Google Cloud Services not giving me permission to view a bucket I've just created?

I am an organisation of one person, just me. I've been using GCS with no problem for a few years. Today I've created a new bucket, and am currently using gsutil to populate it, with no obvious problems.
In the GCS web app I've just tried to click into the bucket via the Storage browser, just to verify it was being populated, and was told
Additional permissions required to list objects in this bucket: Ask a project or bucket owner to grant you 'storage.buckets.list' permissions (e.g. by giving your account the IAM Storage Object Viewer role).
Ok... but I created it? Whatever, I'll click on the menu button (three vertical dots) next to the bucket name and select Edit bucket permissions.
You do not have permission to view the permissions of the selected resource
Right...
Any ideas?!
You figured it out based on your comments. To reduce future guesswork, there is a really good reference for working out which roles grant which permissions.

GSuite Permissions on Google Cloud Storage

Initial Question
I'm trying to do something that I think is somewhat simple, but I can't seem to get it nailed down correctly. I've been trying to create a bucket on GCS that is accessible to anyone in my GSuite organization, but not the larger internet.
I've created an org@mydomain.com group and added all users. I then granted that group permission to view the file in the bucket, but it always says access denied. If the file is marked public then it's accessible without issue. How do I get this set up?
Additional Information
I have transferred the project and bucket to my organization
I have setup the index and 404 pages
If marked public, everything works as expected
When I check the permissions of individual files, I don't see anything inherited or more specific than the general project security settings.
I added the Storage Object Viewer permission to the bucket for my org@domain.com group
When trying to access a file, I get the following response:
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
Anonymous caller does not have storage.objects.get access to compliance.microcimaging.com/test_xray.jpg.
</Details>
</Error>
So, thinking that it might be thinking I was using a different account, I opened an Incognito Window, logged in as my organization, then attempted to access. That gave me the same message.
I tried adding the org@domain.com user to a single file, which resulted in the same error. I then attempted to add my personal username to the file, which resulted in the same error.
Permission errors have got to be the MOST BORING errors!
Seeing that you already created a Google group you can accomplish this quite easily.
On Google Cloud Platform Console go to "Storage -> Browser", and on your bucket, on the menu on the right select "edit bucket permissions".
Under "Add members", enter org@mydomain.com and assign the role "Storage -> Storage Object Viewer" to give the whole group read-only access when authenticated, or any other permission combination you need.
Alternatively, see this documentation about how to set IAM policies on a GSuite domain, so you can skip the group part entirely and set access control policies on Google Cloud products for your domain as a whole.
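The console steps above amount to appending one binding to the bucket's IAM policy. A sketch of that policy structure as plain data (the group address is the one from the question; a real policy would be fetched from and saved back through the API):

```python
def add_binding(policy: dict, role: str, member: str) -> dict:
    """Append `member` to the binding for `role`, creating the binding if needed."""
    for binding in policy.setdefault("bindings", []):
        if binding["role"] == role:
            if member not in binding["members"]:
                binding["members"].append(member)
            return policy
    policy["bindings"].append({"role": role, "members": [member]})
    return policy

# "group:" is the IAM member-type prefix for Google groups.
policy = {"bindings": []}
add_binding(policy, "roles/storage.objectViewer", "group:org@mydomain.com")
```

Note that the helper is idempotent: granting the same role to the same group twice leaves a single entry, which matches how the console behaves.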

Google Cloud storage doesn't show the bucket in browser for a user who has access to it

In our project, we have a group of people who should have full access to ONLY one bucket; they should not see the other buckets or the objects in them.
So, I changed the permissions of the bucket and added the users as Storage Admin for that specific bucket (not for the whole project).
In this case, when they use the Console's Storage browser, they see the following message:
But when they open Cloud Shell and use gsutil, they can access the bucket's objects (and have no access to other buckets).
Is this a bug in the Console's Storage interface?
This is not a bug, but it is a subtlety of the Console. To access a bucket from the Console, you typically navigate to it using the Browser, which is what you appear to attempt in the screenshot. This fails, though, because doing so requires permission to list the buckets of a project, even if you otherwise have free rein to work within the bucket.
There are three ways to deal with this:
1) Give your users the Viewer permission on the project that contains the bucket. There are pros and cons to this. I'd say it's probably not worth going this route, not so much because your users will see other buckets (the bucket namespace is publicly viewable anyway), but because doing so brings up additional permission nuances you probably don't want to deal with.
2) Link directly to the desired bucket, thus avoiding the "listing buckets" portion of the Console. The URL for a bucket has the form: console.cloud.google.com/storage/browser/[BUCKET_NAME]. I believe this will work without any additional modifications to your permissions.
3) Create a custom role that only contains the storage.buckets.list permission, and use that role on the project for affected users.
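Option 2 is just string construction; a minimal sketch (the bucket name is a placeholder, and quote() guards against characters that would need escaping):

```python
from urllib.parse import quote

def console_bucket_url(bucket_name: str) -> str:
    """Direct Cloud Console link to one bucket, bypassing the bucket-list page."""
    return f"https://console.cloud.google.com/storage/browser/{quote(bucket_name)}"

print(console_bucket_url("my-team-bucket"))
# https://console.cloud.google.com/storage/browser/my-team-bucket
```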

Idiomatic Way to Secure a Google Storage Bucket

My objective is to grant read-write permissions on a Google Storage Bucket to a Compute Instance Template in a way that grants only the permissions that are necessary, but I'm confused about what's considered idiomatic in GCP given the many access control options for Google Storage Buckets.
Currently, I am creating a Managed Instance Group and a Compute Instance Template and assigning the following scopes:
https://www.googleapis.com/auth/userinfo.email
https://www.googleapis.com/auth/compute.readonly
https://www.googleapis.com/auth/devstorage.read_write
to the default Service Account on the Compute Instance. This seems to work fine, but given the link above, I'm wondering if I should explicitly set the Access Control List (ACL) on the Storage Bucket to private as well? But that same page also says "Use ACLs only when you need fine-grained control over individual objects," whereas in this case I need a coarse-grained policy. That makes me wonder if I should use an IAM Permission (?) but where would I assign that?
What's the idiomatic way to configure this?
It turns out the key documentation here is the Identity and Access Management overview for Google Cloud Storage. From there, I learned the following:
GCS Bucket ACLs specify zero or more "entries", where each entry grants a permission to some scope such as a Google Cloud User or project. ACLs are now considered a legacy method of assigning permissions to a bucket because they only allow the coarse-grained permissions READER, WRITER, and OWNER.
The preferred way to assign permissions to all GCP resources is to use an IAM Policy (overview). An IAM Policy is attached to either an entire Organization, a Folder of Projects, a specific Project, or a specific Resource and also specifies one or more "entries" where each entry grants a role to one or more members.
With IAM Policies, you don't grant permissions directly to members. Instead, you declare which permissions a role has, and grant members a role.
Ultimately, the hope is that you assign IAM Policies at the appropriate level of the hierarchy, knowing that lower levels of the hierarchy (like individual resources) inherit the permissions declared by the IAM Policies at higher levels (like at the Project level).
Based on this, I conclude that:
You should try to assign permissions to a GCS Bucket by assigning IAM Policies at the right level of the hierarchy.
However to limit permissions on a per-object basis, you must use ACLs.
When a Bucket is newly created, unless you specify otherwise, it is assigned the default canned ACL of projectPrivate.
As of this answer, Terraform does not yet have mature support for IAM Policies and the google_storage_bucket_acl resource represents an interface to a legacy approach to securing a Bucket.
Caveat: I'm only summarizing the docs here and have very limited practical experience with Google Cloud so far! Any corrections to above are welcome.
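The inheritance model summarized above can be sketched with plain data: a member's effective roles on a resource are the union of the bindings in every IAM Policy along the path from the organization down to the resource itself. The policies and member below are made-up examples.

```python
def effective_roles(member: str, policies_along_path: list) -> set:
    """Union of roles granted to `member` by each IAM policy on the path
    from the organization down to the resource itself."""
    roles = set()
    for policy in policies_along_path:
        for binding in policy.get("bindings", []):
            if member in binding["members"]:
                roles.add(binding["role"])
    return roles

# A project-level grant and a bucket-level grant both apply to the bucket.
project_policy = {"bindings": [
    {"role": "roles/viewer", "members": ["user:dev@example.com"]},
]}
bucket_policy = {"bindings": [
    {"role": "roles/storage.objectAdmin", "members": ["user:dev@example.com"]},
]}
roles = effective_roles("user:dev@example.com", [project_policy, bucket_policy])
# roles now contains both the project-level and bucket-level grants
```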

How to setup granular privileges on GCS?

I am working on a project where I would like a developer to have access to read/write to GCS, but not necessarily have access to uploading code in App Engine. I don't see options in the web console for specifying rights access. How can I setup specific privileges that I'd like a user to have? Thanks.
Basically, if you want a team member not to be allowed to deploy the application or modify and configure its resources, he or she must have only the "can View" access level on the project.
Then you have to set the respective permission (WRITE) on the bucket, scoped to a "Google account email address" (in your case, the email address of your developer).
As the GCS documentation says, there are three ways to specify ACLs for buckets and objects, using:
The acl query string parameter, to specify ACLs for certain scopes
The x-goog-acl request header, to specify predefined ACLs
The defaultObjectACL query string parameter, to change the default object ACL for all objects in a certain bucket
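A sketch of the second method, the x-goog-acl request header, which applies a predefined (canned) ACL at upload time. The request is only constructed here, not sent; the bucket and object names are placeholders.

```python
def upload_request(bucket: str, obj: str, canned_acl: str = "private"):
    """Build the XML API upload URL and headers for a canned ACL."""
    url = f"https://storage.googleapis.com/{bucket}/{obj}"
    # Canned ACL names are the documented predefined ACLs,
    # e.g. private, public-read, bucket-owner-read.
    headers = {"x-goog-acl": canned_acl}
    return url, headers

url, headers = upload_request("my-bucket", "report.pdf")
```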