I'm trying to use gsutil to list all the buckets I've been granted access to, but it seems gsutil only lists buckets that my project has been granted access to. Any ideas?
There is not currently a way to do this.
gsutil (and the API) can only list buckets within a single project per call.
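For example, a plain gsutil ls only shows buckets for your default project; to see buckets in another project you have access to, you have to name that project explicitly (PROJECT_ID is a placeholder):

# Buckets in the default project configured for gsutil/gcloud
gsutil ls

# Buckets in one specific project you can access (repeat per project)
gsutil ls -p PROJECT_ID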
Related
I am an organisation of one person, just me. I've been using GCS with no problem for a few years. Today I've created a new bucket, and am currently using gsutil to populate it, with no obvious problems.
In the GCS web app I've just tried to click into the bucket via the Storage browser, just to verify it was being populated, and was told
Additional permissions required to list objects in this bucket: Ask a project or bucket owner to grant you 'storage.buckets.list' permissions (e.g. by giving your account the IAM Storage Object Viewer role).
Ok... but I created it? Whatever, I'll click on the menu button (three vertical dots) next to the bucket name and select Edit bucket permissions.
You do not have permission to view the permissions of the selected resource
Right...
Any ideas?!
Based on your comments, you figured it out. To reduce future guesswork, there is a really good reference for figuring out which permissions each role grants.
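As a quick check from the command line, you can also ask gcloud to describe a predefined role and list the permissions it bundles (the role shown here is just an example):

# Lists the permissions included in the Storage Object Viewer role
gcloud iam roles describe roles/storage.objectViewer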
Tried sharing a bucket with a colleague
Initially I added the "Storage Object Viewer" role, and sent the link https://console.cloud.google.com/storage/browser/bucket_name/
However on opening the link the following error was received:
You need the storage.objects.list permission to list objects in this
bucket. Ask a project or bucket owner to give you this permission and
try again.
I added more roles, and finally gave admin rights, but kept getting the same error.
How can I share a bucket with all its files? Specifically, I would like to share it with read-only permissions.
Although a solution has already been found for this issue, I'm going to summarise some relevant information that may be useful to someone who stumbles on a similar problem.
Project information isn't required in requests to storage buckets, because bucket names must be globally unique across Google Cloud Platform. If you specify a bucket name in any request, the request points to the correct bucket no matter which project it resides in, provided permissions to access that bucket have been set up for the user in some capacity.
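For example (bucket name is a placeholder), the same command works regardless of which project is currently configured, as long as the account running it has been granted access to that bucket:

# The bucket name alone identifies the bucket globally; no project needs to be specified
gsutil ls gs://some-globally-unique-bucket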
To allow users to list objects in a bucket, they must be assigned a role that includes the storage.objects.list permission. The minimal roles that allow listing objects in a bucket include:
Storage Object Viewer
Allows users to view objects and their metadata (except ACLs), and to list the objects in a bucket.
Project Viewer
This role also gives users permission to view other resources in the project. In terms of Cloud Storage, users can list buckets and view bucket metadata (excluding ACLs) when listing. This role can only be applied at the project level.
There are other storage-specific roles that allow users to list objects in buckets while also granting further permissions, for example to create, edit, or delete objects. They include:
Storage Object Admin
Users have full control over objects, including listing, creating, viewing, and deleting objects.
Storage Admin
Users have full control of buckets and objects.
For more information, see the Cloud Storage IAM Roles documentation.
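As a concrete sketch (the email address and bucket name are placeholders), granting a colleague read-only access to just one bucket can be done with gsutil's IAM commands:

# Grant Storage Object Viewer on one bucket only
gsutil iam ch user:colleague@example.com:objectViewer gs://bucket_name

# Verify which members hold which roles on the bucket
gsutil iam get gs://bucket_name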
Assuming the Google account your colleague used to open the URL is the one you granted permissions to, you also need to grant the "Viewer" role at the project level, otherwise they won't be able to log in to the GCP console and access the bucket.
In our project, we have a group of people who should have full access to ONLY one bucket; they should not see other buckets or the objects in those other buckets.
So I changed the permissions of the bucket and added the users as Storage Admin for that specific bucket (not for the whole project).
In this case, when they use the Console's Storage browser they are shown a permissions error message.
But when they open Cloud Shell and use gsutil, they can access the bucket's objects (but no other buckets).
Is this a bug in the Console's Storage interface?
This is not a bug, but it is a subtlety of the Console. To access a bucket from the Console, you typically navigate to it using the Browser, which appears to be what you attempted in the screenshot. This fails, though, because doing so requires permission to list buckets for a project, even if you otherwise have free rein to work within the bucket.
There are three ways to deal with this:
1) Give your users the Viewer role on the project that contains the bucket. There are pros and cons to this; I'd say it's probably not worth going this route, not so much because your users will see other buckets (the bucket namespace is publicly viewable anyway), but because doing so brings up some additional permission nuances you probably don't want to deal with.
2) Link directly to the desired bucket, thus avoiding the "listing buckets" portion of the Console. The URL for a bucket has the form: console.cloud.google.com/storage/browser/[BUCKET_NAME]. I believe this will work without any additional modifications to your permissions.
3) Create a custom role that only contains the storage.buckets.list permission, and use that role on the project for affected users.
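For option 3, a minimal sketch with gcloud might look like the following (the role ID, project ID, and user email are placeholders):

# Create a custom role containing only the bucket-listing permission
gcloud iam roles create bucketLister \
    --project=PROJECT_ID \
    --title="Bucket Lister" \
    --permissions=storage.buckets.list

# Bind the custom role to an affected user at the project level
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=user:someone@example.com \
    --role=projects/PROJECT_ID/roles/bucketLister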
I am storing images of one user (the owner) in a Google Cloud Storage bucket. I want to grant read permission on these images to a group of users (the owner's contacts). I am planning to use an Access Control List for this purpose; e.g., the owner will have full permission on their bucket and the contacts will have read permission on the images. There is a chance the owner will have a very large number of contacts, say 1 million.
So,
Will there be any performance issues if the ACL contains a huge number of users?
Will this be the right approach for access control? Or should I consider signed URLs?
Regards, Remya
This approach is not going to work for you. There are some significant limitations and downsides to trying to serve content like this. First and foremost, there is a limit of 100 ACL entries on a given object. You could get around this by granting permission to a group for which every user was a member, but even so, it still means that viewing the images will require that every user be logged in to their Google account in addition to however they authenticate for your site.
The canonical way to accomplish this would be to keep all images private and owned by your site's own account. When a user loads a page, verify however you like that they have appropriate authorization to view the images, and if so, generate signed URLs for the images. This allows you to use any authorization scheme without limitation while serving images directly from GCS.
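As a sketch of that last step, a short-lived signed URL can be generated with gsutil using a service account's private key (the key path, duration, and object name are placeholders); in practice your site's backend would do the equivalent through a client library when it renders the page:

# Generate a URL valid for 10 minutes for a single private object
gsutil signurl -d 10m /path/to/service-account-key.json gs://my-bucket/images/photo.jpg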
I am evaluating Google Cloud Storage for the following use case. I need to restrict my users (they do not have Gmail accounts) so they can access only their own files.
I know that can be done using gsutil signurl. But there are going to be lots of small files, and generating a signed URL for every file is crazy. So I'm wondering if there is a trick to provide access to a subfolder using a signed URL?
The documentation mentions that wildcards can be used. Does that mean it will generate many URLs, or one URL that applies to all files matched by the wildcard?
You should absolutely use per-object ACLs for this. Signed URLs might be more difficult to implement, and if you're already thinking of managing user accounts, you'll want to do that through OAuth 2.0 for Login anyway, so sending the user's Bearer token with any requests you make to the API comes as a bonus of handling user accounts this way. Read more about authentication with Cloud Storage in the Cloud Storage auth documentation.
Unlike the gsutil ls command, the signurl command does not support operations on sub-directories. For example, unless you have an object named some-directory/ stored inside the bucket some-bucket, the following command returns an error: gsutil signurl gs://some-bucket/some-directory/
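So a wildcard doesn't give you one URL covering a folder; it generates a separate signed URL for each matched object. A rough sketch (key file, bucket, and prefix are placeholders):

# Produces one signed URL per object matched by the wildcard, not a single folder-wide URL
gsutil signurl -d 1h /path/to/service-account-key.json gs://some-bucket/some-directory/*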