Do IAM permissions prevent viewing objects (images) by pasting their URLs? - google-cloud-storage

I have a basic chat application in Go, where users can create group threads and exchange messages and images that are stored inside these threads. Each thread is associated with its own Cloud Storage bucket, and the URLs of the images in a bucket are additionally saved in Postgres. Right now, with the help of this article, I've set the permissions so that only the users who belong to a certain thread can view the images stored in its corresponding bucket.
What I don't understand is this: if a user doesn't belong to a certain thread (and doesn't have view permissions for that bucket), can he still view the images inside it if he has their URLs?
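One way to sanity-check this empirically: unless an object (or its bucket) has been made public, an anonymous GET against its URL should be rejected. Below is a minimal Go sketch of that check; the bucket and object names are placeholders, not anything from the app above:

    package main

    import (
        "fmt"
        "net/http"
    )

    func main() {
        // A GCS object URL has the form https://storage.googleapis.com/<bucket>/<object>.
        // Bucket and object below are placeholders.
        url := "https://storage.googleapis.com/example-thread-bucket/cat.png"

        // Unauthenticated request, as if the URL were pasted into a browser.
        resp, err := http.Get(url)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        // 403 means IAM/ACLs blocked the anonymous request; 200 means the object is public.
        fmt.Println("status:", resp.StatusCode)
    }

A 403 here indicates that the permissions are doing their job against unauthenticated requests; a 200 would mean the object is effectively public.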

Related

Google Cloud Services not giving me permission to view a bucket I've just created?

I am an organisation of one person, just me. I've been using GCS with no problem for a few years. Today I've created a new bucket, and am currently using gsutil to populate it, with no obvious problems.
In the GCS web app I've just tried to click into the bucket via the Storage browser, just to verify it was being populated, and was told
Additional permissions required to list objects in this bucket: Ask a project or bucket owner to grant you 'storage.buckets.list' permissions (e.g. by giving your account the IAM Storage Object Viewer role).
Ok... but I created it? Whatever, I'll click on the menu button (three vertical dots) next to the bucket name and select Edit bucket permissions.
You do not have permission to view the permissions of the selected resource
Right...
Any ideas?!
Based on your comments, you figured it out. To reduce future guesswork, there is a really good reference for figuring out which roles grant which permissions.

How to grant/revoke an access to Cloud Object Storage resource automatically?

I have an iOS app. I would like to explore what needs to be done to achieve the following:
1) The user taps on the map
2) US Census Tract info is requested from the database
3) Later the user wants to purchase this tract info.
The US Census Tract info would be uploaded to Cloud Object Storage.
There are 70,000 tracts, grouped by US state (50 states + DC).
I could use SQL Query to select one Census Tract by its ID.
In the iOS App I can use Apple Login and get users' name and email.
The question is how to grant/revoke access to this info automatically after an in-app purchase.
The question is two-fold. Do I have to create 70,000 CSV files and grant access to each of them? Or can this be achieved dynamically with SQL?
The second part is - how to automate this process?
Does IBM Cloud have this capability?
I would expect that you would use a single Service ID that would have access to the data sitting in COS, and that a user's access to the underlying data would be handled in your application logic. The Cloud IAM access policies are not intended for end-users as much as for internal development/operations teams to manage access to various cloud resources.
Depending on the format of the census data, SQL Query could be a great way to do it. You could use SQL Query to create a new object containing the subset of data the user has requested, and then create a presigned URL that expires in whatever timeframe is reasonable, allowing the file to be downloaded to the client device.
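If the data lives in COS, that presigned GET URL can be minted from the backend once the purchase is verified. Here's a minimal sketch in Go using the S3-compatible SDK pattern (IBM's ibm-cos-sdk-go mirrors aws-sdk-go); the endpoint, region, bucket, object key, and HMAC credentials are all placeholders:

    package main

    import (
        "fmt"
        "time"

        "github.com/aws/aws-sdk-go/aws"
        "github.com/aws/aws-sdk-go/aws/credentials"
        "github.com/aws/aws-sdk-go/aws/session"
        "github.com/aws/aws-sdk-go/service/s3"
    )

    func main() {
        // COS is S3-compatible; endpoint, region, and credentials below are placeholders.
        sess := session.Must(session.NewSession(&aws.Config{
            Region:      aws.String("us-south"),
            Endpoint:    aws.String("https://s3.us-south.cloud-object-storage.appdomain.cloud"),
            Credentials: credentials.NewStaticCredentials("ACCESS_KEY_ID", "SECRET_ACCESS_KEY", ""),
        }))
        svc := s3.New(sess)

        // Build a GET request for the object produced for the purchased tract (placeholder names).
        req, _ := svc.GetObjectRequest(&s3.GetObjectInput{
            Bucket: aws.String("census-tracts"),
            Key:    aws.String("results/tract-12345.csv"),
        })

        // Presign the request so the client can download it for a limited time.
        url, err := req.Presign(15 * time.Minute)
        if err != nil {
            panic(err)
        }
        fmt.Println(url)
    }

Once the purchase is verified server-side, the app hands this URL to the iOS client; after it expires, the object can no longer be fetched without minting a new URL.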

How to share a bucket in Google Cloud Storage

Tried sharing a bucket with a colleague
Initially I added the "Storage.Object.Viewer" role, and sent the link https://console.cloud.google.com/storage/browser/bucket_name/
However on opening the link the following error was received:
You need the storage.objects.list permission to list objects in this
bucket. Ask a project or bucket owner to give you this permission and
try again.
I added more roles, and finally gave admin rights, but kept getting the same error.
How can I share a bucket with all its files? Specifically, I would like to share it with read-only permissions.
Although a solution has been found for this issue, I'm going to summarise some relevant information which may be useful to anyone who stumbles on a similar problem.
Project information isn't required in requests to storage buckets, because bucket names must be globally unique on Google Cloud Platform. If you specify a bucket name in any request, the request points to the correct bucket no matter which project it resides in, so permissions must still have been set up for a given user to access that bucket.
To allow users to list objects in a bucket, they must have been assigned a role with the storage.objects.list permission. Minimal roles that allow listing objects in buckets include:
Storage Object Viewer
Allows users to view objects and their metadata (except ACLs), and to list the objects in a bucket.
Project Viewer
This role also gives users permission to view other resources in the project. In terms of Cloud Storage, users can list buckets and view bucket metadata (excluding ACLs) when listing. This role can only be applied to a project.
There are other storage-specific roles that allow users to list objects in buckets while also granting additional authorisation, for example to create, edit, or delete objects. They include:
Storage Object Admin
Users have full control over objects, including listing, creating, viewing, and deleting objects.
Storage Admin
Users have full control of buckets and objects.
For more information on Cloud Storage IAM Roles please see here.
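For completeness, bucket-level grants of roles like the ones above can also be made programmatically. A minimal sketch with the Go client library (cloud.google.com/go/storage), where the bucket name and the colleague's email are placeholders:

    package main

    import (
        "context"
        "log"

        "cloud.google.com/go/storage"
    )

    func main() {
        ctx := context.Background()

        // Uses Application Default Credentials of an account allowed to edit bucket IAM.
        client, err := storage.NewClient(ctx)
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        bucket := client.Bucket("bucket_name") // placeholder bucket

        // Read-modify-write of the bucket's IAM policy.
        policy, err := bucket.IAM().Policy(ctx)
        if err != nil {
            log.Fatal(err)
        }
        policy.Add("user:colleague@example.com", "roles/storage.objectViewer") // placeholder member
        if err := bucket.IAM().SetPolicy(ctx, policy); err != nil {
            log.Fatal(err)
        }
    }

The same read-modify-write pattern works for the other bucket-level roles above (Project Viewer, as noted, applies only to a project).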
Assuming the Google account your colleague used to access the URL is the one you gave permissions to, you also need to grant the "Viewer" role at the project level; otherwise he won't be able to log in to the GCP console and access the bucket.

Google Cloud storage doesn't show the bucket in browser for a user who has access to it

In our project, we have a group of people who should have full access to ONLY one bucket; they should not see other buckets or the objects in those buckets.
So, I changed the permissions on the bucket and added the users as Storage Admin for that specific bucket (not for the whole project).
In this case, when they use the Console's Storage browser they are shown a permissions error (a screenshot is in the original post).
But when they open Cloud Shell and use gsutil, they can access the bucket's objects (and have no access to other buckets).
Is this a bug in the Console/Storage interface?
This is not a bug, but it is a subtlety of the Console. In order to access a bucket from the Console, you typically navigate to it using the browser, which is what it appears you attempted in the screenshot. This fails, though, because to do this you need permission to list buckets for a project, even if you otherwise have free rein to work within the bucket.
There are three ways to deal with this:
1) Give your users the Viewer permission for the project that contains the bucket. There are pros and cons to this. I'd say it's probably not worth going this route (not so much because your users will see other buckets - the bucket namespace is publicly viewable anyway - but because doing so brings up some additional permission nuances you probably don't want to deal with).
2) Link directly to the desired bucket, thus avoiding the "listing buckets" portion of the Console. The URL for a bucket has the form: console.cloud.google.com/storage/browser/[BUCKET_NAME]. I believe this will work without any additional modifications to your permissions.
3) Create a custom role that contains only the storage.buckets.list permission, and grant that role on the project to the affected users (see the sketch below).
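For option 3, the custom role can be created through the IAM API. A rough sketch using the Go client for the IAM REST API (google.golang.org/api/iam/v1); the project ID and role ID are placeholders:

    package main

    import (
        "context"
        "fmt"
        "log"

        iam "google.golang.org/api/iam/v1"
    )

    func main() {
        ctx := context.Background()

        svc, err := iam.NewService(ctx) // uses Application Default Credentials
        if err != nil {
            log.Fatal(err)
        }

        // Create a project-level custom role holding only storage.buckets.list.
        role, err := svc.Projects.Roles.Create("projects/my-project", &iam.CreateRoleRequest{
            RoleId: "bucketLister", // placeholder role ID
            Role: &iam.Role{
                Title:               "Bucket Lister",
                Description:         "Can list buckets in the project, nothing else.",
                IncludedPermissions: []string{"storage.buckets.list"},
                Stage:               "GA",
            },
        }).Do()
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("created", role.Name)
    }

After the role exists, it still has to be bound to the affected users at the project level (for example via the project's IAM page), since listing buckets is a project-scoped operation.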

Access Control List of Google Cloud Storage for huge number of users

I am storing images of one user (the owner) in a Google Cloud Storage bucket. I want to grant read permission for these images to a group of users (the owner's contacts). I am planning to use an Access Control List for this purpose; e.g., the owner will have full permission on his bucket and the contacts will have read permission on the images. There is a chance that the owner will have a very large number of contacts, say 1 million.
So,
Will there be any performance issue if the ACL contains a huge number of users?
Is this the right approach for access control, or should I consider signed URLs?
Regards, Remya
This approach is not going to work for you. There are some significant limitations and downsides to trying to serve content like this. First and foremost, there is a limit of 100 ACL entries on a given object. You could get around this by granting permission to a group for which every user was a member, but even so, it still means that viewing the images will require that every user be logged in to their Google account in addition to however they authenticate for your site.
The canonical way to accomplish this would be to keep all images private and owned by your site's own account. When a user loads a page, verify however you like that they have appropriate authorization to view the images, and if so, generate signed URLs for the images. This allows you to use any authorization scheme without limitation while serving images directly from GCS.
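As a sketch of what that could look like in Go, assuming the site's service-account JSON key is available to the backend (the key path, bucket, and object names are placeholders):

    package main

    import (
        "fmt"
        "log"
        "os"
        "time"

        "cloud.google.com/go/storage"
        "golang.org/x/oauth2/google"
    )

    func main() {
        // The site's own service account owns/reads the private objects (placeholder key path).
        jsonKey, err := os.ReadFile("service-account.json")
        if err != nil {
            log.Fatal(err)
        }
        conf, err := google.JWTConfigFromJSON(jsonKey)
        if err != nil {
            log.Fatal(err)
        }

        // After your own authorization check passes, mint a short-lived GET URL for the image.
        url, err := storage.SignedURL("images-bucket", "users/42/photo.jpg", &storage.SignedURLOptions{
            GoogleAccessID: conf.Email,
            PrivateKey:     conf.PrivateKey,
            Method:         "GET",
            Expires:        time.Now().Add(15 * time.Minute),
        })
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(url)
    }

The signature is validated entirely on the GCS side, so the end user never needs a Google account of their own, which is the point of the answer above.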