Getting 403 response when trying to change metadata of objects - google-cloud-storage

I'm the owner of the account and project.
I log in using the Google Cloud SDK and try the following command:
gsutil -m setmeta -h "Cache-Control:public, max-age=3600" gs://bucket/**/*.*
I get the following error for some of the files:
AccessDeniedException: 403 <owner@email.com> does not have storage.objects.update access to <filePath>
Most of the files are updated, but some are not. Because I have a lot of files, if 10% fail to update, that leaves a few gigabytes of data stale.
Any idea why this happens with an owner account and how to fix this?

If the Access Control on your bucket is set to Uniform, you need to grant permissions on it explicitly, even if you are the project owner.
For example:
I have a test file in a bucket, and when I tried to access it I got an "access required" popup.
I granted my Owner account the "Storage Object Admin" role in the Permissions tab of the bucket, and now I can access it freely.
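If you prefer the command line, a minimal sketch of the same grant with gsutil iam, using the account and bucket names from the question above:
gsutil iam ch user:owner@email.com:objectAdmin gs://bucket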
Here you have more info about Project Level Roles vs Bucket Level Roles.
Let me know.

Related

Can't remove OWNER access to a Google Cloud Storage object

I have a server that writes some data files to a Cloud Storage bucket, using a service account to which I have granted "Storage Object Creator" permissions for the bucket. I want that service account's permissions to be write-only.
The Storage Object Creator permission also allows read access, as far as I can tell, so I wanted to just remove the permission for the objects after they have been written. I thought I could use an ACL to do this, but it doesn't seem to work. If I use
gsutil acl get gs://bucket/object > acl.json
then edit acl.json to remove the OWNER permission for the service account, then use
gsutil acl set acl.json gs://bucket/object
to update the ACL, I find that nothing has changed; the OWNER permission is still there if I check the ACL again. The same thing happens if I try to remove the OWNER permission in the Cloud Console web interface.
Is there a way to remove that permission? Or another way to accomplish this?
You cannot remove the OWNER permission for the service account that uploaded the object. From:
https://cloud.google.com/storage/docs/access-control/lists#bestpractices
The bucket or object owner always has OWNER permission of the bucket or object.
The owner of a bucket is the project owners group, and the owner of an object is either the user who uploaded the object, or the project owners group if the object was uploaded by an anonymous user.
When you apply a new ACL to a bucket or object, Cloud Storage respectively adds OWNER permission to the bucket or object owner if you omit the grants.
I have not tried this, but you could upload the objects using one service account (call it SA1), then rewrite the objects using a separate service account (call it SA2), and then delete the originals. SA1 will no longer be the owner, and therefore won't have read permission. SA2 will continue to have both read and write permissions, though; there is no way to prevent the owner of an object from reading it.
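A rough sketch of that flow, assuming SA2's key has been downloaded to a hypothetical file sa2-key.json; copying and renaming makes SA2 the uploader, and hence the new OWNER:
gcloud auth activate-service-account --key-file=sa2-key.json
gsutil cp gs://bucket/object gs://bucket/object.tmp
gsutil mv gs://bucket/object.tmp gs://bucket/object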
Renaming the object does the trick.
gsutil mv -p gs://bucket/object gs://bucket/object-renamed
gsutil mv -p gs://bucket/object-renamed gs://bucket/object
The renamer service account will become the object OWNER.

Google Speech API returns 403 PERMISSION_DENIED

I have been using the Google Speech API to transcribe audio to text from my PHP app (using the Google Cloud PHP Client) for several months without any problem. But my calls have now started to return 403 errors with status "PERMISSION_DENIED" and message "The caller does not have permission".
I'm using the Speech API together with Google Storage. I'm authenticating with a service account and sending my audio data to Storage. That part works; the file gets uploaded. So I understand - but I might be wrong? - that "the caller" does not have permission to then read the audio data from Storage.
I've been playing with permissions through the Google Console without success. I've read the docs but am quite confused. The service account I am using (I guess this is "the caller"?) has owner permissions on the project. And everything used to work fine; I haven't changed a thing.
I'm not posting code because if I understand correctly my app code isn't the issue - it's rather my Google Cloud settings. I'd be grateful for any idea or clarifications of concepts!
Thanks.
Being an owner of the project doesn't necessarily imply that the service account has read permission on the object. It's possible that the object was uploaded by another account that specified a private ACL or similar.
Make sure that the service account has access to the object by giving it the right permissions on the entire bucket or on the specific object itself.
You can do so using gsutil acl. More information and additional methods may be found in the official documentation.
For instance, the following command gives READ permission on an object to your service account:
gsutil acl ch -u serviceAccount@domain.com:R gs://bucket/object
And this command gives READ permission recursively on every object in a bucket:
gsutil acl ch -r -u serviceAccount@domain.com:R gs://bucket
In Google Cloud Vision, when you create credentials with a service account key, you have to assign the service account a role; setting it to Owner gives it full permissions.
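A hedged sketch of that setup with gcloud (the account name, key file, and PROJECT_ID are placeholders; Owner is broader than most callers need, so a narrower role may be preferable):
gcloud iam service-accounts create vision-caller --display-name="Vision caller"
gcloud projects add-iam-policy-binding PROJECT_ID --member="serviceAccount:vision-caller@PROJECT_ID.iam.gserviceaccount.com" --role="roles/owner"
gcloud iam service-accounts keys create key.json --iam-account="vision-caller@PROJECT_ID.iam.gserviceaccount.com"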

Google Cloud Storage ACL confusion

I'm the owner of a Google Cloud project with a Google Cloud Storage bucket inside it. All our backups are moved to this bucket. When I try to retrieve some of the backups, I get a permission-denied error. I'm not able to do anything but list the bucket.
When I try to reset the bucket ACL with
gsutil acl ch -u xxx@yyy.zzz:FC gs://abc/**
I get the following error:
CommandException: Failed to set acl for gs://abc/1234.sql. Please ensure you have OWNER-role access to this resource.
Which makes no sense, since I'm the project and bucket owner.
I gave myself the "Storage Admin" and "Storage Object Admin/Creator/Viewer" IAM roles, and I'm now able to access all files.
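For reference, the equivalent project-level grant from the command line might look like this (PROJECT_ID is a placeholder; the email is the one from the question):
gcloud projects add-iam-policy-binding PROJECT_ID --member="user:xxx@yyy.zzz" --role="roles/storage.admin"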
I gave "Storage Legacy Bucket Owner" permission to the owner account at bucket level works for me. Following command will add Owners of project and Viewers of project as a "Storage Legacy Bucket Owner" to the bucket.
$gsutil acl ch -p viewers-yourprojectnumber:O gs://test_buk04

gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access the Storage with the help of a Service Account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket I would like it to copy a file to:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to look for is to make sure you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default Compute Engine service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. This is not typically a problem with user-created service accounts, since those get all scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
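If you'd rather script this than use the console, something along these lines should work (instance name and zone are placeholders); note that the instance has to be stopped first:
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances set-service-account my-vm --zone=us-central1-a --scopes=storage-rw
gcloud compute instances start my-vm --zone=us-central1-a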
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then browse to the URL it provides, click "Allow", then copy the verification code and paste it into the terminal.
Stop the VM, then go to VM instance details. Under "Cloud API access scopes", select "Allow full access to all Cloud APIs" and click "Save". Restart the VM and delete ~/.gsutil.
I have written this as an answer since I cannot post comments:
In some cases, this error can also occur if you're running the gsutil command with a sudo prefix.
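Credentials are stored per user, so root does not see the credentials configured for your normal user. If you really must run gsutil under sudo, one option (an assumption on my part, not the only fix) is to authenticate root separately:
sudo gcloud auth login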
After you have created the bucket, go to the Permissions tab, add your email, and set the Storage Admin permission.
Then access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
So I tried a bunch of things while trying to copy from a GCS bucket to my VM.
Hope this post helps someone.
Over an SSH connection, I ran this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following the "Activating the API" section mentioned in this link:
https://cloud.google.com/storage/docs/json_api/
Once I had activated the API, I authenticated myself in the SSH window via
gcloud auth login
After the authentication procedure, I was finally able to download from the Google Storage bucket to my VM.
PS: I did make sure to:
1. Verify that gsutil is installed on my VM instance.
2. Go to my bucket's Permissions tab and add the desired service accounts with the Storage Admin permission / role.
3. Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance, go to its edit page, open "Cloud API access scopes", and choose "storage full access" or read/write, whatever you need it for.
Changing the service account and access scopes for an instance: If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine Default Service Account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
Change the permissions of the bucket: add an entry for "allUsers" and give it "Storage Admin" access. (Note that this makes the bucket publicly accessible, so only do it for data you are happy to expose.)

How to grant read permission on Google Cloud Storage to another service account

Our team creates some data on Google Cloud Storage so the other team can copy/download/read it from there, but when they tried, they always got a 403 Forbidden message. I tried to edit the permissions on that bucket and added a new entry as 'Project', 'viewers-(other team's project id)', and 'Reader', but they still got the same error when they ran this command:
gsutil cp -R gs://our-bucket gs://their-bucket
I also tried with their client ID and email account; still the same.
I'm not sure one can define another project's collection of users with a given access right (readers, in this case) and apply it to an object in a different project.
An alternative would be to control bucket access via Google Groups: simply set up a group for readers, adding the users you wish to grant this right to. Then you can use that group to control access to the bucket and/or its contents. Further information and a use-case scenario are here: https://cloud.google.com/storage/docs/collaboration#group
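For example, once the group exists, granting it read access on the bucket from the question might look like this (the group address is hypothetical):
gsutil acl ch -g data-readers@googlegroups.com:R gs://our-bucket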
Try:
gsutil acl ch -u serviceaccount@google.com:R gs://your-bucket
Here ch changes the ACL on your-bucket, -u specifies the user serviceaccount@google.com, and :R grants Reader access.