gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access the Storage with the help of a Service Account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to grant it the Write ACL on the bucket I would like it to copy a file to:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?

One other thing to check is that you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned the devstorage scopes in order to access GCS.
For example, if you had created your VM with the devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to it. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default compute service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. If you use a user-created service account, this is typically not a problem, since user-created service accounts get all scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
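To verify which scopes a VM actually has before digging further, a quick sketch (the instance name and zone below are placeholders):
# List the scopes attached to the instance
gcloud compute instances describe my-vm --zone=us-central1-a --format="value(serviceAccounts[].scopes)"
If devstorage.read_only is the only storage scope listed, writes will fail no matter what IAM permissions the account has.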

You have to log in with an account that has the permissions you need for that project:
gcloud auth login

gsutil config -b
Then open the URL it provides in a browser, click Allow, copy the verification code, and paste it into the terminal.

Stop the VM.
Go to the VM instance details page and click Edit.
In "Cloud API access scopes", select "Allow full access to all Cloud APIs", then click Save.
Restart the VM and delete ~/.gsutil.
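The same change can be scripted instead of clicked through — a sketch, assuming a reasonably recent gcloud and a hypothetical instance my-vm in zone us-central1-a:
# Scopes can only be changed while the instance is stopped
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances set-service-account my-vm --zone=us-central1-a --scopes=cloud-platform
gcloud compute instances start my-vm --zone=us-central1-a
rm -rf ~/.gsutil   # clear gsutil's cached credentials, as noted above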

I am writing this as an answer since I cannot post comments:
This error can also occur in some cases if you run the gsutil command with a sudo prefix (under sudo, gsutil runs as root and uses root's credential cache, e.g. /root/.gsutil, which may not be authorized).

After you have created the bucket, go to its permissions tab, add your email address, and set the Storage Admin permission.
Access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ

So I tried a bunch of things while trying to copy from a GCS bucket to my VM. Hope this post helps someone.
Via an SSH connection, I ran this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following the "Activating the API" section mentioned in this link:
https://cloud.google.com/storage/docs/json_api/
Once I had activated the API, I authenticated myself in the SSH window via
gcloud auth login
After the authentication procedure I was finally able to download from the Google Storage bucket to my VM.
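For what it's worth, the API can likely also be enabled from the command line instead of the console — a sketch, assuming the service name below (the legacy identifier for the GCS JSON API) still applies to your project:
gcloud services enable storage-json.googleapis.com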
PS: I did make sure to:
1. Make sure that gsutil is installed on my VM instance.
2. Go to my bucket's permissions tab and add the desired service accounts with the Storage Admin permission/role.
3. Make sure my VM had the proper Cloud API access scopes.

From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance, go to its edit page, find "Cloud API access scopes", and choose "storage full access" (or read/write, or whatever you need it for).
Changing the service account and access scopes for an instance: If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine default service account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.

Change the permissions of the bucket:
Add an entry for "allUsers" and give it "Storage Admin" access.

Related

Recovering access after initially provisioning wrong scopes for an instance

I recently created a VM, but mistakenly gave the default service account Storage: Read Only permissions instead of the intended Read Write under "Identity & API access", so GCS write operations from the VM are now failing.
I realized my mistake, so following the advice in this answer, I stopped the VM, changed the scope to Read Write and started the VM. However, when I SSH in, I'm still getting 403 errors when trying to create buckets.
$ gsutil mb gs://some-random-bucket
Creating gs://some-random-bucket/...
AccessDeniedException: 403 Insufficient OAuth2 scope to perform this operation.
Acceptable scopes: https://www.googleapis.com/auth/cloud-platform
How can I fix this? I'm using the default service account, and don't have the IAM permissions to be able to create new ones.
$ gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* (projectnum)-compute@developer.gserviceaccount.com
I suggest you try adding the "cloud-platform" scope to the instance by running the gcloud command below:
gcloud alpha compute instances set-scopes INSTANCE_NAME [--zone=ZONE] [--scopes=[SCOPE,…]] [--service-account=SERVICE_ACCOUNT]
As the scope, use "https://www.googleapis.com/auth/cloud-platform", since it gives full access to all Google Cloud Platform resources.
Here is the gcloud documentation.
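Filled in with placeholder values, that would look something like this (the instance name and zone are hypothetical, and the instance may need to be stopped first):
gcloud alpha compute instances set-scopes my-instance --zone=us-central1-a --scopes=https://www.googleapis.com/auth/cloud-platform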
Try creating the Google Cloud Storage bucket with your user account.
Type gcloud auth login and open the link you are provided; once there, copy the code and paste it into the command line.
Then do gsutil mb gs://bucket-name.
The security model has two things at play, API scopes and IAM permissions, and access is determined by the AND of them. So you need both an acceptable scope and sufficient IAM privileges to perform a given action.
API scopes are bound to the credentials. They are represented by a URL like https://www.googleapis.com/auth/cloud-platform.
IAM permissions are bound to the identity. These are set up in the Cloud Console's IAM & admin > IAM section.
This means you can have two VMs running as the same default service account but with different levels of access.
For simplicity, you generally want to just set the IAM permissions and use the cloud-platform API auth scope.
To check if you have this setup go to the VM in cloud console and you'll see something like:
Cloud API access scopes
Allow full access to all Cloud APIs
When you SSH into the VM, by default gcloud will be logged in as the service account on the VM. I'd discourage logging in as yourself; otherwise you more or less break gcloud's configuration to read the default service account.
Once you have this setup you should be able to use gsutil properly.
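To check both halves of the model from the VM itself, a sketch (PROJECT_ID and the service account member below are placeholders):
# 1. Scopes bound to the credentials, straight from the metadata server
curl -s -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
# 2. IAM roles bound to the identity
gcloud projects get-iam-policy PROJECT_ID --flatten="bindings[].members" --filter="bindings.members:serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" --format="value(bindings.role)"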

Cloud composer: "PERMISSION_DENIED: The caller does not have permission"

I implemented a few tasks with BashOperator. The ones with "gsutil rm" and "gsutil cp" worked fine, but one with "gcloud alpha firestore export" generates this error:
{bash_operator.py:101} INFO - ERROR: (gcloud.alpha.firestore.export) PERMISSION_DENIED: The caller does not have permission
This command itself works fine in the gcloud shell. I tried giving some Firestore-related permissions to the service account used by Composer, but it still doesn't work. Any ideas?
It might be that you don't have permissions for a particular project.
The error I was getting was: PERMISSION_DENIED: Caller does not have required permission to use project project:random-id-11111.
The way I resolved it was by running gcloud config set project 'the-right-project-id' and then the actual gcloud command.
I think you need Cloud Datastore Import Export access. Following are the steps as per the current Google Cloud Platform layout:
https://console.cloud.google.com > Left drawer > IAM & admin > Against user - Edit Icon > Add another role > Data Store > Cloud Datastore Import & Export > Save
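The same grant can be made from the command line — a sketch, assuming the role identifier is roles/datastore.importExportAdmin and using placeholder project and account names:
gcloud projects add-iam-policy-binding my-project --member="serviceAccount:my-composer-sa@my-project.iam.gserviceaccount.com" --role="roles/datastore.importExportAdmin"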
Try creating a new service account with the Firestore-related permissions needed and using that on a freshly created environment. https://cloud.google.com/composer/docs/how-to/access-control
Other debugging ideas:
- Try SSHing into the Kubernetes workers in your Composer environment and running the command.
- Is the Firestore API enabled on your project?
Open https://console.cloud.google.com/iam-admin/iam
Find the service account you're using for the backups
Add the Owner role to the service account
It's not really intuitive or logical, because there are no Firestore-specific permissions or roles.
Unfortunately it took me way too long to figure out. I hope it helps others!
Similar to Roy's answer, the issue for me was that gcloud was set to a different project.
Check which project it is set to:
gcloud config list
List which projects you have access to:
gcloud projects list
Set the correct project:
gcloud config set project 'foo-project'
Grant the Firebase Admin role to the default service account that your service is using.
Adding the Owner role to the service account seems like too much privilege for just taking a backup.
In IAM & Admin, make sure your @appspot.gserviceaccount.com account has access to these 3 things:
Cloud Functions Admin
Cloud Datastore Import Export Admin
Storage Admin
You need to first set the project where you are an owner:
gcloud config set project project-id
You can find your project ID in the Cloud Console project picker; it will appear there as project-name-somerandomnumbers.
I got caught out on this today. The issue was that I had set up my service account correctly in the IAM settings, but hadn't realised an invitation had been sent to that email address, which I needed to accept. It worked immediately once I accepted the invite.

Right way of using Google Storage on a GCE VM

I want to know the right/best way of having one machine copying data to Google Storage.
I need one machine to be able to write to a bucket, but not be able to create or delete other buckets.
While researching, I found out that you should create a service account so that account can log in to Google Cloud and then use Storage.
But the problem is that when the machine is a GCE instance, there are scopes. With the "Default" scope it can read from Google Storage, but cannot write to it, even after authenticating with a service account.
With the devstorage.read_write scope, the machine can create and remove buckets from that storage without logging in. I find that too risky.
Does anyone have any recommendations?
Thanks
The core problem here is that the "write" scope covers both write and delete, and that the GCE service account is likely a member of project-editors, which can create and delete buckets. It sounds like what you want to do is restrict a service account to only being able to affect a single bucket. You should be able to do this with these steps:
Create a service account in your project (and save the private key file).
In the permissions page for the project, make sure that service account is not a project editor for your project.
Using an account that does have full permissions to your project, create the bucket, then grant the service account write access to the bucket. Example gsutil commands to do this:
gsutil mb gs://yourbucket
gsutil acl ch -u your-service-account-name@gserviceaccount.com:W gs://yourbucket
Create a VM that does not have a GCE service account enabled.
Push the service account's private key file to that VM.
On the VM, gcloud auth activate-service-account --key-file=your-key-file.json
Now gsutil commands run on the VM should be able to write to (and delete) objects in that bucket, but not any other buckets in your project.
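A quick way to confirm the restriction behaves as intended from that VM (the bucket names are placeholders):
gsutil cp test.txt gs://yourbucket/   # should succeed: the account has W on this bucket
gsutil mb gs://some-other-bucket      # should fail with 403: the account is not a project editor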

gcloud installed on gce instance with service level accounts permission issues

I launched an instance with service-level accounts enabled. For example, it has storage-rw set. I verified that the instance has those. Now whenever I run gsutil ls gs://my_bucket from within the instance, I get the error: Failure: unauthorized_client.
gcloud auth list returns
Credentialed accounts:
- xxxx@developer.gserviceaccount.com (active)
I need to use the gcloud SDK from an instance because I need more components than just gcutil and gsutil.
So my question is: how can I authorize gcloud to use the xxxx@developer.gserviceaccount.com account, and thus only the permissions specified on the instance, rather than my personal user account, which has full permissions to everything?
The gcloud CLI definitely handles Google Compute Engine service accounts. If you see it as "(active)" when you do $ gcloud auth list, that should be sufficient.
Two things can be going wrong here:
You are using the wrong gsutil.
When you install the Google Cloud SDK, it will create google-cloud-sdk/bin/gsutil, and THAT is the one you want to run. Do $ which gsutil to double check. If you're running google-cloud-sdk/platform/gsutil/gsutil, that's the wrong one, and it won't know about anything that gcloud can tell it.
The account doesn't have permissions to access the bucket you're trying to inspect. You'll have to ask the owner of the bucket to add it to the project that owns that bucket.
Source: Engineer for the Google Cloud SDK
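A quick sanity check for the first point (the install path below is a placeholder for wherever your SDK lives):
which gsutil
# If it resolves to .../platform/gsutil/gsutil, put the SDK's bin directory first on PATH:
export PATH="$HOME/google-cloud-sdk/bin:$PATH"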
See "Authenticating to Google Compute Engine" section in this doc: https://developers.google.com/compute/docs/gcutil/

The gsutil tool is not working to register a channel in object change notification

When executing the following command:
gsutil notifyconfig watchbucket -i myapp-channel -t myapp-token https://myapp.appspot.com/gcsnotify gs://mybucket
I receive the following response, although I have used the same command before on other buckets and it worked:
Watching bucket gs://mybucket/ with application URL https://myapp.appspot.com/gcsnotify...
Failure: <HttpError 401 when requesting https://www.googleapis.com/storage/v1beta2/b/mybucket/o/watch?alt=json returned "Unauthorized WebHook callback channel: https://myapp.appspot.com/gcsnotify">.
I used gsutil config to set permissions, and also tried with gsutil config -e.
I already tried to set the permissions and made myself owner of the project, but it is not working. Any help?
I was getting the same error. You must configure gsutil to use a service account before you can watch a bucket.
An additional security requirement was recently added for Object Change Notification. You must add your endpoint domain as a trusted domain on your cloud project. To do that, the domain first has to be whitelisted with the Google Webmaster Tools.
See instructions here:
https://developers.google.com/storage/docs/object-change-notification#_Authorization
I also determined that I needed to:
Whitelist my appspot domain
Create a service account before I could watch a bucket.
At first I was using the Google Cloud Shell, and I figured it should just be authenticated. gsutil ls listed the objects in my bucket, so I assumed I was authenticated. However, that is not the case.
You need to install gsutil or the Google Cloud SDK, log in, get the .p12 file for the service account, and auth with it as Wind Up Toy described. After that it will work.
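For reference, the service-account flow the answers above describe looks roughly like this — a sketch, with the channel, token, endpoint, and bucket names taken from the question:
# Configure gsutil with the service account's email and .p12 key (interactive prompts)
gsutil config -e
# Then retry the watch request
gsutil notifyconfig watchbucket -i myapp-channel -t myapp-token https://myapp.appspot.com/gcsnotify gs://mybucket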