gcloud builds submit: <service account> does not have storage.objects.get access to the Google Cloud Storage object

When I run this in cmd line:
gcloud builds submit --tag "gcr.io/<project id>/<cloudrun app name>"
I get this error:
ERROR: (gcloud.builds.submit) HTTPError 403: <?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>[service account name]@[project-id].iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object.</Details></Error>
Here are the roles I've assigned to the service account (yes, it's overkill, just trying to get it to work):
I've tried these solutions, but they haven't worked:
service account does not have storage.objects.get access for Google Cloud Storage
(gcloud.app.deploy) HTTPError 403: <account> does not have storage.objects.get access to the Google Cloud Storage object
What scopes / roles are required for a service account to be able to submit container builder jobs?
What am I doing wrong?

Hello, I had the same issue. I solved it by adding the role "Viewer" to my service account, as explained here: https://github.com/google-github-actions/setup-gcloud/issues/105
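For reference, the same role can also be granted from the command line. A minimal sketch, assuming a placeholder project ID my-project and a placeholder service account builder@my-project.iam.gserviceaccount.com (substitute your own values):
gcloud projects add-iam-policy-binding my-project --member="serviceAccount:builder@my-project.iam.gserviceaccount.com" --role="roles/viewer"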

Could you please confirm that you are using the default service account to trigger your build? If you are using a different service account to trigger the build, grant it the same roles that the default service account has.
Make sure you have the following roles for the service account:
Cloud Build Service Account
Service Account User
Cloud Run Admin
You can change the permissions from the Cloud Build Settings page, or from the command line as sketched below.
Then try running your builds again.
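If you prefer the command line to the Cloud Build Settings page, the grants can be expressed roughly like this. The project ID and service account email are placeholders, and the role IDs shown are the usual ones for the roles listed above; verify them against your Console before running:
gcloud projects add-iam-policy-binding my-project --member="serviceAccount:builder@my-project.iam.gserviceaccount.com" --role="roles/cloudbuild.builds.builder"
gcloud projects add-iam-policy-binding my-project --member="serviceAccount:builder@my-project.iam.gserviceaccount.com" --role="roles/iam.serviceAccountUser"
gcloud projects add-iam-policy-binding my-project --member="serviceAccount:builder@my-project.iam.gserviceaccount.com" --role="roles/run.admin"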

Have you tried creating a new service with a prebuilt demo container from the web console, as described here?
We got the same error ("... does not have storage.objects.get access ...") initially, but it worked once we created a first demo service using the Google Cloud Console.

Related

Permission error: service account doesn't have access to cloud-ml platform

I am running a Kubeflow pipeline (Docker approach) and the cluster uses an endpoint to navigate to the dashboard. The cluster was created by following the instructions in this link: Deploy Kubeflow. Everything was created successfully, the cluster generated the endpoint, and it is working perfectly.
Endpoint link would be something like this https://appname.endpoints.projectname.cloud.goog.
Every workload of the pipeline is working fine except the last one. In the last workload, I am trying to submit a job to the Cloud ML Engine, but the logs show that the application has no access to the project. Here is the full image of the log.
ERROR: (gcloud.ml-engine.versions.create) PERMISSION_DENIED: Request had insufficient authentication scopes.
ERROR: (gcloud.ml-engine.jobs.submit.prediction) User [clustername@project_name.iam.gserviceaccount.com] does not have permission to access project [project_name] (or it may not exist): Request had insufficient authentication scopes.
From the logs, it's clear that this service account doesn't have access to the project itself. However, I tried to grant this service account access to the Cloud ML service, but it still throws the same error.
Are there any other ways to give Cloud ML service credentials to this application?
Check two things (a command-line sketch for the first check follows below):
1) GCP IAM: whether clustername-user@projectname.iam.gserviceaccount.com has the ML Engine Admin role.
2) Your pipeline DSL: whether the cloud-ml engine step calls apply(gcp.use_gcp_secret('user-gcp-sa')), e.g. https://github.com/kubeflow/pipelines/blob/ea07b33b8e7173a05138d9dbbd7e1ce20c959db3/samples/tfx/taxi-cab-classification-pipeline.py#L67
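For the first check, something along these lines can be used. The project and service account names are placeholders taken from the error above, and roles/ml.admin is the usual ID for the ML Engine Admin role, so double-check it in your Console:
gcloud projects get-iam-policy projectname --flatten="bindings[].members" --format="table(bindings.role)" --filter="bindings.members:clustername-user@projectname.iam.gserviceaccount.com"
gcloud projects add-iam-policy-binding projectname --member="serviceAccount:clustername-user@projectname.iam.gserviceaccount.com" --role="roles/ml.admin"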

Recovering access after initially provisioning wrong scopes for an instance

I recently created a VM, but mistakenly gave the default service account Storage: Read Only permissions instead of the intended Read Write under "Identity & API access", so GCS write operations from the VM are now failing.
I realized my mistake, so following the advice in this answer, I stopped the VM, changed the scope to Read Write and started the VM. However, when I SSH in, I'm still getting 403 errors when trying to create buckets.
$ gsutil mb gs://some-random-bucket
Creating gs://some-random-bucket/...
AccessDeniedException: 403 Insufficient OAuth2 scope to perform this operation.
Acceptable scopes: https://www.googleapis.com/auth/cloud-platform
How can I fix this? I'm using the default service account, and don't have the IAM permissions to be able to create new ones.
$ gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* (projectnum)-compute@developer.gserviceaccount.com
I suggest you try adding the "cloud-platform" scope to the instance by running the gcloud command below:
gcloud alpha compute instances set-scopes INSTANCE_NAME [--zone=ZONE] [--scopes=[SCOPE,...]] [--service-account=SERVICE_ACCOUNT]
For the scope, use "https://www.googleapis.com/auth/cloud-platform", since it gives full access to all Google Cloud Platform resources.
Here is the gcloud documentation. A concrete example follows below.
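A concrete invocation would look roughly like the following sketch. The instance name and zone are placeholders, the instance typically has to be stopped first, and set-scopes may not exist in every gcloud release (if it doesn't, gcloud compute instances set-service-account accepts the same --scopes flag):
gcloud compute instances stop my-instance --zone=us-central1-a
gcloud alpha compute instances set-scopes my-instance --zone=us-central1-a --scopes=https://www.googleapis.com/auth/cloud-platform
gcloud compute instances start my-instance --zone=us-central1-a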
Try creating the Google Cloud Storage bucket with your user account.
Type gcloud auth login and open the link you are given; once there, copy the code and paste it into the command line.
Then do gsutil mb gs://bucket-name.
The security model has two things at play: API scopes and IAM permissions. Access is determined by the AND of them, so you need both an acceptable scope and sufficient IAM privileges to perform a given action.
API scopes are bound to the credentials. They are represented by a URL such as https://www.googleapis.com/auth/cloud-platform.
IAM permissions are bound to the identity. These are setup in the Cloud Console's IAM & admin > IAM section.
This means you can have 2 VMs with the default service account but both have different levels of access.
For simplicity you generally want to just set the IAM permissions and use the cloud-platform API auth scope.
To check whether you have this set up, go to the VM in the Cloud Console; you'll see something like:
Cloud API access scopes
Allow full access to all Cloud APIs
When you SSH into the VM, by default gcloud will be logged in as the service account on the VM. I'd discourage logging in as yourself there; otherwise you more or less break gcloud's configuration for reading the default service account.
Once you have this set up you should be able to use gsutil properly. A quick way to check both the scopes and the IAM side from the command line is sketched below.
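To check both halves of that AND, something like the following sketch can help. The instance name, zone, project ID, and service account email are placeholders:
# Scopes attached to the VM's credentials
gcloud compute instances describe my-instance --zone=us-central1-a --format="yaml(serviceAccounts)"
# IAM roles granted to the VM's service account
gcloud projects get-iam-policy my-project --flatten="bindings[].members" --format="table(bindings.role)" --filter="bindings.members:1234567890-compute@developer.gserviceaccount.com"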

Google Cloud (GCE) doesn't have access to default service account while creating VM instance

Looks like there's a bug.
Following official documentation:
https://cloud.google.com/compute/docs/instances/create-start-instance
After
gcloud compute instances create test-2
Received:
ERROR: (gcloud.compute.instances.create) Could not fetch resource:
- The resource '1045904521672-compute@developer.gserviceaccount.com' of type 'serviceAccount' was not found.
I am authorized correctly; my role is set to Owner.
> gcloud auth list
returns
Credentialed Accounts
ACTIVE ACCOUNT
* **@gmail.com
To set the active account, run:
$ gcloud config set account `ACCOUNT`
However, an instance can be created with a custom service account.
Any ideas?
Thank you in advance.
This seems similar to: Unable to create cluster on Dataproc after deleting default service account
Perhaps the answers there can help you out.

Service Account Authentication fails with gsutil for DCM CS bucket (Google-owned API Console Project)

I've done extensive research but I can't find a solution.
How can I enable service account authentication for a project that is linked with Google's privately owned bucket for DoubleClick Manager data? (More info on the current setup of this project here: https://support.google.com/dcm/partner/answer/2941575?hl=en&ref_topic=6107456&rd=1.)
Separate user authentication works with gsutil (navigate to the browser -> get a token -> paste it back into your cmd -> issue commands), but when it comes to configuring a service account I keep getting
AccessDeniedException: 403 Forbidden
What am I missing? Since the Google documentation says that this specific bucket can't be listed under Cloud Storage for that project, the project and the service account should be linked to that bucket by default, so I can't see the issue here.
During set-up you should have created a Google Group to control access to your bucket. You should add the service account email address to that group, and it will then be able to access the bucket.

gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access Cloud Storage with the help of a service account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket I would like it to copy a file to:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to look for is to make sure you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default Compute Engine service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. If you use a user-created service account this is not typically a problem, since user-created service accounts get full scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.) A command-line version of this fix is sketched below.
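A rough command-line version of that fix, assuming a placeholder instance my-instance in zone us-central1-a and that read/write storage access is enough for your use case:
gcloud compute instances stop my-instance --zone=us-central1-a
gcloud compute instances set-service-account my-instance --zone=us-central1-a --scopes=storage-rw
gcloud compute instances start my-instance --zone=us-central1-a
# Then, inside the VM, drop gsutil's cached credentials
rm -rf ~/.gsutil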
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then surf to the URL it provides,
[ CLICK Allow ]
Then copy the verification code and paste to terminal.
Stop the VM.
Go to VM instance details.
In "Cloud API access scopes" select "Allow full access to all Cloud APIs", then click "Save".
Restart the VM and delete ~/.gsutil.
I have written an answer to this question since I cannot post comments:
This error can also occur if you're running the gsutil command with a sudo prefix in some cases.
After you have created the bucket, go to the Permissions tab, add your email, and grant the Storage Admin role (a command-line equivalent is sketched below).
Access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
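A command-line equivalent of that permissions-tab step, using placeholder names for the account and bucket (gsutil iam ch also accepts serviceAccount: members if you want to grant the role to a service account instead):
gsutil iam ch user:you@example.com:roles/storage.admin gs://your-bucket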
So I tried a bunch of things trying to copy from a GCS bucket to my VM.
Hope this post helps someone.
Via an SSH connection, following this script:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following "Activating the API" section mentioned in this link -
https://cloud.google.com/storage/docs/json_api/
Once I activated the API, I authenticated myself in the SSH window via
gcloud auth login
Following the authentication procedure, I was finally able to download from the Google Storage bucket to my VM.
PS
I did make sure to:
1. Make sure gsutil is installed on my VM instance.
2. Go to my bucket's Permissions tab, add the desired service accounts, and set the Storage Admin permission/role.
3. Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance -> go to the edit page -> go to "Cloud API access scopes" and choose "storage full access or read/write or whatever you need it for".
Changing the service account and access scopes for an instance
If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine default service account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
Change the permissions of the bucket.
Add a member for "allUsers" and give it "Storage Admin" access.