I cannot stop a channel; it throws an exception: AccessDeniedException: 403 Caller not owner of subscription.
I have already set the permission in the Google Developers Console with the Owner role, but it still returns the exception.
gsutil version: 4.6
You need to use the same service account that was used to create the channel; otherwise you get this error.
You can use the command gcloud auth list to view the accounts available on the instance, and gcloud config set account ACCOUNT to set the service account as active.
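For example, if the channel was created with gsutil notification watchbucket, the flow might look like this (the account, channel ID, and resource ID are placeholders):

gcloud auth list
gcloud config set account 1234567890-compute@developer.gserviceaccount.com
gsutil notification stopchannel channel-id resource-id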
I need to view a Cloud Storage bucket with my service account, which has the Owner role.
I followed the advice here (How do I grant a specific permission to a Cloud IAM service account using the gcloud CLI?) on creating a custom role called BucketViewer that has the storage.buckets.get permission, and assigning it to my service account on the project.
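For reference, the commands for that setup look roughly like this (my-project and the service account address are placeholders):

gcloud iam roles create BucketViewer --project=my-project --permissions=storage.buckets.get
gcloud projects add-iam-policy-binding my-project --member=serviceAccount:my_service@account.com --role=projects/my-project/roles/BucketViewer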
Now, when I try to view the bucket's metadata (as described here: https://cloud.google.com/storage/docs/getting-bucket-information#get-bucket-size-cli):
gsutil ls -L -b gs://bucketname
It still returns the error:
my_service@account.com does not have storage.buckets.get access to the Google Cloud Storage bucket.
I would be glad for any help.
I am trying to submit a build using this command in the gcloud CLI:
gcloud builds submit --config cloudbuild.yaml .
but it returns this error:
(gcloud.builds.submit) The user is forbidden from accessing the bucket
[fastapi-api_cloudbuild]. Please check your organization's policy or
if the user has the "serviceusage.services.use" permission
I checked in the IAM admin panel and it says that my account has the Owner role and most permissions, and I am authenticated in the gcloud CLI, but I still get this error. I haven't tried much else, since I have no idea what to try given that I already have the highest role. Thank you.
Have you set your project in the CLI with gcloud config set project <project>?
Otherwise the answer from this thread might help (billing).
Or this one, which suggests the Storage Admin role.
I had a similar issue (caused by a change of the billing account), and I fixed it by disabling the Google Cloud Build API and re-enabling it.
I used these commands:
gcloud services disable cloudbuild.googleapis.com --project "my_project"
gcloud services enable cloudbuild.googleapis.com --project "my_project"
When I run this on the command line:
gcloud builds submit --tag "gcr.io/<project id>/<cloudrun app name>"
I get this error:
ERROR: (gcloud.builds.submit) HTTPError 403: <?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>[service account name]@[project-id].iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object.</Details></Error>
Here are the roles I've assigned to the service account (yes, it's overkill, I'm just trying to get it to work):
I've tried these solutions, but they haven't worked:
service account does not have storage.objects.get access for Google Cloud Storage
(gcloud.app.deploy) HTTPError 403: <account> does not have storage.objects.get access to the Google Cloud Storage object
What scopes / roles are required for a service account to be able to submit container builder jobs?
What am I doing wrong?
Hello, I had the same issue. I solved it by adding the "Viewer" role to my service account, as explained here: https://github.com/google-github-actions/setup-gcloud/issues/105
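If you want to do the same from the CLI, a binding along these lines should work (the project and service account names are placeholders):

gcloud projects add-iam-policy-binding my-project --member=serviceAccount:my-builder@my-project.iam.gserviceaccount.com --role=roles/viewer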
Could you please confirm that you are using the default service account to trigger your build? If you are using a different service account to trigger the build, grant it the same roles that your default service account has.
Make sure you have the following roles for the service account:
Cloud Build Service Account
Service Account User
Cloud Run Admin
You can change the permissions from the Cloud Build Settings page.
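If you prefer the gcloud CLI over the settings page, the equivalent bindings should look roughly like this (the project and service account are placeholders; the role IDs correspond to the names above):

gcloud projects add-iam-policy-binding my-project --member=serviceAccount:123456789@cloudbuild.gserviceaccount.com --role=roles/cloudbuild.builds.builder
gcloud projects add-iam-policy-binding my-project --member=serviceAccount:123456789@cloudbuild.gserviceaccount.com --role=roles/iam.serviceAccountUser
gcloud projects add-iam-policy-binding my-project --member=serviceAccount:123456789@cloudbuild.gserviceaccount.com --role=roles/run.admin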
Then try running your builds again.
Have you tried creating a new service with a prebuilt demo container from the web console, as described here?
We got the same error ("... does not have storage.objects.get access ...") initially, but it worked once we created a first demo service using the Google Cloud Console.
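If you'd rather try that from the CLI, deploying Google's public demo image should have the same effect (the service name and region here are just examples):

gcloud run deploy hello --image=gcr.io/cloudrun/hello --region=us-central1 --allow-unauthenticated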
I am logged in to a GCE instance via SSH. From there I would like to access Cloud Storage using a service account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket I would like it to copy a file to:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to look for is to make sure you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default Compute Engine service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. If you use a user-created service account this is typically not a problem, since user-created service accounts get full scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials that don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
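As a sketch, with a hypothetical VM name and zone (adjust the scope alias to what you need):

# Inspect the scopes currently attached to the instance:
gcloud compute instances describe my-vm --zone=us-central1-a --format="yaml(serviceAccounts)"
# Change the scopes; the instance must be stopped first:
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances set-service-account my-vm --zone=us-central1-a --scopes=storage-rw
gcloud compute instances start my-vm --zone=us-central1-a
# Then, on the VM, clear gsutil's cached credentials:
rm -rf ~/.gsutil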
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then browse to the URL it provides, click Allow, and copy the verification code back into the terminal.
Stop the VM.
Go to VM instance details.
Under "Cloud API access scopes", select "Allow full access to all Cloud APIs".
Click "Save".
Restart the VM and delete ~/.gsutil.
I have written an answer to this question since I cannot post comments:
In some cases, this error can also occur if you're running the gsutil command with a sudo prefix.
After you have created the bucket, go to the Permissions tab, add your email address, and grant the Storage Admin role.
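The CLI equivalent, if you prefer it (the email and bucket name are placeholders):

gsutil iam ch user:you@example.com:roles/storage.admin gs://mybucket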
Access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
So I tried a bunch of things while trying to copy from a GCS bucket to my VM.
Hope this post helps someone.
Via an SSH connection, running this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following "Activating the API" section mentioned in this link -
https://cloud.google.com/storage/docs/json_api/
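I believe the same can now be done from the CLI; the service name below is my best guess for the JSON API, so check gcloud services list --available if it doesn't match:

gcloud services enable storage-api.googleapis.com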
Once I had activated the API, I authenticated myself in the SSH window via
gcloud auth login
After the authentication procedure, I was finally able to download from the Google Storage bucket to my VM.
PS: I did make sure to:
1. Make sure that gsutil is installed on my VM instance.
2. Go to my bucket, go to the Permissions tab, and add the desired service accounts with the Storage Admin role.
3. Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance, go to the edit page, go to "Cloud API access scopes", and choose storage full access or read/write, whichever you need.
Changing the service account and access scopes for an instance

If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine Default Service Account.

To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
Change the permissions of the bucket: add an entry for "allUsers" and give it "Storage Admin" access. (Note that this makes the bucket accessible to everyone, so only do this for testing or truly public data.)
I launched an instance with service-level accounts enabled; for example, it has storage-rw set. I verified that the instance has those scopes. Now, whenever I run gsutil ls gs://my_bucket from within the instance, I get the error: Failure: unauthorized_client.
gcloud auth list returns
Credentialed accounts:
- xxxx@developer.gserviceaccount.com (active)
I need to use the Google Cloud SDK from the instance because I need more components than just gcutil and gsutil.
So my question is: how can I authorize gcloud to use the xxxx@developer.gserviceaccount.com account, and thus only the permissions specified on the instance, rather than my personal user account, which has full permissions to everything?
The gcloud CLI definitely handles Google Compute Engine service accounts. If you see it as "(active)" when you do $ gcloud auth list, that should be sufficient.
Two things can be going wrong here:
1. You are using the wrong gsutil.
When you install the Google Cloud SDK, it will create google-cloud-sdk/bin/gsutil, and THAT is the one you want to run. Do $ which gsutil to double-check (see the sketch after this list). If you're running google-cloud-sdk/platform/gsutil/gsutil, that's the wrong one, and it won't know about anything that gcloud can tell it.
2. The account doesn't have permissions to access the bucket you're trying to inspect. You'll have to ask the owner of the bucket to add it to the project that owns that bucket.
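A quick sanity check for case 1 (the exact path depends on where the SDK is installed):

which gsutil
# Expect a path ending in google-cloud-sdk/bin/gsutil,
# not .../google-cloud-sdk/platform/gsutil/gsutil.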
Source: Engineer for the Google Cloud SDK
See "Authenticating to Google Compute Engine" section in this doc: https://developers.google.com/compute/docs/gcutil/