I created an IBM Cloud Object Storage instance and a bucket. Listing the buckets in the instance works, but when I try to read from or write to the bucket I get access denied.
Ravithejs-MacBook-Pro:~$ ibmcloud cos put-object --bucket hog-cli-bucket-name --key firstOne --body /Downloads/apikey.json
FAILED
Access to your IBM Cloud account was denied. Log in again by typing ibmcloud login --sso.
I tried logging in with both an API key and SSO, but I run into the same issue again.
Try logging in with just the API key:
ibmcloud login --apikey <replace with your api key>
Check the permissions for your username.
ibmcloud iam user-policies <user-name>
You should have the Manager role. You can check the bucket permissions as listed in the linked documentation.
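If the Manager role is missing, a policy like the following should grant it. This is a rough sketch; the user name and the Cloud Object Storage instance GUID are placeholders:
# sketch only: <user-name> and <cos-instance-guid> are placeholders
ibmcloud iam user-policy-create <user-name> --roles Manager \
    --service-name cloud-object-storage --service-instance <cos-instance-guid>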
Related
I need to view a Google Cloud Storage bucket with my service account, which has the Owner role.
I followed the advice in "How do I grant a specific permission to a Cloud IAM service account using the gcloud CLI?" and created a custom role called BucketViewer that has the storage.buckets.get permission, then assigned it to my service account on the project.
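Concretely, the role creation and binding look roughly like this (the project ID and service account email are placeholders):
# sketch only: my-project and my_service@account.com are placeholders
gcloud iam roles create BucketViewer --project my-project \
    --permissions storage.buckets.get
gcloud projects add-iam-policy-binding my-project \
    --member serviceAccount:my_service@account.com \
    --role projects/my-project/roles/BucketViewer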
Now, when I try to view the bucket's metadata (as described here: https://cloud.google.com/storage/docs/getting-bucket-information#get-bucket-size-cli):
gsutil ls -L -b gs://bucketname
It still returns the error:
my_service@account.com does not have storage.buckets.get access to the Google Cloud Storage bucket.
I'd be glad for any help.
I am trying to submit a build using this command in the gcloud CLI:
gcloud builds submit --config cloudbuild.yaml .
but it returns this error:
(gcloud.builds.submit) The user is forbidden from accessing the bucket
[fastapi-api_cloudbuild]. Please check your organization's policy or
if the user has the "serviceusage.services.use" permission
I checked in the IAM admin panel and it says that my account has the Owner role and most permissions, and I am authenticated in the gcloud CLI, but I still get this error. I haven't tried much else, since I have no idea what to try given that I already have the highest role. Thank you.
Have you set your project in the CLI with gcloud config set project <project>?
Otherwise, the answer from this thread (about billing) might help.
Or this one, about granting Storage Admin.
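For reference, granting Storage Admin at the project level looks roughly like this (the project ID and account are placeholders):
# sketch only: my-project and you@example.com are placeholders
gcloud projects add-iam-policy-binding my-project \
    --member user:you@example.com \
    --role roles/storage.admin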
I had a similar issue (caused by a change of the billing account), and I fixed it by disabling the Google Cloud Build API and re-enabling it.
I used these commands:
gcloud services disable cloudbuild.googleapis.com --project "my_project"
gcloud services enable cloudbuild.googleapis.com --project "my_project"
When I run this on the command line:
gcloud builds submit --tag "gcr.io/<project id>/<cloudrun app name>"
I get this error:
ERROR: (gcloud.builds.submit) HTTPError 403: <?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>[service account name]@[project-id].iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object.</Details></Error>
Here are the roles I've assigned to the service account (yes, it's overkill, I'm just trying to get it to work):
I've tried these solutions, but they haven't worked:
service account does not have storage.objects.get access for Google Cloud Storage
(gcloud.app.deploy) HTTPError 403: <account> does not have storage.objects.get access to the Google Cloud Storage object
What scopes / roles are required for a service account to be able to submit container builder jobs?
What am I doing wrong?
Hello, I had the same issue. I solved it by adding the "Viewer" role to my service account, as explained here: https://github.com/google-github-actions/setup-gcloud/issues/105
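For reference, the binding can be added like this (the project ID and service account email are placeholders):
# sketch only: my-project and the service account email are placeholders
gcloud projects add-iam-policy-binding my-project \
    --member serviceAccount:my-builder@my-project.iam.gserviceaccount.com \
    --role roles/viewer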
Could you please confirm that you are using the default service account to trigger your build? If you are using a different service account to trigger the build, grant it the same roles that your default service account has.
Make sure you have the following roles for the service account:
Cloud Build Service Account
Service Account User
Cloud Run Admin
You can change the permissions from the Cloud Build Settings page.
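They can also be granted from the CLI; a rough sketch, where the project ID, project number, and exact roles are placeholders:
# sketch only: my-project and 123456789 (project number) are placeholders
gcloud projects add-iam-policy-binding my-project \
    --member serviceAccount:123456789@cloudbuild.gserviceaccount.com \
    --role roles/iam.serviceAccountUser
gcloud projects add-iam-policy-binding my-project \
    --member serviceAccount:123456789@cloudbuild.gserviceaccount.com \
    --role roles/run.admin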
Then try running your builds again.
Have you tried creating a new service with a prebuilt demo container from the web console, as described here?
We got the same error ("... does not have storage.objects.get access ...") initially, but it worked once we created a first demo service using the Google Cloud Console.
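From the CLI, a roughly equivalent first deployment would look something like this (the service name and region are placeholders; gcr.io/cloudrun/hello is Google's public sample image):
# sketch only: hello-demo and us-central1 are placeholders
gcloud run deploy hello-demo \
    --image gcr.io/cloudrun/hello \
    --region us-central1 \
    --allow-unauthenticated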
I am attempting to submit a job for training in ML Engine using gcloud, but am running into an error with service account permissions that I can't figure out. The model code lives on a Compute Engine instance from which I run gcloud ml-engine jobs submit as part of a bash script. I have created a service account (ai-platform-developer@....iam.gserviceaccount.com) for gcloud authentication on the VM instance and have created a bucket for the job and model data. The service account has been granted the Storage Object Viewer and Storage Object Creator roles on the bucket, and the VM and bucket belong to the same project.
When I try to submit a job per this tutorial, the following commands are executed:
time_stamp=`date +"%Y%m%d_%H%M"`
job_name='ObjectDetection_'${time_stamp}
gsutil cp object_detection/samples/configs/faster_rcnn_resnet50.config \
    gs://[bucket-name]/training_configs/faster-rcnn-resnet50.${job_name}.config
gcloud ml-engine jobs submit training ${job_name} \
--project [project-name] \
--runtime-version 1.12 \
--job-dir=gs://[bucket-name]/jobs/${job_name} \
--packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz,/tmp/pycocotools/pycocotools-2.0.tar.gz \
--module-name object_detection.model_main \
--region us-central1 \
--config object_detection/training-config.yml \
-- \
--model_dir=gs://[bucket-name]/output/${job_name} \
--pipeline_config_path=gs://[bucket-name]/training_configs/faster-rcnn-resnet50.${job_name}.config
where [bucket-name] and [project-name] are placeholders for the bucket created above and the project that contains both it and the VM.
The config file is successfully uploaded to the bucket, I can confirm it exists in the cloud console. However, the job fails to submit with the following error:
ERROR: (gcloud.ml-engine.jobs.submit.training) User [ai-platform-developer@....iam.gserviceaccount.com] does not have permission to access project [project-name] (or it may not exist): Field: job_dir Error: You don't have the permission to access the provided directory 'gs://[bucket-name]/jobs/ObjectDetection_20190709_2001'
- '#type': type.googleapis.com/google.rpc.BadRequest
fieldViolations:
- description: You don't have the permission to access the provided directory 'gs://[bucket-name]/jobs/ObjectDetection_20190709_2001'
field: job_dir
If I look in the cloud console, the files specified by --packages exist in that location, and I've ensured the service account ai-platform-developer@....iam.gserviceaccount.com has been given the Storage Object Viewer and Storage Object Creator roles on the bucket, which has bucket-level permissions set. After ensuring the service account is activated and set as the default, I can also run
gsutil ls gs://[bucket-name]/jobs/ObjectDetection_20190709_2001
which successfully returns the contents of the folder without a permission error. In the project, there is a managed service account service-[project-number]@cloud-ml.google.com.iam.gserviceaccount.com, and I have also granted this account the Storage Object Viewer and Storage Object Creator roles on the bucket.
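(For reference, the bucket-level bindings can be double-checked with gsutil; [bucket-name] is the same placeholder as above.)
gsutil iam get gs://[bucket-name]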
To confirm that this VM is able to submit a job, I can switch the gcloud user to my personal account, and the script runs and submits a job without any error. However, since this is a shared VM, I would like to rely on service account authorization instead of my own user account.
I had a similar problem with exactly the same error.
I found that the easiest way to troubleshoot these errors is to go to "Logging" and search for the text "PERMISSION DENIED".
In my case the service account was missing the "storage.buckets.get" permission. You then need to find a role that has this permission; you can do that from IAM -> Roles, where you can filter roles by permission name. It turned out that only the following roles have the needed permission:
Storage Admin
Storage Legacy Bucket Owner
Storage Legacy Bucket Reader
Storage Legacy Bucket Writer
I added "Storage Legacy Bucket Writer" role to the service account in the bucket and then was able to submit a job.
Have you tried looking at the Compute Engine access scopes?
Shut down the instance, edit it, and change Cloud API access scopes to:
Allow full access to all Cloud APIs
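The same change can be made from the CLI while the instance is stopped; a rough sketch, where the instance name, zone, and service account email are placeholders:
# sketch only: my-instance, us-central1-a, and the service account email are placeholders
gcloud compute instances set-service-account my-instance \
    --zone us-central1-a \
    --service-account <current-service-account-email> \
    --scopes https://www.googleapis.com/auth/cloud-platform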
Anyone successfully using gcsfuse?
I've tried removing all default permissions on the bucket and setting up a service account:
I ran gcloud auth activate-service-account to activate serviceaccname, and then ran:
gcsfuse --debug_gcs --foreground cloudbuckethere /backup
gcs: Req 0x0: -> ListObjects() (307.880239ms): googleapi: Error 403: xxxxxx-compute@developer.gserviceaccount.com does not have storage.objects.list access
It's weird that it's complaining about the user xxxxx-compute, which is not my activated service account:
gcloud auth list
does show that my current service account is active...
I've also granted admin owner, admin object owner, object write, and object read permissions on the bucket to my serviceaccname.
If I grant xxxxx-compute all the permissions on my bucket, including the legacy permissions, listing seems to work, but writing any file to the directory fails with:
googleapi: Error 403: Insufficient Permission, insufficientPermissions
Anyone have any luck?
I found a solution; I'm not sure it's a good one, but it works.
Set up a service account and download its JSON key file.
Grant that service account access to the bucket as a bucket admin.
Then set an environment variable pointing to the JSON key file and run gcsfuse:
GOOGLE_APPLICATION_CREDENTIALS=/path-to-json/gcloud.json gcsfuse --debug_gcs --foreground bucketname /path-to-mount
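For the bucket grant mentioned above, something like this gsutil command should work (the service account email is a placeholder; cloudbuckethere is the bucket from the question):
# sketch only: the service account email is a placeholder
gsutil iam ch \
    serviceAccount:serviceaccname@my-project.iam.gserviceaccount.com:roles/storage.admin \
    gs://cloudbuckethere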
Also note that by default it may use a large amount of space in the tmp directory. Adding the flag:
... --temp-dir=/someotherpath
will really help if you have limited space in /tmp.