I am getting the following error when I attempt to access Google Cloud Storage:
GSResponseError: GSResponseError: 403 Forbidden
<?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>Missing project id</Details></Error>
I have the correct project id specified in my .boto config file, and I have read/write access to the bucket I'm trying to access. Any ideas as to what might be causing this?
You may need to run gsutil config and give it the name of your project; that will auto-generate your ~/.boto file.
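As a sketch of that flow (paths are the defaults gsutil uses; your project ID is whatever you entered during setup):

```shell
# Interactively generate ~/.boto; gsutil will prompt for your project ID.
gsutil config

# Verify the project ID was written to the [GSUtil] section of the file.
grep default_project_id ~/.boto
```

If the grep shows no default_project_id line, requests that require a project (like listing or creating buckets) can fail with the "Missing project id" 403 above.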
I migrated my website's storage to Google Cloud Storage, but the migration/upload failed for some files for various reasons. By default, if the path/object doesn't exist in the GCS bucket, it returns XML like the following:
<Error>
<Code>NoSuchKey</Code>
<Message>The specified key does not exist.</Message>
<Details>No such object: bucket/the_file.png</Details>
</Error>
Can I change/customize that response to return another file/image?
If your static website is served from a Cloud Storage bucket as described in this document, you need to upload a public file named 404.html to your bucket; this file must contain the HTML you want displayed when an object doesn't exist.
After uploading 404.html, you must run this command in Cloud Shell to define the error page:
gsutil web set -e 404.html gs://www.example.com
On this page you can find more information about the 404.html file.
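Putting the steps together (a sketch; gs://www.example.com is the bucket name from the command above, and index.html is an assumed main page):

```shell
# Upload the custom error page to the website bucket.
gsutil cp 404.html gs://www.example.com/404.html

# Make it publicly readable so visitors can see it.
gsutil acl ch -u AllUsers:R gs://www.example.com/404.html

# Set the bucket's website configuration: main page and error page.
gsutil web set -m index.html -e 404.html gs://www.example.com

# Inspect the resulting website configuration.
gsutil web get gs://www.example.com
```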
I have been running a batch file to pull files from a Google bucket; it was created by someone else and had been working in the past. However, now I'm getting an error message stating:
"AccessDeniedException: 403 tim@gmail.com does not have storage.objects.list access to dcct_-dcm_account870"
What can I do to resolve it?
I just found the solution to this issue.
I noticed that ****@gmail.com had left the company, and I had to reconfigure gsutil to give access to myself, using the previous answer linked below:
gsutil cors set command returns 403 AccessDeniedException
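In outline, the fix has two parts: re-authenticate gsutil as your own account, and make sure that account has list access on the bucket. A sketch (you@example.com is a placeholder for your own address, and the grant must be run by someone with admin rights on the bucket):

```shell
# Re-run OAuth so gsutil uses your credentials, not the departed user's.
gcloud auth login

# Grant your account permission to list and read objects in the bucket
# named in the error message.
gsutil iam ch user:you@example.com:objectViewer gs://dcct_-dcm_account870

# Confirm listing now works.
gsutil ls gs://dcct_-dcm_account870
```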
I am trying to create a bucket using the gsutil invocation that the Kubernetes setup scripts use.
Below is the response -
$ gsutil mb -c nearline -p kubetest gs://some-bucket
Creating gs://some-bucket/...
AccessDeniedException: 403 hello.user@gmail.com does not have storage.buckets.create access to bucket some-bucket.
I tried the above because my attempt to bring up Kubernetes on bare metal failed with the exception below.
$ cluster/kube-up.sh
... Starting cluster in us-central1-b using provider gce
... calling verify-prereqs
... calling verify-kube-binaries
... calling kube-up
Project: kubetest
Network Project: kubetest
Zone: us-central1-b
BucketNotFoundException: 404 gs://kubernetes-staging-9e9580a659 bucket does not exist.
Creating gs://kubernetes-staging-9e9580a659
Creating gs://kubernetes-staging-9e9580a659/...
AccessDeniedException: 403 hello.user@gmail.com does not have storage.buckets.create access to bucket kubernetes-staging-9e9580a659.
How can I resolve this error and give access to the user?
Go to Cloud Shell and use the command: gsutil config -b
The gsutil config command obtains access credentials for Google Cloud Storage and writes a boto/gsutil configuration file containing those credentials, along with a number of other configurable values. The -b flag causes gsutil config to launch a browser to obtain OAuth2 approval.
It prints a URL; open it and hit Allow.
In your browser you should see a page that requests you to authorize access to Google Cloud Platform APIs and Services on your behalf. After you approve, an authorization code will be displayed.
Copy the verification code, paste it into the terminal, and hit Enter.
This should resolve the 403.
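The whole sequence, as a sketch (kubetest is the project name from the question above):

```shell
# Launch browser-based OAuth; paste the verification code back into
# the terminal when prompted.
gsutil config -b

# Confirm the new credentials work by listing buckets in the project.
gsutil ls -p kubetest
```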
I'm trying to create a bucket using GCP Deployment Manager. I already went through the QuickStart guide and was able to create a compute.v1.instance, but when I try to create a bucket in Google Cloud Storage, I'm unable to get anything other than 403 Forbidden.
This is what my template file looks like.
resources:
- type: storage.v1.bucket
  name: test-bucket
  properties:
    project: my-project
    name: test-bucket-name
This is what I'm calling
gcloud deployment-manager deployments create deploy-test --config deploy.yml
And this is what I'm receiving back
Waiting for create operation-1474738357403-53d4447edfd79-eed73ce7-cabd72fd...failed.
ERROR: (gcloud.deployment-manager.deployments.create) Error in Operation operation-1474738357403-53d4447edfd79-eed73ce7-cabd72fd: <ErrorValue
errors: [<ErrorsValueListEntry
code: u'RESOURCE_ERROR'
location: u'deploy-test/test-bucket'
message: u'Unexpected response from resource of type storage.v1.bucket: 403 {"code":403,"errors":[{"domain":"global","message":"Forbidden","reason":"forbidden"}],"message":"Forbidden","statusMessage":"Forbidden","requestPath":"https://www.googleapis.com/storage/v1/b/test-bucket"}'>]>
I have credentials set up, and I even created an account-owner set of credentials (which can access everything), and I'm still getting this response.
Any ideas or good places to look? Is it my config or do I need to pass additional credentials in my request?
I'm coming from an AWS background, still finding my way around GCP.
Thanks
Bucket names on Google Cloud Platform need to be globally unique.
If you try to create a bucket with a name that is already used by somebody else (in another project), you will receive an error message. I would test by creating a new bucket with another name.
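Because bucket names share one global namespace, a common workaround is to derive the name from a fixed prefix plus a random suffix so collisions with other projects are effectively impossible. A minimal sketch (the prefix is a placeholder; bucket names must be lowercase and at most 63 characters):

```python
import uuid

def unique_bucket_name(prefix: str) -> str:
    """Append a random hex suffix so the name is unlikely to collide globally."""
    suffix = uuid.uuid4().hex[:10]  # 10 hex characters of randomness
    name = f"{prefix}-{suffix}"
    return name[:63]  # stay within the GCS length limit

# e.g. unique_bucket_name("my-project-staging")
```

The Kubernetes staging bucket in the earlier question (kubernetes-staging-9e9580a659) follows this same pattern.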
gsutil -m acl -r set public-read gs://my_bucket/
command gives an AccessDeniedException: 403 Forbidden error, even though I granted full owner access on my_bucket to my email address. I am using the Blobstore API to upload files in my project. How can I solve this problem?
You probably need to set up Cloud API access for your virtual machine. Currently it needs to be set during VM creation process by enabling:
Allow full access to all Cloud APIs
To provide access for a VM when you haven't chosen the above setting, you need to recreate the instance with full access, but there is a pending improvement:
Google Cloud Platform Ability to change API access scopes
Once that lands, we will be able to change this setting after shutting down the instance.
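A sketch of recreating the instance with full API access from the command line (instance name and zone are placeholders):

```shell
# Create a VM whose service account has the broad cloud-platform scope,
# so gsutil running on the VM can modify bucket ACLs.
gcloud compute instances create my-instance \
    --zone us-central1-b \
    --scopes cloud-platform
```

The equivalent in the Cloud Console is choosing "Allow full access to all Cloud APIs" under Access scopes when creating the VM.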