gsutil is not working to register a channel for Object Change Notification - google-cloud-storage

When executing the following command:
gsutil notifyconfig watchbucket -i myapp-channel -t myapp-token https://myapp.appspot.com/gcsnotify gs://mybucket
I receive the following response, although I have used the same command before on other buckets and it worked:
Watching bucket gs://mybucket/ with application URL https://myapp.appspot.com/gcsnotify...
Failure: <HttpError 401 when requesting https://www.googleapis.com/storage/v1beta2/b/mybucket/o/watch?alt=json returned "Unauthorized WebHook callback channel: https://myapp.appspot.com/gcsnotify">.
I used gsutil config to set up credentials and also tried gsutil config -e.
I already tried setting the permissions and made myself owner of the project, but it is still not working. Any help?

I was getting the same error. You must configure gsutil to use a service account before you can watch a bucket.

An additional security requirement was recently added for Object Change Notification. You must add your endpoint domain as a trusted domain on your cloud project. To do that, the domain first has to be whitelisted with the Google Webmaster Tools.
See instructions here:
https://developers.google.com/storage/docs/object-change-notification#_Authorization

I also determined that I needed to:
Whitelist my appspot domain
Create a service account before I could watch a bucket.
At first I was using the Google Cloud Shell and figured it should already be authenticated. gsutil ls listed the objects in my bucket, so I assumed I was authenticated. However, that is not the case.
You need to install gsutil or the Google Cloud SDK, log in, get the .p12 file for the service account, and authenticate with it as Wind Up Toy described. After that it will work.
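As a rough sketch of that flow (the channel, token, URL, and bucket are the ones from the question; the key path is whatever you downloaded for your service account):
gsutil config -e
# prompts for the service account email address and the full path to the .p12 private key file
gsutil notifyconfig watchbucket -i myapp-channel -t myapp-token https://myapp.appspot.com/gcsnotify gs://mybucket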

Related

gsutil ServiceException: 401 Anonymous caller does not have storage.objects.list access to bucket even though I'm logged in with gcloud

I am trying to create an internal app to upload files to Google Cloud. I don't want each individual user of this app to log in, so I'm using a service account. I log in with the service account and everything is OK, but when I try to upload, it gives me this error:
ServiceException: 401 Anonymous caller does not have storage.objects.list access to bucket
As you can see, I am logged in with both a service account and my personal account, and neither works.
I had a similar problem, and as always it took me two hours, but the solution was trivial, if only it were written somewhere... I needed to log in to (or authorize, whichever suits you) gsutil in addition to being authorized in gcloud. I thought they were linked or whatever, but no. After I ran gsutil config and authorized via the provided link (and pasted the code back into the console), it started working for me.
Note that I was also logged in to gcloud via a service account linked to my project, with the service account's .json key saved locally (see gcloud auth activate-service-account --help).
gcloud auth login solved my issue. You need both steps:
gcloud auth login
gcloud auth application-default login
It happened to me because I had an incomplete initialisation while running gcloud init.
I reinitialised the configuration using the gcloud init command and it worked fine.
I can only think of a few things that might cause you to see this error:
Maybe you have an alias set up to a standalone installation of gsutil (which doesn't share credentials with gcloud)?
Edit: it's also possible you're invoking the wrong gsutil entry point - make sure you're using <path-to-cloud-sdk>/google-cloud-sdk/bin/gsutil, and not <path-to-cloud-sdk>/google-cloud-sdk/platform/gsutil/gsutil. The platform path will not automatically know about your configured gcloud auth options.
Maybe your service account credentials have moved/are invalid now? If your boto file is referring to a keyfile path and the keyfile was moved, this might happen.
Maybe the gcloud boto file (that gcloud created to use with gsutil when you ran gcloud auth login) is gone. You can run gsutil version -l to see if it's shown in your config path. If gcloud's boto file is present, you should see a line similar to this:
config path(s):
/Users/Daniel/.config/gcloud/legacy_credentials/email@domain.tld/.boto
You can run gsutil version -l to get a bit more info and look into the possibilities above. In particular, these attributes from the output will probably be the most helpful: using cloud sdk, pass cloud sdk credentials to gsutil, config path(s), and gsutil path.
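As a quick sketch of those checks (paths will vary by machine):
which gsutil
# expect something like <path-to-cloud-sdk>/google-cloud-sdk/bin/gsutil
gsutil version -l | grep -i -E "cloud sdk|config path|gsutil path"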
Use this command to resolve some issues:
gsutil config
Follow the link in the browser to get a code, then paste it into your terminal.
I had the same issue; I tried gsutil config and it recommended gcloud auth login, which opened Google in the browser. After I logged in, I could download the entire bucket with gsutil cp -r gs://my_bucket/Directory local_save_path and save it locally.
I faced the same problem. It took me two days to get this thing working.
I am writing up the whole setup; please refer to step 2 for the answer to the question. FYI, my OS is Windows 10.
Step 1:
Firstly, I faced problems installing gcloud, and this is what I did.
The script (.\google-cloud-sdk\install.bat) which is supposed to add gcloud to the path was not working due to permission issues.
I had to add the path manually in two places:
1) In the system variables, I added the path to the gcloud bin to the "PATH" variable, which in my case looks like C:\Users\774610\google-cloud-sdk\bin
2) Additionally, gcloud needs Python, so I appended ".PY" to the end of the "PATHEXT" variable.
After performing these tasks, gcloud started working.
Step 2:
Even though gcloud was working, Maven was not able to connect to Cloud Storage, and the error was "401 Anonymous caller does not have storage.objects.list access to bucket".
I was pretty sure I had logged in to my account and selected the correct project. I also tried adding the environment variable as shown in this documentation: https://cloud.google.com/docs/authentication/getting-started
Nothing seemed to be working, even though all the credentials were perfectly set up.
While going through the gcloud documentation, I came across the command gcloud auth application-default login, which was exactly what I needed.
Refer here for the difference between gcloud auth login and gcloud auth application-default login.
In short, this command obtains your credentials via a web flow and stores them in the well-known location for Application Default Credentials, so any code/SDK you run will be able to find the credentials automatically.
After this, Maven was successfully able to connect to Google Storage and do its stuff.
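As a minimal sketch of what that enables in client code (assuming the google-cloud-storage Python package; the bucket name is a placeholder):
from google.cloud import storage

# storage.Client() discovers Application Default Credentials automatically:
# the GOOGLE_APPLICATION_CREDENTIALS env var first, then the well-known
# file written by `gcloud auth application-default login`.
client = storage.Client()
for blob in client.list_blobs("my-bucket"):  # placeholder bucket name
    print(blob.name)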
Hope this helps, thanks
Does your service account actually have the required permission? The roles that grant this permission are roles/storage.objectViewer, roles/storage.objectAdmin, and roles/storage.admin.
Please ensure the service account actually has one of these roles in your Cloud Console, and then it should work.
--- UPDATE ---
Since you have the correct permission on the account, it's likely the correct account wasn't used in the gsutil command. This can happen if you have multiple installations of the gsutil tool; please ensure your gsutil has the correct path pointing to a .boto file. There's a similar issue reported on the GitHub repo; you can see if the solution there works.
Ultimately, you can use a new machine/VM with a fresh install to test whether it works. You can do this easily by going to the Cloud Console and using the Cloud Shell; no real installation is needed, so it should be very simple to test.
This should work, and it will basically isolate your issue (to that of multiple installations) on your original machine. After that, you basically just have to do a clean install to fix it.
If you installed gsutil using Python (without the gcloud SDK), it may help to run gsutil config and complete the initialisation steps.
Thank you for all the replies.
I would like to share my own experience.
I had to log in as the user which is defined when installing GitLab Runner.
By default, the user indicated in the installation doc is "gitlab-runner".
So, first, I set a password for this user:
passwd gitlab-runner
then :
su - gitlab-runner
gcloud auth login
gcloud auth application-default login
The issue is solved.
Maybe there is a better way, such as directly putting the Google auth files under /home/gitlab-runner.
I faced the same issue. I used
gcloud auth login
and followed the link.
If you are using a service account, you first need to authorize it; otherwise gsutil won't have permission to read/write:
gcloud auth activate-service-account --key-file=service_account_file.json
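A quick way to verify afterwards (the bucket name is a placeholder):
gcloud auth list
gsutil ls gs://your-bucket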
Personally, I had an account with the proper permissions registered, but I got that error as well, despite verifying that my account was active using sudo gcloud init.
What solved it for me was navigating to the ~/.gsutil directory and running
sudo chown jovyan:jovyan *
which let my JupyterLab terminal run gsutil not as root but as the default jovyan user.
After that it used my account, not the anonymous caller.
Here is another way to edit roles (note that granting allUsers:objectViewer makes the bucket's objects publicly readable):
gsutil iam ch allUsers:objectViewer gs://tf-learn-objectdetection
For more documentation:
gsutil iam help
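To double-check the resulting policy, the bucket's IAM bindings can be listed (same bucket as above):
gsutil iam get gs://tf-learn-objectdetection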
Use gcloud auth login
Go to the mentioned link
Copy the verification code
Paste the verification code
In my case, even after trying the gsutil solutions discussed in other answers, I got the error. After checking other Google search results, I found out that the reason was that I was authenticating with my user while running gsutil as root.
Thanks to the answer on the gsutil GitHub page: https://github.com/GoogleCloudPlatform/gsutil/issues/457
Let me explain what helped me, step by step:
First, my requirement was to enable CORS, but I faced the asked issue, so I followed the steps below:
On the browser side:
Open the Google Cloud console in your browser.
Open the Cloud Shell editor.
Type gcloud auth login.
It will now show a command with a URL.
Copy that command; don't close the browser.
On the PC GCloud software side:
Download the GCloud SDK installer (.exe)
Open GCloud on your PC; it will ask you to sign in via the browser
Sign in with the correct email ID
Select your project from the shown list
Paste the previously copied command
Again it will ask you to sign in
Select the proper account to sign in
Now the GCloud cmd will show you another command with a URL as output
Copy the output, open your browser, then paste it
Done! It will show something like: You are now logged in as xyz@gmail.com
Now I'm able to set CORS without any exception.
Hope these steps will be helpful for someone who is new to the issue.
It looks like the account information is not stored with gsutil.
Step 1:
gsutil config
Step 2:
Copy the URL into a browser
Step 3:
Select an account and grant permission
Step 4:
Copy the key and paste it at the gsutil prompt (step 1 will be asking for this key to proceed)
Step 5:
Run the command whose access was denied
Thank you, Petr Krýže!!! You saved my day...

Google Speech API returns 403 PERMISSION_DENIED

I have been using the Google Speech API to transcribe audio to text from my PHP app (using the Google Cloud PHP Client) for several months without any problems. But my calls have now started to return 403 errors with status "PERMISSION_DENIED" and the message "The caller does not have permission".
I'm using the Speech API together with Google Storage. I'm authenticating using a service account and sending my audio data to Storage. That's working; the file gets uploaded. So I understand - but I might be wrong? - that "the caller" does not have permission to read the audio data from Storage.
I've been playing with permissions through the Google Console without success. I've read the docs but am quite confused. The service account I am using (I guess this is "the caller"?) has owner permissions on the project. And everything used to work fine, I haven't changed a thing.
I'm not posting code because if I understand correctly my app code isn't the issue - it's rather my Google Cloud settings. I'd be grateful for any idea or clarifications of concepts!
Thanks.
Being an owner of the project doesn't necessarily imply that the service account has read permission on the object. It's possible that the object was uploaded by another account that specified a private ACL or similar.
Make sure that the service account has access to the object by giving it the right permissions on the entire bucket or on the specific object itself.
You can do so using gsutil acl. More information and additional methods may be found in the official documentation.
For instance, the following command gives READ permission on an object to your service account:
gsutil acl ch -u serviceAccount@domain.com:R gs://bucket/object
And this command gives READ permission on an entire bucket (the -r flag also applies it to the objects inside) to your service account:
gsutil acl ch -r -u serviceAccount@domain.com:R gs://bucket
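Alternatively, a sketch using bucket-level IAM instead of ACLs (the service account address is a placeholder):
gsutil iam ch serviceAccount:my-sa@my-project.iam.gserviceaccount.com:objectViewer gs://bucket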
In Google Cloud Vision, when you're creating credentials with a service account key, you have to create a role, set it to owner, and give it full access permissions.

Google cloud credentials totally hosed after attempting to setup boto

I had a gcloud user authenticated and was running gsutil fine from the command line (Windows 8.1). But I needed to access gsutil from a Python application, so I followed the instructions here:
https://cloud.google.com/storage/docs/xml-api/gspythonlibrary#credentials
I got as far as creating a .boto file, but now not only does my Python code fail (boto.exception.NoAuthHandlerFound: No handler was ready to authenticate.), but I also can't run gsutil from the command line any more. I get this error:
C:\>gsutil ls
You are attempting to access protected data with no configured
credentials. Please visit https://cloud.google.com/console#/project
and sign up for an account, and then run the "gcloud auth login"
command to configure gsutil to use these credentials.
I have run gcloud auth and it appears to work; I can query my users:
C:\>gcloud auth list
Credentialed Accounts:
- XXXserviceuser@XXXXX.iam.gserviceaccount.com ACTIVE
- myname@company.name
To set the active account, run:
$ gcloud config set account `ACCOUNT`
I have tried both with the account associated with my email active and with the new service user account (created following the instructions above). Same "protected data with no configured credentials" error. I tried removing the .boto file, and adding the secret CLIENT_ID and CLIENT_SECRET to my .boto file.
Anyone any ideas what the issue could be?
So I think the latest documentation/examples showing how to use (and authenticate to) Google Cloud Storage via Python are in this repo:
https://github.com/GoogleCloudPlatform/python-docs-samples/tree/master/storage/api
That just works for me, without messing around with keys and service users.
It would be nice if there were a comment somewhere in the old gspythonlibrary docs pointing this out.

gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access Storage with the help of a service account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute#developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket I would like it to copy a file to:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to look for is to make sure you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default Compute Engine service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. If you use a user-created service account this is not typically a problem, since user-created service accounts get all scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
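To check which scopes an instance currently has (instance name and zone are placeholders), something like this should work:
gcloud compute instances describe my-vm --zone us-central1-a --format="yaml(serviceAccounts)"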
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then surf to the URL it provides,
[ CLICK Allow ]
Then copy the verification code and paste it into the terminal.
Stop the VM,
go to VM instance details,
under "Cloud API access scopes" select "Allow full access to all Cloud APIs", then
click "Save".
Restart the VM and delete ~/.gsutil.
I have written this as an answer since I cannot post comments:
This error can also occur if you're running the gsutil command with a sudo prefix in some cases.
After you have created the bucket, go to the permissions tab, add your email, and set the Storage Admin permission.
Access the VM instance via SSH, then run the command gcloud auth login and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
So I tried a bunch of things while trying to copy from a GCS bucket to my VM.
Hope this post helps someone.
Via an SSH connection, running this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following the "Activating the API" section mentioned in this link:
https://cloud.google.com/storage/docs/json_api/
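If you prefer the CLI, enabling the same API should look roughly like this (the service name here is my assumption; check the API's listing in your console):
gcloud services enable storage-api.googleapis.com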
Once I activated the API, I authenticated myself in the SSH window via
gcloud auth login
Following authentication procedure I was finally able to download from Google Storage Bucket to my VM.
PS
I did make sure to:
1. Make sure that gsutil is installed on my VM instance.
2. Go to my bucket's permissions tab and add the desired service accounts with the Storage Admin permission/role.
3. Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance -> go to the edit page -> go to "Cloud API access scopes" and choose "storage full access" (or read/write, or whatever you need it for).
Changing the service account and access scopes for an instance
If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine default service account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
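One such method, sketched with the gcloud CLI (instance name, zone, and scope are placeholders, not from the original post):
gcloud compute instances stop my-vm --zone us-central1-a
gcloud compute instances set-service-account my-vm --zone us-central1-a --scopes storage-rw
gcloud compute instances start my-vm --zone us-central1-a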
Change the permissions of the bucket:
add a user entry for "allUsers" and give it "Storage Admin" access.

gcloud installed on gce instance with service level accounts permission issues

I launched an instance with service-level accounts enabled. For example, it has storage-rw set. I verified that the instance has those scopes. Now whenever I run gsutil ls gs://my_bucket from within the instance I get the error: Failure: unauthorized_client.
gcloud auth list returns
Credentialed accounts:
- xxxx@developer.gserviceaccount.com (active)
I need to use the gcloud SDK from an instance because I need more components than just gcutil and gsutil.
So my question is: how can I authorize gcloud to use the xxxx@developer.gserviceaccount.com account, and thus only the permissions specified on the instance, and not my personal user account, which has full permissions to everything?
The gcloud CLI definitely handles Google Compute Engine service accounts. If you see it as "(active)" when you do $ gcloud auth list, that should be sufficient.
Two things can be going wrong here:
You are using the wrong gsutil.
When you install the Google Cloud SDK, it will create google-cloud-sdk/bin/gsutil, and THAT is the one you want to run. Do $ which gsutil to double check. If you're running google-cloud-sdk/platform/gsutil/gsutil, that's the wrong one, and it won't know about anything that gcloud can tell it.
The account doesn't have permissions to access the bucket you're trying to inspect. You'll have to ask the owner of the bucket to add it to the project that owns that bucket.
Source: Engineer for the Google Cloud SDK
See "Authenticating to Google Compute Engine" section in this doc: https://developers.google.com/compute/docs/gcutil/