Setting ACL using gsutil on a Google CDN bucket via domain - google-cloud-storage

On the documentation page https://cloud.google.com/storage/docs/gsutil/commands/acl#ch it says that the command gsutil acl ch -g my-domain.org:R gs://gcs.my-domain.org should grant access to users from the domain my-domain.org, but I am not sure whether this means the referrer of a request to the bucket should be my-domain.org. Can you explain?
And if not, is there a way to protect the contents against hotlinking?

No, that refers to users who are logged in with the Google Apps domain @my-domain.org.
There is currently no way to condition ACLs to prevent hotlinking.
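In case it helps, a minimal sketch of what such a grant looks like in practice, reusing the bucket name from the question:
# Grant READ to all signed-in users of the Google Apps domain (not to referrers)
gsutil acl ch -g my-domain.org:R gs://gcs.my-domain.org
# Inspect the resulting ACL; the grant shows up as a "domain" entity entry
gsutil acl get gs://gcs.my-domain.org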

Related

AccessDeniedException: 403 Forbidden

gsutil -m acl set -r public-read gs://my_bucket/
command gives an AccessDeniedException: 403 Forbidden error, even though I have granted my email address full owner access to my_bucket. I am using the Blobstore API to upload the file in my project. How do I solve this problem?
You probably need to set up Cloud API access for your virtual machine. Currently it can only be set during the VM creation process, by enabling:
Allow full access to all Cloud APIs
To provide access for a VM where you haven't chosen the above setting, you need to recreate the instance with full access, but there is a pending improvement:
Google Cloud Platform Ability to change API access scopes
Once it ships, we will be able to change this setting after shutting down the instance.
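For what it's worth, a sketch of how that change would look from the command line once the improvement lands (my-vm and us-central1-a are hypothetical placeholders, and this assumes a gcloud release that supports changing scopes on a stopped instance):
# Scopes can only be changed while the instance is stopped
gcloud compute instances stop my-vm --zone us-central1-a
# cloud-platform is the scope alias behind "Allow full access to all Cloud APIs"
gcloud compute instances set-service-account my-vm --zone us-central1-a --scopes=cloud-platform
gcloud compute instances start my-vm --zone us-central1-a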

Google Cloud Storage - make objects in a bucket publicly viewable

I've got a bucket in Google Cloud Storage, and a website. People can currently upload to the bucket through the website (using Google authentication).
However, I need to set it so that anyone can view the files that are uploaded (and can't modify them).
This can't be something that Google needs to authenticate, as some of our clients' IT departments have blocked Google (for whatever reason) and refuse to budge. Something where access is allowed as long as the request is made from my website would work (as I'll record the URL in the website's database).
Preferably, if this could be done without using gsutil that would be great.
You can set a default object ACL on the bucket that makes all objects uploaded to that bucket publicly readable. For example you could do it using gsutil:
gsutil defacl ch -u AllUsers:R gs://your-bucket
Note that the above command only affects newly written objects. If you already have objects in your bucket that need to be made public you could accomplish that with gsutil as well:
gsutil acl ch -u AllUsers:R gs://your-bucket/**
Regarding your point about making sure anyone can view the files but not modify them: You can accomplish this by making sure the bucket ACL only allows you (or your service account) to write objects, not all users.
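For example, to double-check that only you hold write access (a sketch; someone@example.com stands in for any grant you might want to remove):
# Inspect who currently holds grants on the bucket itself
gsutil acl get gs://your-bucket
# Drop (-d) any write grant that shouldn't be there
gsutil acl ch -d someone@example.com gs://your-bucket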

gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access the Storage with the help of a Service Account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket I would like it to copy a file to:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to look for is to make sure you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default Compute Engine service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. If you use a user-created service account this is not typically a problem, since user-created service accounts get all scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
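A quick way to verify the scopes from inside the VM (a sketch; the metadata server reports the scopes the instance actually booted with):
# List the OAuth scopes attached to the VM's default service account
curl -s -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
# Look for devstorage.read_write or devstorage.full_control in the output,
# then clear gsutil's cached credentials so the new scopes take effect
rm -rf ~/.gsutil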
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then browse to the URL it provides, click Allow, and copy the verification code it gives you back into the terminal.
Stop the VM.
Go to VM instance details.
Under "Cloud API access scopes", select "Allow full access to all Cloud APIs", then click "Save".
Restart the VM and delete ~/.gsutil.
I have written this as an answer since I cannot post comments: in some cases, this error can also occur if you run the gsutil command with a sudo prefix.
After you have created the bucket, go to the Permissions tab, add your email address, and grant it the Storage Admin role.
Then access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
So I tried a bunch of things while copying from a GCS bucket to my VM; hope this post helps someone.
Via an SSH connection, I ran this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
and got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following the "Activating the API" section mentioned in this link:
https://cloud.google.com/storage/docs/json_api/
Once I had activated the API, I authenticated myself in the SSH window via:
gcloud auth login
Following the authentication procedure, I was finally able to download from the Google Storage bucket to my VM.
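The same activation can also be done from the command line; a sketch, assuming the Google Cloud Storage JSON API's service ID is storage-api.googleapis.com (check gcloud services list --available if that assumption is off):
# Enable the Google Cloud Storage JSON API for the current project
gcloud services enable storage-api.googleapis.com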
PS: I did make sure to:
1. Make sure that gsutil is installed on my VM instance.
2. Go to my bucket's Permissions tab and add the desired service accounts with the Storage Admin role.
3. Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance, go to its edit page, find "Cloud API access scopes", and choose storage full access or read/write, whichever you need.
Changing the service account and access scopes for an instance: If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine Default Service Account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
Change the permissions of the bucket:
Add a user for "allUsers" and give it "Storage Admin" access.
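In gsutil terms that console step is roughly the following sketch; note that granting admin rights to allUsers also lets anyone modify the bucket, so objectViewer is usually the safer role if all you need is public read access:
# Grant the role from the answer above to all users (public)
gsutil iam ch allUsers:roles/storage.admin gs://mybucket
# A more conservative alternative: public read-only
gsutil iam ch allUsers:objectViewer gs://mybucket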

How to grant read permission on Google Cloud Storage to another service account

Our team creates some data on Google Cloud Storage so another team can copy/download/read it from there, but when they tried, they always got a 403 Forbidden message. I tried to edit the permissions on that bucket and added a new permission as 'Project', 'viewers-(other team's project id)', and 'Reader', but they still got the same error when they ran this command:
gsutil cp -R gs://our-bucket gs://their-bucket
I also tried with their client ID and email account; still the same.
I'm not sure one can define another group's collection of users with a given access right (readers, in this case) and apply it to an object in a different project.
An alternative would be to control bucket access via Google Groups: simply set up a group for readers, adding the users you wish to grant this right to. Then you can use said group to control access to the bucket and/or its contents. Further information, and a use case scenario, here: https://cloud.google.com/storage/docs/collaboration#group
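For example (a sketch; gcs-readers@googlegroups.com is a hypothetical group you would create first):
# Grant READ on the bucket to every member of the group (-g)
gsutil acl ch -g gcs-readers@googlegroups.com:R gs://our-bucket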
try:
gsutil acl ch -u serviceaccount@google.com:R gs://your-bucket
Here ch changes the permission on 'your-bucket' for the user (-u) serviceaccount@google.com to R (Reader).
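One caveat worth checking: a bucket-level grant does not by itself make existing objects readable, so you may also need to grant read on the objects themselves (a sketch reusing the names from this question):
# Grant READ on every existing object in the bucket (the ** wildcard)
gsutil acl ch -u serviceaccount@google.com:R gs://our-bucket/**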

The gsutil tool is not working to register a channel in Object Change Notification

When executing the following command:
gsutil notifyconfig watchbucket -i myapp-channel -t myapp-token https://myapp.appspot.com/gcsnotify gs://mybucket
I receive the following response, even though I have used the same command on other buckets before and it worked:
Watching bucket gs://mybucket/ with application URL https://myapp.appspot.com/gcsnotify...
Failure: <HttpError 401 when requesting https://www.googleapis.com/storage/v1beta2/b/mybucket/o/watch?alt=json returned "Unauthorized WebHook callback channel: https://myapp.appspot.com/gcsnotify">.
I used gsutil config to set permissions, and also tried gsutil config -e.
I already tried to set the permissions and made myself owner of the project, but it is not working. Any help?
I was getting the same error. You must configure gsutil to use a service account before you can watch a bucket.
An additional security requirement was recently added for Object Change Notification. You must add your endpoint domain as a trusted domain on your cloud project. To do that, the domain first has to be whitelisted with the Google Webmaster Tools.
See instructions here:
https://developers.google.com/storage/docs/object-change-notification#_Authorization
I also determined that I needed to:
Whitelist my appspot domain
Create a service account before I could watch a bucket.
At first I was using the Google Cloud Shell, and I figured it should just be authenticated; gsutil ls listed the objects in my bucket, so I assumed I was authenticated. However, that is not the case.
You need to install gsutil or the Google Cloud SDK, log in, get the .p12 file for the service account, and authenticate with it as Wind Up Toy described. After that it will work.
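To make that last step concrete, a sketch of both auth paths (the key file path is a placeholder):
# Standalone gsutil: configure a service account; prompts for the account email and .p12 key path
gsutil config -e
# Cloud SDK: activate a service account from a JSON key instead
gcloud auth activate-service-account --key-file=/path/to/key.json
# Then re-run the watch command from the question
gsutil notifyconfig watchbucket -i myapp-channel -t myapp-token https://myapp.appspot.com/gcsnotify gs://mybucket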