How to share an entire Google Cloud Storage bucket with gsutil

Is there a command using GSUTIL that will allow me to share publicly everything in a specific Bucket? Right now, I'm forced to go through and check "share publicly" individually on EVERY SINGLE FILE in the console.

The best way to do this is:
gsutil -m acl ch -u 'AllUsers:R' gs://your-bucket/**
This will update the ACL of every existing object in the bucket.
If you want newly created objects in this bucket to also be public, you should also run:
gsutil defacl ch -u 'AllUsers:R' gs://your-bucket
This question was also asked here, but the answer there recommends using acl set public-read, which has the downside of overwriting your existing ACLs rather than adding to them.
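If you want to double-check the result, you can print the ACLs back out (a quick sketch; the object name is just a placeholder):
gsutil acl get gs://your-bucket/some-object.txt
gsutil defacl get gs://your-bucket
The first command shows the ACL on an existing object; the second shows the default object ACL that will be applied to future uploads.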

$> gsutil acl ch -g All:R -r gs://bucketName
gsutil is the command-line utility for GCS.
"acl ch" means "Modify an ACL."
"-g All:R" means "include read permissions for all users."
"-r" means "recursively"
and the rest is the path.
If you have a whole lot of files and you want more speed, you can add -m, which means "and also do this multi-threaded!", like so:
$> gsutil -m acl ch -g All:R -r gs://bucketName
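Once the objects are readable by AllUsers, they can be fetched anonymously over HTTP at a URL of the form https://storage.googleapis.com/bucketName/objectName. As a quick sketch (the object path is a placeholder):
$> curl https://storage.googleapis.com/bucketName/images/logo.png
No credentials are needed for the request once the read permission is in place.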

Related

How do I copy/move all files and subfolders from the current directory to a Google Cloud Storage bucket with gsutil

I'm using gsutil and I need to copy a large number of files/subdirectories from a directory on a Windows server to a Google Cloud Storage bucket.
I have checked the documentation but somehow I can't seem to get the syntax right - I'm trying something along these lines:
c:\test>gsutil -m cp -r . gs://mytestbucket
But I keep getting the message:
CommandException: No URLs matched: .
What am I doing wrong here?
Try gsutil -m cp -r * gs://mytestbucket
Or gsutil -m cp -r *.* gs://mytestbucket
Or if your local directory is called test go one dir up and type: gsutil -m cp -r test gs://mytestbucket
Not sure which syntax you need on Windows, but probably the first.
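If none of the relative forms work, an absolute source path is worth a try. A sketch, assuming your files really are in c:\test as in the question:
gsutil -m cp -r "c:\test" gs://mytestbucket
Note that copying the directory itself creates a test prefix inside the bucket; copy from inside the directory (e.g. with *) if you want the files at the bucket root.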

How to download multiple files in Google Cloud Storage

Scenario: there are multiple folders and many files stored in a storage bucket that is accessible by project team members. Instead of downloading individual files one at a time (which is very slow and time consuming), is there a way to download entire folders? Or at least multiple files at once? Is this possible without having to use one of the command consoles? Some of the team members are not tech savvy and need to access these files as simply as possible. Thank you for any help!
I would suggest downloading the files with gsutil. However, if you have a large number of files to transfer, you might want to use the gsutil -m option to perform a parallel (multi-threaded/multi-processing) copy:
gsutil -m cp -R gs://your-bucket .
The time reduction for downloading the files can be quite significant. See this Cloud Storage documentation for complete information on the GCS cp command.
If you want to copy into a particular directory, note that the directory must exist first, as gsutil won't create it automatically (e.g. mkdir my-bucket-local-copy && gsutil -m cp -r gs://your-bucket my-bucket-local-copy).
I recommend they use gsutil. The GCS API deals with only one object at a time, but the gsutil command-line utility is happy to download a bunch of objects in parallel. Downloading an entire GCS "folder" with gsutil is pretty simple:
$> gsutil cp -r gs://my-bucket/remoteDirectory localDirectory
To download files to your local machine you need to:
install gsutil to local machine
run Google Cloud SDK Shell
run a command like this (example for the Windows platform):
gsutil -m cp -r gs://source_folder_path "%userprofile%/Downloads"
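On Linux or macOS the equivalent is the same command with a local destination path, for example (a sketch reusing the placeholder source path from above):
gsutil -m cp -r gs://source_folder_path ~/Downloads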
gsutil rsync -d -r gs://bucketName .
works for me. Note that -d deletes anything under the destination that is not also in the bucket, so leave it off if you only want files added or updated.
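If you would rather sync into a dedicated local directory instead of the current one, a small sketch (the directory name is just an example, and -d is omitted so nothing local gets deleted):
mkdir bucket-copy
gsutil -m rsync -r gs://bucketName bucket-copy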

Upload "public" directory to Google Cloud Storage

Using this command from SSH I can upload a whole folder into Google Cloud Storage:
gsutil cp -R folder_big gs://bucket_name
I don't want to click individually on each file to make it public.
How do I make the folder (and all files inside) automatically public on upload?
You could do:
gsutil cp -a public-read -R folder_big gs://bucket_name
Note: if it's a large folder you would likely get a substantial performance improvement if you use the multi-threading option:
gsutil -m cp -a public-read -R folder_big gs://bucket_name
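An alternative, if you will be uploading to this bucket repeatedly, is to set the bucket's default object ACL once and then upload normally. A sketch using the same names as above:
gsutil defacl ch -u AllUsers:R gs://bucket_name
gsutil -m cp -R folder_big gs://bucket_name
New objects then pick up the public-read default without passing -a on every copy.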

How can I use gsutil to only make my *.jpg and *.png images public?

I have a bucket with many subdirectories containing a variety of different files. How can I make only my images accessible publicly? The following is great for making everything public but I don't seem to be able to filter on extension.
$ gsutil -m acl set -R -a public-read gs://mybucket
I'd recommend using gsutil acl ch, which will preserve all existing ACLs on your objects and make them publicly readable. This command should do the trick:
gsutil -m acl ch -u AllUsers:R gs://mybucket/**/*.png gs://mybucket/**/*.jpg
Using the canned acl (i.e., acl set -a public-read) can remove other ACL changes that you've made.
gsutil accepts wildcards.
So this works fine:
$ gsutil -m acl set -R -a public-read gs://mybucket/**/*.jpg
$ gsutil -m acl set -R -a public-read gs://mybucket/**/*.png
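Either way, you can spot-check a single image afterwards (a sketch; the object path is a placeholder):
$> gsutil ls -L gs://mybucket/photos/example.jpg
This prints the object's metadata, including its ACL if you have owner access, so you can confirm AllUsers has READ.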

Google Cloud Storage: bulk edit ACLs

We are in the process of moving our servers to Google Cloud Compute Engine and are starting to look at Cloud Storage as a CDN option. I uploaded about 1,000 files through the Developer Console, but the problem is that the Object Permissions for All Users are set to None. I can't find any way to edit all the permissions to give All Users Reader access. Am I missing something?
You can use the gsutil acl ch command to do this as follows:
gsutil -m acl ch -R -g All:R gs://bucket1 gs://bucket2/object ...
where:
-m sets multi-threaded mode, which is faster for a large number of objects
-R recursively processes the bucket and all of its contents
-g All:R grants all users read-only access
See the acl documentation for more details.
You can use Google Cloud Shell as your console via a web browser if you just need to run a single command via gsutil, as it comes preinstalled in your console VM.
In addition to using the gsutil acl command to change the existing ACLs, you can use the gsutil defacl command to set the default object ACL on the bucket as follows:
gsutil defacl set public-read gs://«your bucket»
You can then upload your objects in bulk via:
gsutil -m cp -R «your source directory» gs://«your bucket»
and they will have the correct ACLs set. This will all be much faster than using the web interface.
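Once the upload finishes, you can spot-check one object to confirm the default ACL was applied (a sketch using the same placeholder style):
gsutil acl get gs://«your bucket»/«some object»
The output should include an AllUsers entry with READ access.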
You can also set the access control permission from code by using "predefinedAcl" when inserting the object. A rough Java sketch (bucketName, objectMetadata and mediaContent are placeholders for the destination bucket, the StorageObject metadata and the upload content):
Storage.Objects.Insert insertObject = client.objects().insert(bucketName, objectMetadata, mediaContent);
insertObject.setPredefinedAcl("publicRead");
This uploads the object with a public-read ACL already applied.
Do not forget to put wildcard characters after the bucket path so the change is applied to every file - example:
gsutil -m acl ch -R -g All:R gs://bucket/files/*
for all files inside the 'files' folder, or:
gsutil -m acl ch -R -g All:R gs://bucket/images/*.jpg
for each jpg file inside the 'images' folder.