How do you get or generate a URL to the object in a bucket? - google-cloud-storage

I'm storing objects in buckets on Google Cloud Storage and would like to provide an HTTP URL to each object for download. Is there a standard convention or way to expose files stored in Cloud Storage as HTTP URLs?

Yes. Assuming that the objects are publicly accessible:
http://BUCKET_NAME.storage.googleapis.com/OBJECT_NAME
You can also use:
http://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME
Both HTTP and HTTPS work fine. Note that the object must be readable by anonymous users, or else the download will fail. More documentation is available at https://developers.google.com/storage/docs/reference-uris
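The two URL forms above differ only in where the bucket name goes. A minimal sketch of building both, assuming the object is public (the bucket and object names here are made up; the only subtlety is percent-encoding object names that contain spaces or other special characters, while keeping slashes, which are valid in object paths):

```python
from urllib.parse import quote

def public_urls(bucket_name: str, object_name: str) -> tuple:
    """Build the two equivalent public URL forms for a GCS object.

    Object names may contain slashes and spaces; spaces and other
    special characters are percent-encoded, slashes are kept.
    """
    encoded = quote(object_name, safe="/")
    virtual_hosted = f"https://{bucket_name}.storage.googleapis.com/{encoded}"
    path_style = f"https://storage.googleapis.com/{bucket_name}/{encoded}"
    return virtual_hosted, path_style

v, p = public_urls("example-bucket", "photos/cat 1.jpeg")
```

Both forms resolve to the same object; the path-style form is handy when one client needs to address many buckets.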
If it is the case that the objects are NOT publicly accessible and you only want the one user to be able to access them, you can generate a signed URL that will allow only the holder of the URL to download the object, and even then only for a limited period of time. I recommend using one of the GCS client libraries for this, as it's easy to get the signing code slightly wrong: https://developers.google.com/storage/docs/accesscontrol#Signed-URLs
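To make the "limited period of time" mechanism concrete, here is a sketch of the query-string shape a V4 signed URL carries. The actual X-Goog-Signature is an RSA signature over a canonical request, computed with the service account's private key; the client libraries handle that for you (e.g. blob.generate_signed_url in the google-cloud-storage library for Python). The signature value below is a placeholder, not a real signature:

```python
from urllib.parse import urlencode

def v4_query_sketch(service_account_email: str, date_stamp: str,
                    timestamp: str, expires_seconds: int,
                    signature_hex: str) -> str:
    """Sketch of the query string a V4 signed URL carries.

    The server recomputes the signature from these parameters and
    rejects the request once X-Goog-Expires seconds have elapsed
    past X-Goog-Date.
    """
    params = {
        "X-Goog-Algorithm": "GOOG4-RSA-SHA256",
        "X-Goog-Credential": f"{service_account_email}/{date_stamp}/auto/storage/goog4_request",
        "X-Goog-Date": timestamp,
        "X-Goog-Expires": str(expires_seconds),
        "X-Goog-SignedHeaders": "host",
        "X-Goog-Signature": signature_hex,  # placeholder, not a real signature
    }
    return urlencode(params)

q = v4_query_sketch("sa@example.iam.gserviceaccount.com",
                    "20240101", "20240101T000000Z", 600, "deadbeef")
```

Anyone holding the full URL can download the object until expiry, which is exactly why the libraries are the safer way to produce it.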

One way is to use https://storage.cloud.google.com/ followed by the bucket and object name; see more documentation at
https://developers.google.com/storage/docs/collaboration#browser

If the file is not public, you can use this link to the file and it will authenticate with your signed in Google account:
https://storage.cloud.google.com/{bucket-name}/{folder/filename}
Otherwise generate a signed URL:
gsutil signurl -d 10m Desktop/private-key.json gs://example-bucket/cat.jpeg

Related

Cloud Storage - Disabled Public Access Prevention, but Failed

Okay, I'm using Flutter and Firebase to upload data to Cloud Storage. I obtain a download URL, which anyone on the web can access if they know it. I had enabled Public Access Prevention in the Google Cloud Storage console based on this doc and chose uniform access control based on this doc.
I also added a security rule in Firebase Cloud Storage so that only users with a certain custom token can use it. But it seems useless, as everyone can still use the download URL. My question is: why am I still able to access the file when I use the same URL that I stored in Firestore? You can test it on this url.
Can a hacker get the download URL I stored in Firestore?
Is there a secure way to download a song from Firebase Cloud Storage so a hacker won't get its URL?
Thank you for helping me out.
Update v2:
I just found out that the current audio file has its own authenticated URL, as shown in the picture below. How can I get access to this URL?
Update v1:
I don't think I have activated Firebase App Check yet. Does this feature prevent the file from being accessed publicly, or is there something else I have to do, besides everything described above?
Security rules only control whether a user can get a download URL; they do not restrict anyone who already has the URL from using it. You can use the getData() method instead: it doesn't return a URL, it downloads the file contents directly, and it is controlled by security rules, so a user must be authenticated to fetch them.
As mentioned in the Answer :
If you're using the FlutterFire Storage library in your app, you can call getData on a reference to the file to get its data. With that you just need to know the path to the data, and you won't need the download URL in your application. Once you have the data locally, you can create an image out of it with: Converting a byte array to image in Flutter?
Unlike download URLs, the call to getData() is checked by security rules, so you'll have to ensure that the user is permitted to access the file.
You can also refer to this Answer :
For web apps: in the JavaScript/Web SDK, using a download URL is the only way to get at the data, while for the native mobile SDKs we also have getData() and getFile() methods, which are enforced through security rules.
Until that time, if signed URLs fit your needs better, you can use those. Both signed URLs and download URLs are just URLs that provide read-only access to the data. Signed URLs just expire, while download URLs don't.
For more information, you can refer to this GitHub issue, where a similar problem has been discussed.

Does the signed url cache on Google Cloud Storage?

https://cloud.google.com/storage/docs/access-control/signed-urls
I want to use a signed URL.
If I use this URL structure, can I also use the caching system?
I want to avoid spending too much traffic on the same files.
In my opinion this is not possible. According to this document, cache-control applies to "publicly accessible" objects, which, if I understand correctly, means they have to be readable by allUsers.
On the other side, according to the Signed URL documentation:
When you generate a signed URL, you specify a user or service account which must have sufficient permission to make the request that the signed URL will make
As I understand it, this means the object is not readable by allUsers, so you cannot use the cache.
I hope this theoretical thinking will help you somehow :)
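There is a second, practical reason caching works poorly with signed URLs, regardless of permissions: a shared HTTP cache keys entries on the full request URL, query string included, and two signed URLs for the same object generally differ in their timestamp and signature. A small illustration with hypothetical, simplified signed URLs (the query strings below are made up):

```python
from urllib.parse import urlsplit

# Two hypothetical signed URLs for the same object, generated at
# different times: same path, different X-Goog-Date and signature.
url_a = ("https://storage.googleapis.com/example-bucket/cat.jpeg"
         "?X-Goog-Date=20240101T000000Z&X-Goog-Signature=aaaa")
url_b = ("https://storage.googleapis.com/example-bucket/cat.jpeg"
         "?X-Goog-Date=20240101T001000Z&X-Goog-Signature=bbbb")

# Same object on the server side...
same_object = urlsplit(url_a).path == urlsplit(url_b).path
# ...but different full URLs, hence different cache entries in a
# cache that keys on the complete URL.
distinct_cache_keys = url_a != url_b
```

So even where caching is permitted, each freshly generated signed URL tends to miss the cache.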

GCS Signed Urls with subfolder in bucket

I have a bucket with a sub-folder structure to add media
e.g.
bucket/Org1/ ...
bucket/Org2/ ...
and I want to generate a signed URL for all the media inside each subfolder, so that users who belong to organization 1 can view only their own files.
Of course I don't want to generate a signed URL for each file (there can be a lot), and ACLs don't work either, because my users log in with non-Google accounts (and may not have one).
So, is there any way to allow something like bucket/Org1/*?
Unfortunately, no. For retrieving objects, signed URLs need to be for exact objects. You'd need to generate one per object.
One way to accomplish this would be to write a small App Engine app that users download from instead of fetching directly from GCS. It would check authentication according to whatever mechanism you're using and then, if the user passes, generate a signed URL for that resource and redirect the user to it.
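A proxy like that can be sketched framework-free. Everything here is illustrative: handle_download and make_signed_url are hypothetical names, example-bucket is made up, and the "signature" is faked - a real implementation would call something like blob.generate_signed_url from the google-cloud-storage library with service-account credentials:

```python
from datetime import timedelta

def make_signed_url(object_name: str, expires: timedelta) -> str:
    """Placeholder signer. In a real app this would use the GCS
    client library with service-account credentials; here it just
    fabricates a URL of the right shape for illustration."""
    exp = int(expires.total_seconds())
    return (f"https://storage.googleapis.com/example-bucket/{object_name}"
            f"?X-Goog-Expires={exp}&X-Goog-Signature=FAKE")

def handle_download(user_org: str, requested_path: str):
    """Check that the user's organization matches the top-level
    subfolder, then answer with a redirect to a short-lived URL."""
    org = requested_path.split("/", 1)[0]
    if org != user_org:
        return 403, None  # wrong organization: forbidden
    return 302, make_signed_url(requested_path, timedelta(minutes=10))
```

The app, not GCS, enforces the bucket/Org1/* rule; GCS only ever sees exact per-object signed URLs.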

Upload to Google Cloud Storage via signed URL - object not publicly readable

I followed this tutorial to allow uploading files from a GWT frontend directly to Google Cloud Storage using signed URLs. I extended the Java example by specifying the content type, which worked just fine. Then I saw that files uploaded this way weren't publicly readable. To get this working I tried:
I set up a default ACL for newly uploaded objects: gsutil defacl set public-read gs://<bucket>. Uploaded the file again - no luck, still not visible.
Then I tried to set the ACL on the object directly: gsutil acl set public-read gs://<bucket>/<file>, but it gave me AccessDeniedException: 403 Forbidden. That makes sense, since gsutil is authenticated with my Google account while the signed URL is created with the service account and its P12 key.
I tried to set the ACL at upload time, so I added the "x-goog-acl:public-read\n" canonicalized extension header and the appropriate query string parameter to pass the signature check. Damn, still no luck!
My assumption is that maybe the extension header I'm using is wrong? According to the documentation, all authenticated requests to GCS apply a private ACL by default.
Anyway - why can't I make these files publicly readable from the Google Console when I'm logged in as the project owner? I can do so for all files uploaded through the console (I know that in that case the owner is the project owner and not the service account).
What am I doing wrong? How can I make these files publicly readable by anyone?
Thanks in advance!
If you've gone through the given docs, they clearly mention that this method provides a signed URL, valid for a specific time, so a user can download the object without using a Google account. I'm assuming it might not be possible to make those objects publicly available, as they are signed. If you still need that functionality, I'd recommend looking into resumable upload or simple upload of the object.
Also, try putting your project's service account as an owner under "Edit default permissions of objects" in the developer console, to the right of your bucket name.
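On the x-goog-acl attempt from the question: with the legacy (V2) signing scheme, an extension header only takes effect if it appears in the canonicalized-headers section of the string-to-sign AND is sent verbatim on the actual PUT, otherwise the signature check fails. A sketch of that string-to-sign, with made-up bucket, object, and expiry values:

```python
def v2_string_to_sign(method: str, content_type: str, expires: int,
                      ext_headers: dict, resource: str) -> str:
    """Legacy (V2) GCS signed-URL string-to-sign:
    METHOD, Content-MD5 (empty here), Content-Type, Expires,
    canonicalized extension headers (lowercased, sorted, one per
    line), then the canonicalized resource path."""
    canonical = "".join(f"{k.lower()}:{v}\n"
                        for k, v in sorted(ext_headers.items()))
    return f"{method}\n\n{content_type}\n{expires}\n{canonical}{resource}"

s = v2_string_to_sign("PUT", "image/jpeg", 1700000000,
                      {"x-goog-acl": "public-read"},
                      "/example-bucket/cat.jpeg")
```

If the client's upload request omits the x-goog-acl header, or sends a different value than what was signed, the signature no longer matches and the upload is rejected rather than made public.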

Using signed url for subfolder

I am evaluating Google Cloud Storage for the following use case: I need to restrict my users (they do not have Gmail accounts) so they can access only their own files.
I know that can be done using gsutil signurl, but it's going to be lots of small files, and generating a signed URL for every file is impractical. So I'm wondering if there is a trick to provide access to a subfolder using a signed URL.
The mentioned documentation says that wildcards can be used. Does that mean it will generate many URLs, or one URL that applies to all files matching the wildcard?
You should use per-object ACLs for this, absolutely. Signed URLs would be more difficult to implement, and if you're already thinking of managing user accounts, you'll want to do this through OAuth 2.0 for Login anyway, so sending the user's Bearer token with any requests you make to the API should come as a magical bonus of doing your user accounts this way. Read more about auth with Cloud Storage here.
Unlike the gsutil ls command, the signurl command does not support operations on sub-directories. For example, unless you have an object named some-directory/ stored inside the bucket some-bucket, the following command returns an error: gsutil signurl gs://some-bucket/some-directory/
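Since each signed URL must name an exact object, covering a "subfolder" means one URL per object under the prefix. A sketch of that loop, with a deliberately fake signer (a real implementation would list objects with the client library's list_blobs(prefix=...) and sign each with blob.generate_signed_url; some-bucket and the object names are made up):

```python
def fake_sign(object_name: str) -> str:
    """Placeholder; a real implementation would call
    blob.generate_signed_url from the google-cloud-storage library."""
    return (f"https://storage.googleapis.com/some-bucket/{object_name}"
            f"?X-Goog-Signature=FAKE")

def sign_prefix(object_names, prefix: str) -> dict:
    """Generate one signed URL per object under a prefix - the only
    option, since a signed URL cannot cover a wildcard."""
    return {name: fake_sign(name)
            for name in object_names if name.startswith(prefix)}

urls = sign_prefix(["some-directory/a.jpg",
                    "some-directory/b.jpg",
                    "other/c.jpg"],
                   "some-directory/")
```

If the object count makes this impractical, the proxy-and-redirect approach from the earlier answer sidesteps it by signing lazily, one object per actual download.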