We are planning to use Google Cloud Storage with signed URLs that we can give to users.
So we upload a document, then generate a signed URL for it (using the details described here: https://developers.google.com/storage/docs/accesscontrol#Signed-URLs).
The issue is that Google (and AWS, etc.) only let you set an expiration time for these URLs (say a few minutes, a few hours, or a few days), but we want the URLs to expire after a certain number of requests.
Let's say I generate the URL with a 4-hour expiration and send it to my user; we want that URL to expire after the user accesses it a second time, so that on the third access Google no longer returns the data.
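(For reference, the generation step is roughly the following, using the Python client library; the bucket and object names are placeholders.)

```python
from datetime import timedelta

from google.cloud import storage

# Assumes credentials that can sign URLs (e.g. a service account key).
client = storage.Client()
blob = client.bucket("my-bucket").blob("documents/report.pdf")

# Time-based expiration is the only expiry control a signed URL offers.
url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(hours=4),
    method="GET",
)
# `url` is what we hand to the user.
```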
Is this possible?
This is not currently possible.
For uploads, you can approximate single-use URLs by allowing the service account to read and create objects only. This way the link, once used, can't be used to upload to the same object again, because overwriting an existing object requires delete permission, which the service account doesn't have.
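A rough sketch of that workaround, assuming the upload URL is signed by a service account that holds only roles/storage.objectViewer and roles/storage.objectCreator (bucket and object names are placeholders):

```python
from datetime import timedelta

from google.cloud import storage

# Sign with a service account that can only read and create objects.
# It has no storage.objects.delete permission, and overwriting an
# existing object requires delete, so the URL works for one upload only.
client = storage.Client.from_service_account_json("create-only-sa.json")
blob = client.bucket("my-bucket").blob("uploads/user-123/document.bin")

upload_url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(minutes=15),
    method="PUT",
)
# The first PUT to upload_url succeeds; a second PUT to the same object
# name is rejected because it would overwrite the existing object.
```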
Okay, I was using Flutter and Firebase to upload data into Cloud Storage. I obtained the download URL, which is accessible on the web by anyone who knows it. I had enabled Public Access Prevention in the Google Cloud Storage console based on this doc, and set the bucket's access control to Uniform, also per the docs.
I had also added a security rule in Firebase Cloud Storage so that only users with a certain custom token can use it. But it seems useless, as everyone can still fetch the file through its download URL. My question is: why am I still able to access the file using the same URL that I stored in Firestore? You can test it on this URL.
Can a hacker get the download URL that I fetch from Firestore?
Is there a secure way to download a song from Firebase Cloud Storage so that a hacker can't get its URL?
Thank you for helping me out.
Updated v2:
I just found out that the current audio file has its own authenticated URL, as shown in the picture below. How can I get access to this URL?
Updated v1:
I don't think I have activated Firebase App Check yet. Does that feature have the ability to prevent the file from being accessed publicly, or is there something else I have to do to prevent public access, besides everything I described above?
Security rules only control whether a user can get the download URL; they do not restrict anyone from using that URL once they have it. You can use the getData() method instead. It doesn't return any URL, it downloads the file directly, and it is controlled by security rules, so a user must be authenticated to fetch the data.
As mentioned in the answer:
If you're using the FlutterFire Storage library in your app, you can call getData on a reference to the file to get its data. So with that you just need to know the path to the data, and you won't need the download URL in your application. Once you have the data locally, you can create an image out of it with: Converting a byte array to image in Flutter?
Unlike download URLs, the call to getData() is checked by security rules, so you'll have to ensure that the user is permitted to access the file.
You can also refer to this answer:
For web apps: in the JavaScript/Web SDK using a download URL is the only way to get at the data, while for the native mobile SDKs we also have getData() and getFile() methods, which are enforced through security rules.
Until that time, if signed URLs fit your needs better, you can use those. Both signed URLs and download URLs are just URLs that provide read-only access to the data. Signed URLs just expire, while download URLs don't.
For more information, you can refer to this GitHub issue, where a similar problem has been discussed.
I need to store my service's data in Google Cloud Storage and let my users download files depending on their access rights.
I've already built a service that connects to Google Cloud Storage server-side and relays files to the client, but I need the client to go to Storage and download files directly, without going through my server.
I've tried using temporary links for the files, but I can't tell whether a user has downloaded a file yet, so I can't properly delete the temporary link afterwards.
I've also looked for OAuth2 support, but it seems Google doesn't support OAuth in a way where my service decides whether to allow access.
The ideal solution for me would be to generate tokens for users and have Google Cloud Storage call my service before every file download.
How can I achieve that?
I have a bucket with a subfolder structure for storing media,
e.g.
bucket/Org1/ ...
bucket/Org2/ ...
and I want to generate a signed URL for all the media inside each subfolder, so that users who belong to organization 1 can only view their own files.
Of course I don't want to generate a signed URL for each file (there can be a lot of them), and ACLs don't work because my users log in with non-Google accounts (and may not have a Google account at all).
So is there any way to allow something like bucket/Org1/* ?
Unfortunately, no. For retrieving objects, signed URLs need to be for exact objects. You'd need to generate one per object.
One way to accomplish this would be to write a small App Engine app that users download from instead of hitting GCS directly. It would check authentication according to whatever mechanism you're using and then, if the check passes, generate a signed URL for that resource and redirect the user, as sketched below.
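A minimal sketch of such a proxy, assuming Flask for the handler and a placeholder user_org() that resolves the caller's organization from whatever authentication you already use:

```python
from datetime import timedelta

from flask import Flask, abort, redirect, request
from google.cloud import storage

app = Flask(__name__)
bucket = storage.Client().bucket("bucket")  # holds Org1/, Org2/, ... prefixes


def user_org(req):
    """Placeholder: look up the caller's organization from your own auth."""
    ...


@app.route("/media/<path:object_path>")
def media(object_path):
    # Only allow objects under the caller's own organization prefix.
    if not object_path.startswith(f"{user_org(request)}/"):
        abort(403)
    url = bucket.blob(object_path).generate_signed_url(
        version="v4", expiration=timedelta(minutes=15), method="GET"
    )
    return redirect(url)
```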
I am storing one user's (the owner's) images in a Google Cloud Storage bucket. I want to grant read permission on these images to a group of users (the owner's contacts). I am planning to use an Access Control List for this purpose; e.g., the owner will have full permission on the bucket and the contacts will have read permission on the images. The owner may have a very large number of contacts, say 1 million.
So,
will there be any performance issues if the ACL contains a huge number of users?
Is this the right approach for access control, or should I consider signed URLs?
Regards, Remya
This approach is not going to work for you. There are some significant limitations and downsides to trying to serve content like this. First and foremost, there is a limit of 100 ACL entries on a given object. You could get around this by granting permission to a group for which every user was a member, but even so, it still means that viewing the images will require that every user be logged in to their Google account in addition to however they authenticate for your site.
The canonical way to accomplish this would be to keep all images private and owned by your site's own account. When a user loads a page, verify however you like that they have appropriate authorization to view the images, and if so, generate signed URLs for the images. This allows you to use any authorization scheme without limitation while serving images directly from GCS.
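A hedged sketch of that flow, assuming the application has already verified (by whatever scheme you use) that the viewer is one of the owner's contacts; the bucket and image paths are made up:

```python
from datetime import timedelta

from google.cloud import storage

# All images stay private and are owned by the site's own account.
bucket = storage.Client().bucket("site-private-images")


def signed_image_urls(image_paths, minutes=10):
    """Return short-lived read URLs for images the caller may view.

    The owner/contact authorization check happens in the application
    before this is called; GCS only ever sees the signed requests.
    """
    return [
        bucket.blob(path).generate_signed_url(
            version="v4", expiration=timedelta(minutes=minutes), method="GET"
        )
        for path in image_paths
    ]
```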
I'm quite new to cloud storage solutions, and I'm currently researching options to upgrade our current setup (we currently just upload to an SVN server).
What I have is a native application running on client computers, which will upload data to Cloud Storage. Afterwards, clients should be able to download and browse their own data (the access point is not set in stone; it could be a website or other applications). They should not be able to access other users' data.
I'm not sure how I'm supposed to proceed. As far as I understand, the native application will upload using a native application credential, provided as JSON.
Do I need multiple credentials to track multiple users? That seems wrong to me. Besides, when they come back as 'users' through the web interface, they wouldn't be using that authentication, would they?
Do I need to change the ACL of the uploaded files afterwards?
Should I just not give read/write access to any particular user and instead handle read requests through signed URLs, dealing with the permission details myself using something else on the side? (Not forcing a Google account is probably a requirement.)
Sorry if this is too many questions, and thanks!
Benjamin
The "individual credentials per instance of an app" question has come up before, and unfortunately there's not a great answer. If you want every user to have different permissions, you need every user to be associated with a different account.
Like you point out, the best current answer, other than requiring users to have Google accounts, is to have a centralized service that vends signed URLs to the end applications. That service would be the only owner of all of the objects and would give out permission to read or upload as needed.
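As a rough illustration of such a vending service (Flask is used only for the sketch, and authenticated_user() stands in for whatever login scheme your own service uses):

```python
from datetime import timedelta

from flask import Flask, abort, jsonify, request
from google.cloud import storage

app = Flask(__name__)
# The service's own account owns every object in the bucket.
bucket = storage.Client().bucket("app-user-data")


def authenticated_user(req):
    """Placeholder: return the caller's user id from your own auth scheme."""
    ...


@app.route("/urls/<action>/<path:name>")
def vend_url(action, name):
    user = authenticated_user(request)
    # Users may only touch objects under their own prefix.
    if user is None or not name.startswith(f"{user}/"):
        abort(403)
    method = {"download": "GET", "upload": "PUT"}.get(action)
    if method is None:
        abort(404)
    url = bucket.blob(name).generate_signed_url(
        version="v4", expiration=timedelta(minutes=15), method=method
    )
    return jsonify({"url": url})
```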