Can a Google Cloud Storage object have a max age? I do not want to set the TTL at the bucket level. I looked through the API documentation but couldn't find a property or method for this.
No, that is not currently a feature. Right now, TTL can only be set on whole buckets.
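Since the bucket level is the only place a TTL can live today, the closest thing is a lifecycle rule on the bucket itself. A minimal sketch using the Python google-cloud-storage client, with the bucket name as a placeholder:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-bucket")  # hypothetical bucket name

# Add a bucket-level lifecycle rule: delete objects older than 30 days.
# This applies to every object in the bucket, not to a single object.
bucket.add_lifecycle_delete_rule(age=30)
bucket.patch()
```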
I'm trying to figure out if anyone can offer advice around bucket creation for an app that will have users with an album of photos. I was initially thinking of creating a single bucket and then prefixing the filename by user ID, since Google Cloud Storage doesn't recognize subdirectories, like so: /bucket-name/user-id1/file.png
Alternatively, I was considering creating a bucket per user and naming it by user ID, like so: /user-id1-which-is-also-bucket-name/file.png
I was wondering what I should consider in terms of cost and organization when setting up my Google Cloud Storage. Thank you!
There is no difference in terms of cost. In terms of organization, it's different:
For deletion, it's simpler to delete a whole bucket than a folder inside a single shared bucket.
For performance, sharding is better if you have separate buckets (you have less chance of creating a hotspot).
From a billing perspective, you can add labels to the buckets and get them in the billing export to BigQuery. That way you can know the cost of each user's bucket, and possibly rebill them.
The biggest advantage of the one-bucket-per-user model is security. You can grant a user access on a bucket (if users have direct access to the bucket and don't go through a backend service), without using the legacy (and almost deprecated) object ACLs. In addition, if you do use ACLs, you can't set an ACL per folder; ACLs are per object. So every time you add an object to the single shared bucket, you have to set its ACL, which is harder to manage.
IMO, one bucket per user is the best model; a sketch of that setup follows below.
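Roughly, the label-for-billing and per-bucket IAM points above can look like this with the Python google-cloud-storage client. The bucket name, label, and user email are made up for illustration:

```python
from google.cloud import storage

client = storage.Client()

# Hypothetical per-user bucket, labelled so its cost shows up per user
# in the billing export to BigQuery.
bucket = client.bucket("user-id1-photos")
bucket.labels = {"owner": "user-id1"}
bucket = client.create_bucket(bucket, location="us-central1")

# Grant the user read access on their own bucket via IAM,
# instead of setting per-object ACLs on every upload.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectViewer", "members": {"user:user-id1@example.com"}}
)
bucket.set_iam_policy(policy)
```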
The system I am building currently stores videos in Google Cloud Storage. My server returns the link from Google Cloud Storage, which is used to play the video on mobile platforms. Is there a limit on how many users can access that link at the same time? Thank you!
All of the known limits for Cloud Storage are listed in the documentation. It says:
There is no limit to reads of objects in a bucket, which includes reading object data, reading object metadata, and listing objects. Buckets initially support roughly 5000 object reads per second and then scale as needed.
So, no, there are effectively no limits to the number of concurrent downloads.
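As an aside, since your server hands out the link, a common pattern is a time-limited signed URL rather than a public object URL. A minimal sketch with the Python client, with bucket and object names assumed:

```python
import datetime

from google.cloud import storage

client = storage.Client()
blob = client.bucket("videos-bucket").blob("clips/video.mp4")  # hypothetical names

# Generate a time-limited link; any number of clients can download it concurrently.
url = blob.generate_signed_url(
    version="v4",
    expiration=datetime.timedelta(hours=1),
    method="GET",
)
print(url)
```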
Is there a way to apply an upload limit to a Google Cloud Storage bucket per day/month/year?
Is there a way to apply a limit on the amount of network traffic?
Is there a way to apply a limit on Class A operations?
Is there a way to apply a limit on Class B operations?
I only found "Queries per 100 seconds per user" and "Queries per day" using the
https://cloud.google.com/docs/quota instructions, but those are JSON API quotas.
(I'm not even sure which API is used inside the StorageClient C# client class.)
To define quotas (and, by the way, SLOs), you need SLIs: service level indicators. That means having metrics for what you want to observe.
That's not the case here. Cloud Storage has no indicator for the volume of data per day. Thus, you have no built-in indicator or metric, and therefore no quota.
If you want this, you have to build something of your own: wrap all the Cloud Storage calls in a service that counts the volume of blobs per day, and then you can apply your own rules to this custom indicator (see the sketch below).
Of course, to prevent any bypass, you have to deny direct access to the buckets and grant only your "indicator service" access to them. The same goes for bucket creation, so that new buckets are registered in your service.
Not an easy task...
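A minimal sketch of that wrapper-service idea, assuming an in-memory per-day byte counter and a hypothetical daily limit (a real version would need a persistent, shared counter and would also have to front all other write paths):

```python
import datetime
from collections import defaultdict

from google.cloud import storage

DAILY_UPLOAD_LIMIT_BYTES = 10 * 1024 ** 3  # hypothetical 10 GiB/day limit


class QuotaCheckedStorage:
    """Wraps Cloud Storage uploads and counts uploaded bytes per day."""

    def __init__(self):
        self._client = storage.Client()
        self._uploaded = defaultdict(int)  # date -> bytes uploaded that day

    def upload(self, bucket_name: str, blob_name: str, data: bytes) -> None:
        today = datetime.date.today()
        if self._uploaded[today] + len(data) > DAILY_UPLOAD_LIMIT_BYTES:
            raise RuntimeError("Daily upload quota exceeded")
        self._client.bucket(bucket_name).blob(blob_name).upload_from_string(data)
        self._uploaded[today] += len(data)
```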
I'm working with the Directions API, trying to get the best route from one point, passing through other places, and returning to the origin. But the maximum number of waypoints the API allows me to use is 25, and I need 120 on average.
Is there a way to change this restriction? Or maybe another service that allows more than 25 waypoints?
It's not currently possible to add more than 25 waypoints with the Directions API.
However, there is an open feature request in Google's Issue Tracker which I suggest starring to increase visibility and subscribe to future notifications:
https://issuetracker.google.com/issues/35824756
Hope this helps!
Is there a way, through either the IBM Cloud API or the SoftLayer API, to programmatically run/schedule/set up snapshots on an endurance storage device (i.e. an iSCSI drive)?
I've looked through the documentation, but have not found anything.
You need to take a look at these methods:
https://sldn.softlayer.com/reference/services/softlayer_network_storage/createsnapshot
The method above will allow you to create a new manual snapshot.
https://sldn.softlayer.com/reference/services/softlayer_network_storage/enablesnapshots
The method above will allow you to schedule snapshots.
See below for some code examples:
https://softlayer.github.io/php/enableSnapshots/
https://softlayer.github.io/rest/createsnapshot/
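The linked examples are PHP and REST; if you prefer the SoftLayer Python client, the same two calls look roughly like this (the volume ID and schedule values are placeholders, and the credentials are assumed to come from the environment):

```python
import SoftLayer

# Assumes SL_USERNAME / SL_API_KEY are set in the environment.
client = SoftLayer.create_client_from_env()

volume_id = 12345678  # hypothetical SoftLayer_Network_Storage (endurance volume) id

# Create a manual snapshot of the volume.
client.call("Network_Storage", "createSnapshot", "manual snapshot", id=volume_id)

# Schedule daily snapshots at 02:30, keeping the last 7.
client.call(
    "Network_Storage",
    "enableSnapshots",
    "DAILY",   # scheduleType
    7,         # retentionCount
    30,        # minute
    2,         # hour
    "SUNDAY",  # dayOfWeek (only relevant for WEEKLY schedules)
    id=volume_id,
)
```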