Changing Bucket from Multi-Region to Single region - google-cloud-storage

Currently we have buckets which are set to multi-region, but they could be regional. Is there a way to change this in the Google Cloud bucket settings, or should I create a new regional bucket and move the data? Any links to Google documentation would be great.

Related

Google Cloud Storage maximum access limits

The system I am building currently stores videos in Google Cloud Storage; my server returns the link from Google Cloud Storage, which is used to play the video on mobile platforms. Is there a limit to how many users can access that link at the same time? Thank you!
All of the known limits for Cloud Storage are listed in the documentation. It says:
There is no limit to reads of objects in a bucket, which includes reading object data, reading object metadata, and listing objects. Buckets initially support roughly 5000 object reads per second and then scale as needed.
So, no, there are effectively no limits to the number of concurrent downloads.
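The answer above assumes the links are plain object URLs. If the links your server hands out should instead be time-limited, a V4 signed URL is one option. This is a sketch rather than part of the original answer; the bucket and object names are caller-supplied placeholders, and it assumes the `google-cloud-storage` Python client is installed and authenticated:

```python
from datetime import timedelta

def video_link(bucket_name: str, object_name: str, minutes: int = 60) -> str:
    """Return a time-limited V4 signed URL for one stored object."""
    from google.cloud import storage  # assumed installed and authenticated

    blob = storage.Client().bucket(bucket_name).blob(object_name)
    # Anyone holding this URL can read the object until it expires.
    return blob.generate_signed_url(version="v4",
                                    expiration=timedelta(minutes=minutes))
```

Signed URLs are subject to the same "effectively unlimited reads" behavior quoted above, since each playback is just an authenticated GET on the object.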

Change from GCS region settings to multi-region

I have been trying to decide whether we should set up a GCS bucket as regional or multi-regional. Our needs today are regional only.
I am wondering if there is an option to change from regional to multi-regional at a later point in time?
This also saves some $'s until we move to a multi-region setting; just having it enabled doesn't help us much today.
There is no one-step solution for moving objects from being regional to multi-regional. If you have a bucket containing a lot of regional objects and you want them to be multi-regional, you'll need to copy them into a new multi-regional bucket. The Cloud Storage Transfer Service can manage this for you, or you can do it with gsutil.
You can change a bucket's configuration so that new objects get a different storage class, but it won't affect existing objects. Also, this only helps when switching to and from the nearline and coldline storage classes; multi-regional and regional storage classes do not share locations, and bucket locations cannot be changed.
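As a sketch of the copy route described above: with gsutil it is roughly `gsutil -m cp -r gs://regional-bucket/* gs://multiregional-bucket/`, and an equivalent with the google-cloud-storage Python client might look like the following. Bucket names are placeholders, and the function assumes the client is installed and authenticated:

```python
def copy_all_objects(src_bucket_name: str, dst_bucket_name: str) -> int:
    """Copy every object from src to dst; returns the number copied."""
    from google.cloud import storage  # assumed installed and authenticated

    client = storage.Client()
    src = client.bucket(src_bucket_name)
    dst = client.bucket(dst_bucket_name)
    copied = 0
    for blob in client.list_blobs(src_bucket_name):
        # copy_blob issues a server-side copy, so object bytes never
        # transit this machine -- cheap even for large objects.
        src.copy_blob(blob, dst, blob.name)
        copied += 1
    return copied
```

After verifying the copy, the old regional bucket can be deleted; objects written to the new bucket take on its multi-regional location.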

How to create serverless image processing with Google Function and Google Cloud Storage?

On AWS I use the S3 + Lambda combination. When a new image is uploaded to a bucket, a Lambda is triggered and creates 3 different sizes of the image (small, medium, large). How can I do this with GCS + Cloud Functions?
PS: I know there's "getImageServingUrl()", but can this be used with GCE, or is it for App Engine only?
Would really appreciate any input.
Thanks.
Google Cloud Functions directly supports triggers for new objects being uploaded to GCS: https://cloud.google.com/functions/docs/calling/storage
For finer control, you can also configure a GCS bucket to publish object upload notifications to a Cloud Pub/Sub topic, and then set a subscription on that topic to trigger Google Cloud Functions: https://cloud.google.com/functions/docs/calling/pubsub
Note that there are some quotas on Cloud Functions uploading and downloading resources, so if you need to process more than about 1 GB of image data per 100 seconds or so, you may need to request a quota increase: https://cloud.google.com/functions/quotas
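A minimal sketch of such a function, assuming a Python runtime with `google-cloud-storage` and Pillow declared in `requirements.txt`; the size labels and the `_small`/`_medium`/`_large` naming scheme are illustrative choices, not anything the answer prescribes:

```python
from io import BytesIO

SIZES = {"small": 200, "medium": 600, "large": 1200}  # assumed max widths

def derived_name(name: str, label: str) -> str:
    # "photos/cat.jpg" -> "photos/cat_small.jpg"
    base, dot, ext = name.rpartition(".")
    return f"{base}_{label}{dot}{ext}" if dot else f"{name}_{label}"

def resize_image(event, context):
    """Background function triggered by google.storage.object.finalize."""
    # Heavy dependencies imported lazily; both would be in requirements.txt.
    from google.cloud import storage
    from PIL import Image

    object_name = event["name"]
    if any(f"_{label}." in object_name for label in SIZES):
        return  # object produced by this function itself; avoid a trigger loop

    bucket = storage.Client().bucket(event["bucket"])
    data = bucket.blob(object_name).download_as_bytes()
    original = Image.open(BytesIO(data))

    for label, width in SIZES.items():
        copy = original.copy()
        copy.thumbnail((width, width))  # resizes in place, keeps aspect ratio
        buf = BytesIO()
        copy.save(buf, format=original.format)
        bucket.blob(derived_name(object_name, label)).upload_from_string(
            buf.getvalue(), content_type=f"image/{original.format.lower()}")
```

The guard against re-processing the function's own output matters: writing the resized copies back into the same bucket fires the finalize trigger again, which would otherwise loop forever.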

Google Cloud Storage 'static' bucketname not available

I am currently structuring a web application to serve out segments of our database represented as HTML iframes. I need to host my Django app's static files (such as Bootstrap) in a static file store on Google Cloud Storage in order to correctly render the HTML elements. However, when I try to create a bucket called 'static', GCS replies with the following error:
Sorry, that name is not available. Please try a different one.
Not only that, it is not allowing me to access or modify the URI, displaying a "Forbidden" message when I attempt to.
Does anyone know how to change this default setting by Google? There is no documentation regarding this.
It seems that a bucket with that name has already been created by someone else. You have to choose a globally unique name.
Bucket names reside in a single Google Cloud Storage namespace. As a consequence, every bucket name must be unique across the entire Google Cloud Storage namespace. If you try to create a bucket with a bucket name that is already taken, Google Cloud Storage responds with an error message.
Use another name or use the default bucket. If your app was created after the App Engine 1.9.0 release, it should have a default GCS bucket named [your-app-id].appspot.com available. You can put your static files in that bucket and mimic a directory structure as follows:
[your-app-id].appspot.com/static/my-file-1
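For example, after uploading with something like `gsutil -m cp -r static/* gs://your-app-id.appspot.com/static/` (and making the objects publicly readable), a Django settings fragment pointing at that bucket might look like this sketch, where `your-app-id` is a placeholder for the actual App Engine application id:

```python
# settings.py (fragment) -- "your-app-id" is a placeholder, and this assumes
# the objects under static/ are publicly readable.
GS_BUCKET = "your-app-id.appspot.com"
STATIC_URL = f"https://storage.googleapis.com/{GS_BUCKET}/static/"
```

With this, `{% static "my-file-1" %}` in a template resolves to the object path shown in the answer above.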