Creating an IBM Cloud Pak for Data account - ibm-cloud

I would like to sign up for IBM Cloud Pak for Data. When I select the nearest region, it notifies me that I don't have any active accounts. How can I get around this problem? I'm completely new to Cloud Pak for Data.
I tried selecting different regions but to no avail.

Related

Changing Bucket from Multi-Region to Single region

Currently we have buckets which are set to multi-region, but they could be regional. Is there a way to change this in the Google Cloud bucket settings, or should I create a new regional bucket and move the data? Any links to Google documentation would be great.
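For what it's worth, a bucket's location can't be changed after it is created, so the second approach (create a new regional bucket and copy the data across) is the usual route. A rough sketch with the google-cloud-storage Python client; the bucket names and region below are placeholders:

```python
from google.cloud import storage

client = storage.Client()

# Bucket locations are immutable, so create a new bucket in the target region...
new_bucket = client.create_bucket("my-data-regional", location="europe-west1")

# ...then copy every object across from the existing multi-region bucket.
old_bucket = client.bucket("my-data-multiregion")
for blob in client.list_blobs(old_bucket):
    old_bucket.copy_blob(blob, new_bucket)
```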

Can't create cloud object storage bucket under lite plan

I am unable to create any Cloud Object Storage bucket -- I need one.
I keep getting this error when creating one:
[409, Conflict] The account already has an instance created with the Lite plan.
I don't have any existing bucket.
The error above indicates that creating the COS instance is failing because the account already has an instance created with the Lite plan.
Background:
In IBM Cloud you must first create an instance of the Cloud Object Storage (COS) service; then you can start creating buckets in that instance.
A plan is a very high-level feature set and associated price point. For COS there are Lite and Standard plans. I think the Lite plan allows you to kick the tires for free and probably has the expected free-tier limitations.
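So rather than creating a second instance, point your tooling at the existing Lite instance and create the bucket there. A minimal sketch with the ibm-cos-sdk Python client; the API key, instance CRN, endpoint, and bucket name below are placeholders you would replace with your own values:

```python
import ibm_boto3
from ibm_botocore.client import Config

# Connect to the existing Lite-plan COS instance (credentials are placeholders).
cos = ibm_boto3.client(
    "s3",
    ibm_api_key_id="<YOUR_API_KEY>",
    ibm_service_instance_id="<CRN_OF_EXISTING_LITE_INSTANCE>",
    config=Config(signature_version="oauth"),
    endpoint_url="https://s3.us-south.cloud-object-storage.appdomain.cloud",
)

# Buckets are created inside the instance, so no second instance is needed.
cos.create_bucket(Bucket="my-new-bucket")
```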

Automate / schedule a script

I have read a number of blogs and watched tutorials, but I cannot find anything to help me with my problem.
I have a stakeholder who drops files into Google Cloud Storage, and I have already written a script that performs ETL tasks on them.
It would be great if I could create a trigger which runs my script as soon as a file is dropped in a specific place in Google Cloud Storage.
Google Cloud Storage supports Google Cloud Pub/Sub Notifications. This allows you to programmatically receive notifications when new objects are uploaded to your bucket.
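As a rough sketch of wiring that up with the google-cloud-storage Python client (the bucket and topic names are placeholders), you can attach a Pub/Sub notification configuration to the bucket so every new upload publishes a message your script can consume:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("incoming-files")  # placeholder bucket name

# Publish a message to the Pub/Sub topic whenever a new object is finalized (uploaded).
notification = bucket.notification(
    topic_name="new-file-topic",          # placeholder topic name
    event_types=["OBJECT_FINALIZE"],
    payload_format="JSON_API_V1",
)
notification.create()
```

Your ETL script (or a Cloud Function) would then subscribe to that topic and run whenever a message arrives.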

How to create serverless image processing with Google Cloud Functions and Google Cloud Storage?

On AWS I do this with the S3 + Lambda combination: as a new image is uploaded to a bucket, a Lambda function is triggered and creates 3 different sizes of the image (small, medium, large). How can I do this with GCS + Cloud Functions?
PS: I know that there's "getImageServingUrl()", but can this be used with GCE, or is it for App Engine only?
Would really appreciate any input.
Thanks.
Google Cloud Functions directly supports triggers for new objects being uploaded to GCS: https://cloud.google.com/functions/docs/calling/storage
For finer control, you can also configure a GCS bucket to publish object upload notifications to a Cloud Pub/Sub topic, and then set a subscription on that topic to trigger Google Cloud Functions: https://cloud.google.com/functions/docs/calling/pubsub
Note that there are some quotas on Cloud Functions uploading and downloading resources, so if you need to process more than 1 gigabyte of image data per 100 seconds or so, you may need to request a quota increase: https://cloud.google.com/functions/quotas
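To make the first option concrete, here is a rough sketch of a storage-triggered Cloud Function in Python that writes three resized copies of each uploaded image. It assumes Pillow and google-cloud-storage are listed in requirements.txt; the "resized/" output prefix and the size labels are just placeholders:

```python
import io

from google.cloud import storage
from PIL import Image

SIZES = {"small": 200, "medium": 800, "large": 1600}  # max width/height in pixels
client = storage.Client()

def resize_image(event, context):
    """Background Cloud Function triggered by a GCS object-finalize event."""
    name = event["name"]
    if name.startswith("resized/"):
        return  # skip our own output so the function doesn't re-trigger itself

    bucket = client.bucket(event["bucket"])
    data = bucket.blob(name).download_as_bytes()

    for label, max_side in SIZES.items():
        img = Image.open(io.BytesIO(data))
        fmt = img.format or "JPEG"
        img.thumbnail((max_side, max_side))  # resize in place, preserving aspect ratio

        out = io.BytesIO()
        img.save(out, format=fmt)
        bucket.blob(f"resized/{label}/{name}").upload_from_string(
            out.getvalue(), content_type=f"image/{fmt.lower()}"
        )
```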

Google Cloud Platform - Data Distribution

I am trying to figure out a proper solution for the following:
We have a client from which we want to receive data, for instance a binary of about 200 MB that is updated daily. We want them to deposit the data file(s) onto a local server near them (Europe).
We then want to do one of the following:
1. Retrieve the data from a local server where we are (China/HK), or
2. Log into their European server where they have deposited the files and pull the files directly ourselves.
QUESTIONS:
Can Google's cloud platform serve as a secure, easy way to provide a cloud drive on which to store and pull the data file?
Does Google's cloud platform distribute data such that files pushed onto a server in Europe will be mirrored on a server over in East Asia? (That is, where and how would this distribution model work with regard to my example?)
For storing binary data, Google Cloud Storage is a fine solution. To answer your questions:
Secure: yes. Easy: yes, in that you don't need to write different code depending on your location, but there is a caveat on performance.
Google Cloud Storage replicates files for durability and availability, but it doesn't mirror files across all bucket locations. So for the best performance, you should store the data in a bucket located where you will access it the most frequently. For example, if you create the bucket and choose its location to be Europe, transfers to your European server will be fast but transfers to your HK server will be slow. See the Google Cloud Storage bucket locations documentation for details.
If you need frequent access from both locations, you could create one bucket in each location and keep them in sync with a tool like gsutil rsync.
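If you'd rather do the same from code, here's a rough one-way sync sketch with the google-cloud-storage Python client (the bucket names are placeholders); gsutil rsync itself handles deletions and checksums more thoroughly:

```python
from google.cloud import storage

client = storage.Client()
src = client.bucket("my-data-eu")    # placeholder: bucket located in Europe
dst = client.bucket("my-data-asia")  # placeholder: bucket located in East Asia

# Copy any object that exists in the Europe bucket but not yet in the Asia bucket.
existing = {blob.name for blob in client.list_blobs(dst)}
for blob in client.list_blobs(src):
    if blob.name not in existing:
        src.copy_blob(blob, dst)
```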