Is it possible to change the IBM Cloud plan of an existing database? - db2

Is it possible to change the IBM Cloud plan of an existing database?
I would like to change the current Standard plan to the Elite plan, since it is available in the Dallas region.

Related

S/4HANA API to extract FSV into Google Cloud

We have an FSV (Financial Statement Version) created in SAP. We would like to extract this into our cloud storage (Google Cloud Storage/BigQuery). Is it possible?
Currently we only know how to retrieve the journal entries with G/L accounts, cost centers, etc., but cannot retrieve the FSV data along with them. We are now managing the FSV hierarchy manually in Google Sheets, which needs to be updated each time there is a change in G/L accounts.

What is the difference between MongoDB Atlas and MongoDB Atlas for AWS?

While investigating compatible DBs for storing IoT data I looked into MongoDB, and the pricing is a little bit confusing.
I am just wondering what the difference is between MongoDB Atlas and MongoDB Atlas for AWS, as they both run on AWS.
And what is the right way to run MongoDB Atlas on AWS?
As far as I can see, they should both be mostly similar:
MongoDB Atlas:
You can go directly to the MongoDB Atlas portal and create a MongoDB cluster (a cluster is usually a 3-node replica set on which a DB is hosted) on any of the cloud providers (AWS/Google/Azure). This way all database updates/maintenance are handled by the vendor. Quite easy and simple, which is what most people are opting for these days (SaaS / DB hosted in the cloud). You can also opt for a free cluster, which should be suitable for basic needs such as learning MongoDB. While creating a cluster you can check the pricing, which is set at the cluster level (M0 to M700). You can upgrade your cluster whenever you wish, but when I was creating one I noted that you may pay upfront for a certain number of years (likely 3); whether you use it all or not, you won't get anything back, and if your usage exceeds what you paid you may be charged the difference over time. You'll pay your bills through MongoDB Atlas.
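For illustration, here is a minimal Python sketch of creating such a cluster programmatically through the Atlas Admin API (v1.0) instead of the portal. The project (group) ID, API keys, cluster name and instance size below are placeholders, not values from the question.

    import requests
    from requests.auth import HTTPDigestAuth

    # Placeholders -- replace with your own Atlas project (group) ID and API keys.
    GROUP_ID = "<your-project-id>"
    PUBLIC_KEY = "<public-api-key>"
    PRIVATE_KEY = "<private-api-key>"

    # Minimal cluster definition: an M10 replica set on AWS in US_EAST_1.
    cluster = {
        "name": "iot-cluster",
        "providerSettings": {
            "providerName": "AWS",
            "instanceSizeName": "M10",
            "regionName": "US_EAST_1",
        },
    }

    resp = requests.post(
        f"https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP_ID}/clusters",
        auth=HTTPDigestAuth(PUBLIC_KEY, PRIVATE_KEY),
        json=cluster,
    )
    resp.raise_for_status()
    print(resp.json()["stateName"])  # e.g. "CREATING" while the cluster is provisioning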
MongoDB Atlas for AWS:
This one comes from the AWS Marketplace. When you see the term "marketplace" (where multiple companies/people collaborate to sell products), it basically means these two companies have collaborated to provide MongoDB as SaaS. With this you effectively start from AWS rather than from Atlas itself. When it comes to pricing, AWS seems to provide some credits; it would be better to consult AWS and Atlas about their pricing and other terms if you really want to use it for enterprise purposes. You might end up owning an AWS account just to pay the bills for this usage (which is hectic if you don't use AWS for other use cases). Additionally, as the note below from the MongoDB Atlas for AWS page suggests, only the starting point is on the AWS side; the entire setup is done in Atlas.
You're charged for your purchase on your AWS bill. After you purchase
a contract, you're directed to the vendor's site to complete setup and
begin using this software.

How do we compare the cost of running MongoDB in GCE vs using Google Cloud Datastore?

I know that MongoDB and Google Cloud Datastore are NoSQL database systems. I'm new to deploying a database. These are some of my confusions:
How do we compare the cost of running MongoDB in Google Compute Engine vs. using Google Cloud Datastore? Can you quote a small example estimate? There are many related articles, but I couldn't find one that addresses this specific question.
I came to know about the Click to Deploy option in Compute Engine. What is the difference (in cost/performance) between manually configuring Compute Engine for MongoDB and the Click to Deploy option?
Within the Click to Deploy option there are two options (this and this) we can choose for MongoDB. Can you spot the difference between them? There is also a significant price difference between them.
Is it worth starting development on Google Cloud Datastore and leaving my MongoDB skills behind?
I would prefer to know the answers in terms of cost, development overhead and performance.
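Not an exact quote, but as a rough sketch of how the two cost models differ: with GCE you pay for the VM and disk whether or not the database is busy, while with Datastore you pay per operation and per GB stored. All rates below are made-up placeholders; substitute the current numbers from the GCE and Cloud Datastore pricing pages before drawing any conclusion.

    # GCE: you pay for the VM and its disk around the clock.
    vm_hourly_rate = 0.05        # hypothetical $/hour for a small instance
    disk_gb_month = 0.04         # hypothetical $/GB-month for persistent disk
    disk_size_gb = 100
    gce_monthly = vm_hourly_rate * 730 + disk_gb_month * disk_size_gb

    # Datastore: you pay per read/write operation and per GB stored.
    reads_per_month = 10_000_000
    writes_per_month = 2_000_000
    read_rate = 0.06 / 100_000   # hypothetical $ per read
    write_rate = 0.18 / 100_000  # hypothetical $ per write
    storage_rate = 0.18          # hypothetical $/GB-month
    stored_gb = 20
    datastore_monthly = (reads_per_month * read_rate
                         + writes_per_month * write_rate
                         + storage_rate * stored_gb)

    print(f"GCE + MongoDB  ~ ${gce_monthly:.2f}/month")
    print(f"Cloud Datastore ~ ${datastore_monthly:.2f}/month")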

Google Cloud SQL Read replica's in other regions

We are currently investigating the options to make a partial switch to Google Cloud SQL. What we are looking for is a setup in which data is available for reading in multiple regions, to increase the speed of the web application. Writing from multiple regions would of course be great, but that's not really something MySQL does when you also want to have speed on your side :-)
What we would like to set up is a master-slave configuration in which the master would be in Europe and slaves (for reading) would be available in the US and Asia. This way we can serve information to our customers from a VM + SQL instance in Asia without having to connect to a database in Europe.
As far as I am aware it is currently not possible to add a read instance outside the region of the master. Is that correct?
Or would it be possible to create our own MySQL read-only instance and let it replicate from a Google Cloud SQL instance? This would not be preferable (database administration, server administration) but is of course an option.
You can do cross-region replication in Cloud SQL, although it is not straightforward and the performance will not be great. You have to create a master in Cloud SQL, then create a replica with an external master pointing at the master you created: https://cloud.google.com/sql/docs/replication#external-master
You can also go in the other direction, replicating from Cloud SQL to an external read replica (a rough sketch of that setup follows below): https://cloud.google.com/sql/docs/replication#replication-external
These features are only supported for the first generation of Cloud SQL.
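As a sketch of that second direction (the self-managed MySQL read replica the question mentions), and assuming binary logging is enabled on the Cloud SQL master, a replication user exists, and a dump of the master has already been restored on the replica, the replica could be pointed at the master roughly like this. Hosts, credentials and the binlog position are placeholders.

    import mysql.connector  # pip install mysql-connector-python

    # Connect to YOUR self-managed MySQL instance that should become the replica.
    replica = mysql.connector.connect(
        host="replica.example.internal",   # hypothetical host in another region
        user="root",
        password="<replica-root-password>",
    )
    cur = replica.cursor()

    # Point the replica at the Cloud SQL master. The log file/position come from
    # the dump taken of the master (e.g. the --master-data output).
    cur.execute("""
        CHANGE MASTER TO
            MASTER_HOST = '<cloud-sql-master-ip>',
            MASTER_USER = 'repl',
            MASTER_PASSWORD = '<replication-password>',
            MASTER_LOG_FILE = 'mysql-bin.000001',
            MASTER_LOG_POS = 4
    """)
    cur.execute("START SLAVE")
    cur.close()
    replica.close()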
Cloud Spanner is a relational database that supports transactional consistency on a global scale. It is a SQL database and works well in a multi-region environment, so it can be a good choice for your case. For more info, please check https://cloud.google.com/spanner/
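If you go the Spanner route, a minimal sketch with the Python client library might look like the following. The project ID, instance name and node count are assumptions; nam-eur-asia1 is one of the multi-region instance configurations, which is what gives you read replicas on several continents.

    from google.cloud import spanner  # pip install google-cloud-spanner

    client = spanner.Client(project="my-project")  # hypothetical project ID

    # A multi-region instance configuration places replicas across continents.
    config_name = f"projects/{client.project}/instanceConfigs/nam-eur-asia1"

    instance = client.instance(
        "webapp-db",
        configuration_name=config_name,
        node_count=3,
        display_name="Web application database",
    )
    operation = instance.create()
    operation.result(timeout=300)  # wait for the instance to be ready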

Service plan migration for SQL DB

Is it possible to migrate from the SQL DB small plan to the premium plan? Assume we started with the small plan and the data now exceeds 10 GB. Can the plan be migrated to premium? If so, does that include moving the data?
No, currently there's no way to automatically migrate your data; you have to migrate manually from the small plan to the premium plan. You should be able to use the SQL Database console to export your data, and then, once you've subscribed to the premium plan, load the data via the SQL DB console as well.
The SQL Database console does not have the ability to import/export data. To move data from one SQLDB instance to another, consider using the Bluemix DataWorks Data Load REST API:
https://www.ng.bluemix.net/docs/services/dataworks1/index-gentopic1.html#task_d4j_q1r_np
Alternatively, you may also create a Bluemix app so you can import and export data from/to the SQL Database service:
http://www.ibm.com/developerworks/cloud/library/cl-sqldb-app/
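As a rough illustration of that app-based approach, the sketch below copies one table from the small-plan instance to the premium-plan instance with the ibm_db Python driver. The connection strings, schema and table name are placeholders; in practice you would repeat this (or generate it from the catalog) for every table.

    import ibm_db  # pip install ibm_db

    # Placeholder credentials for the old (small plan) and new (premium plan) instances.
    SRC_DSN = ("DATABASE=SQLDB;HOSTNAME=<small-plan-host>;PORT=50000;"
               "PROTOCOL=TCPIP;UID=<user>;PWD=<password>;")
    DST_DSN = ("DATABASE=SQLDB;HOSTNAME=<premium-plan-host>;PORT=50000;"
               "PROTOCOL=TCPIP;UID=<user>;PWD=<password>;")

    src = ibm_db.connect(SRC_DSN, "", "")
    dst = ibm_db.connect(DST_DSN, "", "")

    # Copy one table row by row from the source instance to the target instance.
    stmt = ibm_db.exec_immediate(src, "SELECT ID, NAME FROM MYSCHEMA.CUSTOMERS")
    insert = ibm_db.prepare(dst, "INSERT INTO MYSCHEMA.CUSTOMERS (ID, NAME) VALUES (?, ?)")

    row = ibm_db.fetch_tuple(stmt)
    while row:
        ibm_db.execute(insert, row)
        row = ibm_db.fetch_tuple(stmt)

    ibm_db.close(src)
    ibm_db.close(dst)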