How to migrate a Db2 database

I have a Db2 database on a device which does not have internet connectivity, and I would like to move this database to another place. I have taken a backup of my database. Can I use the 'restore' command to create a clone of this database?

To move a Db2 database from on-premise to Db2 Warehouse on Cloud, one option is to use IBM Lift CLI. https://www.lift-cli.cloud.ibm.com/
You will need internet connectivity from the place your data resides on-premise - either a Db2 database or, failing that, CSV files.
If all you have is a Db2 backup image, you would need to restore that into a local database, or use the (chargeable) utility High Performance Unload to extract table data as CSV files from the backup image. IBM Lift CLI does not support Db2 backup images as a source for migration to the cloud.
Note that if you use CSV files, you will need to extract the DDL from the source database and create the empty tables on the target.
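If the target is simply another Db2 server rather than the cloud, the short answer to the original question is yes: you can restore the backup image into a new database on the other server, provided the Db2 versions and operating system platforms are compatible. A minimal sketch, where the backup directory, timestamp and database names are placeholders for your own values:

```
# Restore the backup image into a new database called CLONEDB
# (SOURCEDB, /backups and the timestamp are placeholders)
db2 "RESTORE DATABASE SOURCEDB FROM /backups TAKEN AT 20240101120000 INTO CLONEDB"

# If you go the CSV route instead, db2look extracts the DDL needed to
# create the empty tables on the target
db2look -d SOURCEDB -e -o sourcedb_ddl.sql
```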

Related

Copy data from Postgres DB (GCP Project A) to another Postgres DB (GCP Project B)

I would be happy to get your help / feedback regarding this data load task.
Goal:
Load source data from a Postgres database located in GCP project A into another Postgres database located in GCP project B.
Challenge:
Get a connection to the Postgres DB in GCP Project A (I have an IAM account with sufficient rights to run a COPY TO / COPY FROM command) and copy the table either to a CSV file or create a dump that can be used to insert the data into another Postgres DB in GCP Project B.
How do I connect to the database with this IAM email account (e.g. if I create a key, where should I store the JSON keyfile, and would that approach even be feasible)?
Another approach I researched was to use psycopg2, so that I could use the cursor.copy_expert function (which does not need any superuser rights or Postgres user credentials) to copy the data, but I did not succeed in connecting to the database with psycopg2 due to challenges with the Cloud SQL proxy.
Another idea was to use pg_dump or gcloud sql export csv.
I would be curious whether some of you have faced a similar challenge, how you solved it, and what the best way/practice might be.
You can try out the Database Migration Service. You can set up a continuous migration configuration and use Cloud SQL for PostgreSQL.
Hello, after a lot of searching I've come to these solutions:
If you need a continuous copy, you need to use the Database Migration Service; check this documentation.
If you need a one-shot copy:
you can restore your instance into the other project; see the bottom of this documentation
you can create a bucket, back up (export) your instance to it, and then import it from the other project (a sketch of this is below)
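For the bucket approach, a minimal sketch with gcloud, assuming hypothetical instance, bucket and database names; note that the service account of each Cloud SQL instance needs the appropriate read or write access on the bucket:

```
# In project A: export the source database to a Cloud Storage bucket
gcloud sql export sql source-instance gs://my-transfer-bucket/mydb-dump.sql.gz \
    --database=mydb --project=project-a

# In project B: import the dump into the target instance
gcloud sql import sql target-instance gs://my-transfer-bucket/mydb-dump.sql.gz \
    --database=mydb --project=project-b
```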

loading one table from RDS / postgres into Redshift

We have a Redshift cluster that needs one table from one of our RDS / postgres databases. I'm not quite sure the best way to export that data and bring it in, what the exact steps should be.
In piecing together various blogs and articles, the consensus appears to be to use pg_dump to copy the table to a CSV file, then copy it to an S3 bucket, and from there use the Redshift COPY command to bring it into a new table -- that's my high-level understanding, but I'm not sure what the command-line switches should be, or the actual details. Is anyone doing this currently and, if so, is what I have above the 'recommended' way to do a one-off import into Redshift?
It appears that you want to:
Export from Amazon RDS PostgreSQL
Import into Amazon Redshift
From Exporting data from an RDS for PostgreSQL DB instance to Amazon S3 - Amazon Relational Database Service:
You can query data from an RDS for PostgreSQL DB instance and export it directly into files stored in an Amazon S3 bucket. To do this, you use the aws_s3 PostgreSQL extension that Amazon RDS provides.
This will save a CSV file into Amazon S3.
You can then use the Amazon Redshift COPY command to load this CSV file into an existing Redshift table.
You will need some way to orchestrate these operations, which would involve running a command against the RDS database, waiting for it to finish, then running a command in the Redshift database. This could be done via a Python script that connects to each database (eg via psycopg2) in turn and runs the command.
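As a rough sketch of the two commands run in sequence (here driven from a shell script with psql rather than the Python/psycopg2 approach mentioned above; the bucket, IAM role, table names and connection details are made up, and the aws_s3 extension must already be installed on the RDS instance):

```
#!/bin/bash
set -e
# Assumes credentials are supplied via PGPASSWORD / ~/.pgpass

# 1) On RDS PostgreSQL: export the table to S3 as CSV via the aws_s3 extension
psql "host=my-rds-host dbname=mydb user=myuser" -c "
  SELECT * FROM aws_s3.query_export_to_s3(
    'SELECT * FROM public.my_table',
    aws_commons.create_s3_uri('my-transfer-bucket', 'exports/my_table.csv', 'us-east-1'),
    options := 'format csv'
  );"

# 2) On Redshift: load the CSV into an existing table with matching columns
psql "host=my-redshift-endpoint port=5439 dbname=mydw user=myuser" -c "
  COPY my_schema.my_table
  FROM 's3://my-transfer-bucket/exports/my_table.csv'
  IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
  CSV;"
```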

Backing up redshift database

I want to back up the entire Redshift cluster, so that I can use it in other databases like MySQL or Hadoop in the future.
I was looking it up, and creating a manual snapshot seems to be an option, but I guess that won't work across different database platforms.
So what would be the detailed steps to back up the entire Redshift cluster?
Cluster backups (snapshots) can be done via the AWS console; however, these can only be restored to another Redshift cluster.
Because Redshift is not the same as Postgres in many ways, it will be impossible or at least tricky to use standard tools like pg_dump and pg_restore.
I think that your best option is to:
1. Extract the DDL from the Redshift tables that you wish to create elsewhere; most IDEs have a simple way to do this.
2. Modify the DDL to work with your target database (e.g. Postgres will be easy, MySQL harder).
3. Copy the contents of the Redshift database to S3, one table at a time, using the UNLOAD command (see the sketch after this list).
4. Import the data that you unloaded in step 3 into your target tables.
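For step 3, an UNLOAD along these lines writes a table out to S3 as delimited files; the cluster endpoint, bucket, IAM role and table names below are placeholders:

```
# Run against the Redshift cluster (e.g. via psql); repeat for each table
psql "host=my-redshift-endpoint port=5439 dbname=mydw user=myuser" -c "
  UNLOAD ('SELECT * FROM my_schema.my_table')
  TO 's3://my-export-bucket/my_table_'
  IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
  DELIMITER ','
  ADDQUOTES
  ALLOWOVERWRITE
  GZIP;"
```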

Is there any way I can restore a DB2 backup file onto IBM DashDB?

I am trying to restore a DB2 backup file into my BlueMix DashDB service. How do I go about doing this?
You cannot restore your DB2 backup image into dashDB for several reasons.
In an entry-level, shared dashDB instance you only have access to one schema in a physical database shared by others.
Even if you have a dedicated instance, you need 1) access to the database local disk to upload the image and 2) sufficient privileges (at least SYSMAINT authority) to perform the restore. I doubt either will be available to you.
What you can do is run db2look and db2move locally to extract your database DDL statements and data respectively. You can then run the extracted DDL script against dashDB provided you replace the original schema name(s) with the one available to you in dashDB and, after creating the tables, load your data into them.
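A rough sketch of those local extraction steps (database and schema names are placeholders; the generated DDL then needs its schema names edited before you run it against dashDB):

```
# Extract the DDL for the source database
db2look -d MYDB -e -o mydb_ddl.sql

# Export the table data (db2move writes IXF data files plus a db2move.lst control file)
db2move MYDB export

# Or limit the export to a single schema
db2move MYDB export -sn MYSCHEMA
```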

Backup and Restore Single Schema/Table

Is there a way to backup or restore a specific schema or table on a Cloud SQL server? Backing up the entire set of data, but then being able to restore only certain schemas or tables would be very helpful for multi-tenant systems.
Not via backup/restore. You can export a specific schema or table to Google Cloud Storage, but that's probably not what you're looking for.
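For example, a single table can be exported to a Cloud Storage bucket as CSV with a query; the instance, bucket, database and table names here are made up:

```
gcloud sql export csv my-instance gs://my-bucket/customers.csv \
    --database=mydb \
    --query="SELECT * FROM customers"
```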
Cloud SQL uses a MySQL database with some limitations. You can check out the unsupported features and functions at the following link:
https://cloud.google.com/sql/faq#supportmysqlfeatures
With this in mind, any backup/restore tool that is used for MySQL should work for Google Cloud SQL as well. Using mysqldump for import/export with Cloud SQL is covered in this document:
https://cloud.google.com/sql/docs/import-export
You can use mysqldump to back up a specific table, or back up the entire database and restore only specific tables, using the solutions offered in these links (a sketch follows below):
Can I restore a single table from a full mysql mysqldump file?
How to take backup of a single table in a MySQL database?
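As a sketch, dumping one table and loading it elsewhere with mysqldump might look like this; the hosts, user, database and table names are placeholders (for Cloud SQL the host would be the instance's IP address or the Cloud SQL proxy):

```
# Dump just one table from the source database
mysqldump -h 203.0.113.10 -u myuser -p mydb customers > customers.sql

# Load it into another database/instance
mysql -h 203.0.113.20 -u myuser -p otherdb < customers.sql
```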
You can restore the backup to a different instance and then replace the data on the original instance with the data from the backup instance.