How to migrate off SQLDB service on IBM Bluemix? - db2

IBM is pulling the SQLDB service on Bluemix. Does anybody know how I can move my stored data, and what are my options in terms of a replacement service?

Migration Options and Information
The migration options we suggest are Compose PostgreSQL or DB2 on Cloud. DB2 on Cloud is a self-managed offering. Compose PostgreSQL Enterprise is offered as fully-managed or self-managed, while the multi-tenancy version is only offered as fully-managed. Compose will soon be delivered as an IBM-branded service, meaning that you will not need a separate account on Compose.io.
What are the plans for a free SQL Database Service? We are moving away from offering free SQL Database services. The Compose PostgreSQL multi-tenancy offering is a metered service so you pay for what you use. If you have minimal usage you will find the charges are nominal.
What tools do you recommend for data migration? We suggest looking at DataWorks Forge and DataWorks Lift as the tools to use for migration.
Steps to Migrate
Export DDL from SQLDB
Apply the DDL from SQLDB to the target, without triggers, stored procedures, and UDFs. If you are using a tool like DataWorks Lift or DataWorks Forge, the DDL file will be the input to the tool.
Migrate data from SQLDB to target.
Exporting DDL from SQLDB
Sign in with your IBM ID and download the free version of the Db2 client
- URL http://www-01.ibm.com/support/docview.wss?uid=swg21385217
Get VCAP information for SQL Database from Bluemix. Document the host name and database name.
On command line within db2client, execute the following commands:
- db2 catalog tcpip node "any_name_you_want" remote "publicipaddress" server 50000
- db2 catalog database "databasename" at node "the name from above"
- db2look -d "database name" -i "user name from VCAP" -w "password from VCAP" -e -o "output file"
The output file will contain the DDL from SQLDB
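For example, with hypothetical values (node name sqldbnode, public IP 75.126.1.2, database SQLDB, and a user and password taken from VCAP), the full sequence might look like this:
# catalog the remote node and database, then extract the DDL with db2look
db2 catalog tcpip node sqldbnode remote 75.126.1.2 server 50000
db2 catalog database SQLDB at node sqldbnode
db2look -d SQLDB -i myuser -w mypassword -e -o sqldb_ddl.sql
The resulting sqldb_ddl.sql file can then be fed into DataWorks Lift or applied to the target manually.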

Related

Load data from Azure PostgreSQL to Azure Analysis Services (AAS)

I have an Azure Database for PostgreSQL and want to set up my Azure Analysis Services instance with this Postgres on Azure as a data source.
I'm not sure whether PostgreSQL on Azure is supported by AAS; I am getting an error when trying to connect (screenshot of the error not shown here).
I tried installing the Npgsql extension, but that did not resolve the issue.
My assumption is that I need to install the on-premises gateway to be able to connect to this Azure PostgreSQL db - can anybody confirm this is the right direction and that it will resolve the connection issue?
According to KranthiPakala:
Azure Database for PostgreSQL is not a supported data source of AAS.
You can connect to on-prem PostgreSQL as a data source for import or in-memory tabular models. To do this, select the ODBC data source option (start a new import connection, choose the ODBC option, then pick the ODBC source name) and AAS will treat it like SQL Server.
References: Data sources supported in Azure Analysis Services and Specify provider data sources in tabular 1400 and higher model projects
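As a rough sketch, a system DSN for an on-prem PostgreSQL server using the psqlODBC driver might look like the following odbc.ini entry (the host, database, and user are hypothetical, and exact key names can vary with your driver version):
[PostgresForAAS]
Driver = PostgreSQL Unicode
Servername = my-onprem-host
Port = 5432
Database = mydb
Username = myuser
In AAS you would then pick the DSN name PostgresForAAS when creating the ODBC data source.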

Can we use pg_dumpall for Azure managed flexible Server

We are trying to devise a backup strategy for Azure Flexible Server. Can we use the native pg_dumpall as a backup and restore mechanism? I am able to take a dump with pg_dumpall by excluding the Azure default database (as we are not superusers on a managed server):
pg_dumpall - Azure Database for PostgreSQL - permission denied for database "azure_maintenance"
Can we somehow restore the dump that we create into the managed Postgres flexible server? I am unable to find any support in this regard.
You should be able to restore the dump to Flexible Server; see if any of these docs help. As for the link you shared: yes, we do not allow dumping all databases, as some of the internal DBs are not exposed to customers due to security.
Upgrade using dump and restore - Azure Database for PostgreSQL | Microsoft Docs
Dump and restore - Azure Database for PostgreSQL - Single Server | Microsoft Docs
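As a sketch under those constraints, dumping globals and each user database individually (rather than running pg_dumpall against everything) avoids the internal databases; the server names, user, and database names below are placeholders:
# roles and other globals only; some objects may still require rights you lack on a managed server
pg_dumpall --globals-only -h mysource.postgres.database.azure.com -U myadmin -f globals.sql
# dump one user database in custom format
pg_dump -Fc -h mysource.postgres.database.azure.com -U myadmin -d mydb -f mydb.dump
# restore into the target flexible server, skipping ownership changes
pg_restore -h mytarget.postgres.database.azure.com -U myadmin -d mydb --no-owner mydb.dump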

How to change default database name BLUDB in IBM Db2 Warehouse on Cloud?

Whenever I create a Db2 service in IBM Cloud, it takes the default database name BLUDB. Can I change it to a user-specific name like TESTDB?
For most service plans of Db2 Warehouse on Cloud (formerly dashDB) there is only a single database, and its name is preset to BLUDB for simplicity. If you want to have more control, you could go with Db2 Hosted on IBM Cloud.
Alternatively, if you are already cataloguing the database locally, you could add an alias.
For example:
db2 catalog tcpip node mynode remote dashdb-myinstance.bluemix.net server 50000
db2 catalog database bludb as testdb at node mynode
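After cataloguing, you can connect using the alias just as if it were the database name (the user and password here are placeholders):
db2 connect to testdb user myuser using mypassword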

Backup of Dashdb on Bluemix - what options available

I tried taking a backup of dashDB on Bluemix using Data Studio and I am getting the error 'Remote target is unreachable.'
Since this is an admin activity, I assume it should be done on the server. As this is a cloud server, I am trying to understand how this can be done!
Are there any tools which support SSH to the server, and how do I take a backup of the db? Any documentation in this regard?
Thanks
Which plan are you using?
Have you read the details of backup in the FAQ?
This section might help:
Is my data backed up? Encrypted backups on the full Db2 managed service database are done daily. For the Db2 Warehouse on Cloud Flex Performance plan, the last 7 daily backups are retained. For all other Db2 Warehouse on Cloud plans, the last 2 daily backups are retained. For Db2 on Cloud, the last 14 daily backups are retained.
In the Db2 Warehouse on Cloud Flex Performance plan, you can restore your database from any of your retained backups at any time that you choose. In the case of all of the other Db2 Warehouse on Cloud plans, the retained backups are used exclusively by IBM for system recovery purposes in the event of a disaster or system loss. A request to restore your database from a backup is not supported. You can export your data using Db2 tools such as IBM Data Studio or by using the db2 export command.
For Db2 on Cloud, backups can be stored off site in a different data center or region upon request to IBM Support. These backups are also used exclusively by IBM to recover from disaster or system loss events. A request to restore your database from a backup is not supported.
Are there any tools which support SSH to the server, and how do I take a backup of the db? Any documentation in this regard?
SSH is not supported as this is a managed service. This is documented in the FAQ:
How do I access the Db2 managed service database?
You can access your Db2 managed service database through the web console from a browser, or you can connect to the database with a client connection such as JDBC, ODBC, CLI, CLP, or CLPPlus. A direct login to the server with Telnet or a Secure Shell (ssh) is not supported.
If you want to take your own backups at the interval of your choice, exporting your data is your best option. That can be done from the web console or from a database client.
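For example, a minimal export from a Db2 CLP client might look like this (the table name and credentials are placeholders; the CSV file is written on the client machine):
db2 connect to bludb user myuser using mypassword
db2 "export to mytable.csv of del select * from mytable"
db2 connect reset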
Alternatively, if what you're after is access to historical data, you can use time travel query.

Can db2 import or load be used to populate DashDB?

I'm looking to bulk load millions of rows into a DashDB database. After connecting using the DB2 CLI, I enter a command like:
db2 import from rowsToImport.csv of del insert into MY_TABLE
with results:
SQL0551N "DASHXXX" does not have the required authorization or privilege to
perform operation "BIND" on object "NULLID.SQLUAJ19". SQLSTATE=42501
Is this an inherent limitation with DashDB, or is something configured incorrectly on my client? I get a similar message when trying db2 load:
SQL2019N An error occurred while utilities were being bound to the database.
p.s. I'm aware of the REST client API for DashDB for loading data - I'm asking specifically how/if bulk loads can be done with the DB2 command line as an alternate option.
As per the dashDB documentation, you can use the Command line processor plus (CLPPlus). It is included in the dashDB driver package and provides a command-line user interface that you can use to connect to the dashDB database, BLUDB. You can use CLPPlus to define, edit, and run statements, scripts, and commands. Please also take a look at Connecting CLPPlus to the dashDB database to see how to connect and use the CLI.
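A CLPPlus connection typically looks like this (the hostname is a placeholder for the one in your service credentials; you will be prompted for the password):
clpplus myuser@mydashdb-host.services.dal.bluemix.net:50000/BLUDB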
Please note that in CLPPlus the IMPORT, EXPORT, and LOAD commands have a restriction that processed files must be on the server: see here. So you would have to copy the input load file onto the remote server first with SCP. However, the SSH/SCP protocols are blocked (not accessible) for a normal dashDB user.
Only geospatial data can be loaded from your local machine to dashDB, using IDA LOADGEOSPATIALDATA command in CLPPlus.
The file to be loaded in dashDB using the above command can be in the local file system, accessible to the CLPPlus user.
Alternative ways to do that are:
dashDB REST API (as you already mentioned). See Load delimited data using the REST API and cURL.
load the csv directly from the dashDB dashboard on Bluemix. See Loading data from the desktop into IBM dashDB.
load the csv using IBM Data Studio. See dashDB large file load using IBM Data Studio.
According to this technote, the package NULLID.SQLUAJ19 belongs to one of the early DB2 10.1 fix packs, so I suspect your client version is 10.1. When attempting to execute the IMPORT command it needs to bind some packages of that older version, since dashDB is DB2 10.5, obviously.
You may want to try installing the latest DB2 client fix pack, as the necessary packages may be already bound in the database.
To verify that, you could run select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%' -- you should see "SQLUAK20", which seems to be the corresponding package in DB2 10.5.
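Run from a connected CLP session, that check could look like this (the credentials are placeholders):
db2 connect to bludb user myuser using mypassword
db2 "select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%'"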
If that doesn't work, your other option might be to move to a dedicated dashDB instance, as you won't have sufficient privileges to bind missing packages in the entry-level shared dashDB service.