I tried taking a backup of dashDB from the Bluemix cloud using Data Studio, but I am getting the error 'Remote target is unreachable.'
Since this is an admin activity, I assume it should be done on the server. As this is a cloud server, I am trying to understand how this can be done.
Are there any tools that support SSH to the server, and how can I take a backup of the DB? Any documentation in this regard?
Thanks
Which plan are you using?
Have you read the backup details in the FAQ?
This section might help:
Is my data backed up? Encrypted backups on the full Db2 managed service database are done daily. For the Db2 Warehouse on Cloud Flex Performance plan, the last 7 daily backups are retained. For all other Db2 Warehouse on Cloud plans, the last 2 daily backups are retained. For Db2 on Cloud, the last 14 daily backups are retained. In the Db2 Warehouse on Cloud Flex Performance plan, you can restore your database from any of your retained backups at any time that you choose. In the case of all of the other Db2 Warehouse on Cloud plans, the retained backups are used exclusively by IBM for system recovery purposes only, in the event of a disaster or system loss. A request to restore your database from a backup is not supported. You can export your data using Db2 tools such as IBM Data Studio or by using the db2 export command.
For Db2 on Cloud, backups can be stored off site in a different data center or region upon request to IBM Support. These backups are also used exclusively by IBM to recover from disaster or system loss events only. A request to restore your database from a backup is not supported.
Are there any tools that support SSH to the server, and how can I take a backup of the DB? Any documentation in this regard?
SSH is not supported as this is a managed service. This is documented in the FAQ:
How do I access the Db2 managed service database?
You can access your Db2 managed service database through the web console from a browser, or you can connect to the database with a client connection such as JDBC, ODBC, CLI, CLP, or CLPPlus. A direct login to the server with Telnet or a Secure Shell (ssh) is not supported.
If you want to take your own backups at the interval of your choice, exporting your data is your best option. That can be done from the web console or from a database client.
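For example, a minimal sketch of such an export with the Db2 command line client might look like the following (BLUDB is the usual dashDB database name; the node name, host, port, credentials, schema, and table are placeholders taken from your service credentials):
- db2 catalog tcpip node dashnode remote <hostname_from_credentials> server <port_from_credentials>
- db2 catalog database BLUDB at node dashnode
- db2 connect to BLUDB user <username> using <password>
- db2 "export to mytable.del of del select * from MYSCHEMA.MYTABLE"
Repeating the export per table (or scripting it) gives you a point-in-time copy on whatever schedule you choose.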
Alternatively, if what you're after is access to historical data, you can use time travel query.
Related
We are trying to devise a backup strategy for Azure Database for PostgreSQL Flexible Server. Can we use the native pg_dumpall as a backup and restore mechanism? I am able to take a dumpall by excluding the Azure default database (as we are not superusers on the managed server):
pg_dumpall - Azure Database for PostgreSQL - permission denied for database "azure_maintenance"
Can we somehow restore the dump that we create into the managed Postgres Flexible Server? I am unable to find any support in this regard.
You should be able to restore the dump to Flexible Server; see if any of the docs below help (a rough per-database sketch follows after them). As for the link you shared: yes, we do not allow a dump-all, because some of the internal DBs are not exposed to customers due to security.
Upgrade using dump and restore - Azure Database for PostgreSQL | Microsoft Docs
Dump and restore - Azure Database for PostgreSQL - Single Server | Microsoft Docs
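As a rough per-database sketch (the server names, admin user, and database name below are placeholders, not values from the question), dumping each database with pg_dump and restoring it with pg_restore avoids touching the internal azure_maintenance database that pg_dumpall trips over:
- pg_dump -h source-server.postgres.database.azure.com -U myadmin -d mydb -Fc -f mydb.dump
- pg_restore -h target-server.postgres.database.azure.com -U myadmin -d mydb --no-owner --no-privileges mydb.dump
Note that the target database must already exist (CREATE DATABASE mydb;) and that roles are not included in a per-database dump, so they have to be recreated on the target separately.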
I have a C# ASP.NET MVC based web application with functionality for archiving (backing up) and restoring databases at the click of a button. Currently, our system is on-premises, so we are using stored procedures in SQL Server to do the backup and restore.
Now we are migrating to Azure PaaS and are using an Elastic Pool. As the backup and restore T-SQL commands do not work in Azure SQL, can someone please help me find a way to back up and restore DBs manually through T-SQL?
Any help would be highly appreciated!
Automatic backups are a feature of Azure SQL Database.
In Azure SQL, database backups are executed automatically and it is not possible to change this behavior. This is a service offered when you create an Azure SQL database: the first full backup occurs immediately after you create a new database, and the rest of the backups are scheduled by Azure SQL itself.
There is no way to back up and restore databases manually through T-SQL.
Hope this helps.
Thanks for the quick answer, Leon! :)
So that it can be helpful to other community members, I would like to mention another way to achieve the same result, since it can't be done with T-SQL.
What I have done instead is use the SqlPackage.exe utility embedded in a batch script and execute that batch script through my C# code to achieve the same functionality.
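To illustrate that approach (the server, database, user, and file names here are placeholders rather than the actual configuration), the batch script essentially exports and imports a BACPAC with SqlPackage:
- SqlPackage.exe /Action:Export /SourceServerName:myserver.database.windows.net /SourceDatabaseName:MyDb /SourceUser:myadmin /SourcePassword:%DB_PASSWORD% /TargetFile:MyDb.bacpac
- SqlPackage.exe /Action:Import /TargetServerName:myserver.database.windows.net /TargetDatabaseName:MyDb_Restored /TargetUser:myadmin /TargetPassword:%DB_PASSWORD% /SourceFile:MyDb.bacpac
The C# code then just launches the batch file with System.Diagnostics.Process.Start and waits for it to finish.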
We need to check from a client application whether OrientDB is creating backups.
Is there any way to check this?
OrientDB supports backup and restore operations, like any database management system.
A backup can be performed by running the BACKUP DATABASE command (a minimal console sketch follows below). It's also possible to automate backups using the Automatic Backup server plugin by modifying orientdb-server-config.xml.
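For example, a one-off backup from the OrientDB console looks roughly like this (the database path, credentials, and output file are placeholders, not values from the question):
    orientdb> CONNECT plocal:../databases/mydb admin admin
    orientdb> BACKUP DATABASE /tmp/mydb-backup.zip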
In orientdb-server-config.xml you should see the scheduled backup configuration, something like this:
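(The snippet below is illustrative; the exact parameter set varies with your OrientDB version.)
    <handler class="com.orientechnologies.orient.server.handler.OAutomaticBackup">
        <parameters>
            <parameter name="enabled" value="true"/>
            <parameter name="delay" value="4h"/>
            <parameter name="firstTime" value="23:00:00"/>
            <parameter name="target.directory" value="backup"/>
            <parameter name="target.fileName" value="${DBNAME}-${DATE:yyyyMMddHHmmss}.zip"/>
            <parameter name="db.include" value=""/>
            <parameter name="db.exclude" value=""/>
        </parameters>
    </handler>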
If you haven't configured these parameters in that file, automatic backup is probably not enabled.
If you're using the Enterprise Edition, you can also check the backup settings in the Server Management Area.
Hope it helps
Regards
IBM is pulling the SQLDB service on Bluemix. Does anybody know how I can move my stored data, and what my options are in terms of services?
Migration Options and Information
The migration options we suggest are Compose PostgreSQL or DB2 on Cloud. DB2 on Cloud is a self-managed offering. Compose PostgreSQL Enterprise is offered as fully managed or self-managed, while the multi-tenancy version is only offered as fully managed. Compose will soon be delivered as an IBM Branded Service, meaning that you will not need a separate account on Compose.io.
What are the plans for a free SQL Database Service? We are moving away from offering free SQL Database services. The Compose PostgreSQL multi-tenancy offering is a metered service so you pay for what you use. If you have minimal usage you will find the charges are nominal.
What tools do you recommend for data migration? We suggest looking at DataWorks Forge and DataWorks Lift as the tools to use for migration.
Steps to Migrate
Export DDL from SQLDB
Apply the DDL from SQLDB to the target without triggers, stored procedures, and UDFs. If you are using a tool like DataWorks Lift or DataWorks Forge, the DDL file will be the input into the tool.
Migrate data from SQLDB to target.
Exporting DDL from SQLDB
Sign in with your IBM ID and download the free version of the Db2 client:
- URL http://www-01.ibm.com/support/docview.wss?uid=swg21385217
Get the VCAP information for SQL Database from Bluemix. Document the host name, port, database name, user name, and password.
On the command line within the Db2 client, execute the following commands:
- db2 catalog tcpip node "any_name_you_want" remote "publicipaddress" server "port from VCAP"
- db2 catalog database "databasename" at node "the name from above"
- db2look -d "database name" -i "user name from VCAP" -w "password from VCAP" -e -o "output file"
The output file will contain the DDL from SQLDB
I am trying to restore a DB2 backup file into my Bluemix dashDB service. How do I go about doing this?
You cannot restore your DB2 backup image into dashDB for several reasons.
In an entry-level, shared dashDB instance you only have access to one schema in a physical database shared by others.
Even if you have a dedicated instance, you need 1) access to the database server's local disk to upload the image and 2) sufficient privileges (at least SYSMAINT authority) to perform the restore. I doubt either will be available to you.
What you can do is run db2look and db2move locally to extract your database DDL statements and data respectively. You can then run the extracted DDL script against dashDB provided you replace the original schema name(s) with the one available to you in dashDB and, after creating the tables, load your data into them.
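As a rough sketch of that approach run against your local DB2 instance (the local database name, schema, and file names are placeholders):
- db2look -d MYLOCALDB -z MYSCHEMA -e -o mydb_ddl.sql
- db2move MYLOCALDB export -sn MYSCHEMA
Then edit mydb_ddl.sql to replace MYSCHEMA with the schema assigned to you in dashDB, run the script against dashDB, and load the exported data files, for example via the dashDB web console's load feature or a client-side IMPORT/LOAD.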