Export schema from Autonomous DB with APEX workload

I have an Autonomous Database on OCI with the APEX workload type, so there is no wallet and no way to connect with SQL Developer or any other client.
The expdp command in Cloud Shell seems to be the only way, but again the APEX workload doesn't allow wallet connections.
Is there any other way to export a schema?
Thanks
In Database Actions I can only import schemas.

Related

Use Terraform on Google Cloud SQL Postgres to create a Replication Slot

Overall, I'm trying to create a Datastream connection to a Postgres database in Cloud SQL.
As I'm trying to configure it all through Terraform, I'm stuck on how to create a replication slot. This guide explains how to do it through the Postgres client by running SQL commands, but I thought there might be a way to do it directly in the Terraform configuration.
Example SQL that I would like to replicate in Terraform:
ALTER USER [CURRENT_USER] WITH REPLICATION;
CREATE PUBLICATION [PUBLICATION_NAME] FOR ALL TABLES;
SELECT PG_CREATE_LOGICAL_REPLICATION_SLOT('[REPLICATION_SLOT_NAME]', 'pgoutput');
If not, does anyone know how to run these Postgres SQL commands against the Cloud SQL database through Terraform?
I have set up the Datastream and Postgres connections for all the other parts. I'm expecting that there is a Terraform setting I'm missing, or some way to run Postgres commands against the Google Cloud SQL Postgres database.
Unfortunately, there is no Terraform resource for specifying a replication slot on a google_sql_database_instance.
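As a workaround, the SQL can be run out of band and triggered from Terraform with a null_resource and a local-exec provisioner. A minimal sketch, assuming psql can reach the instance; the host, database, user, and the publication/slot names below are placeholders:

# Run once per database; could be wrapped in a Terraform
# null_resource with a local-exec provisioner.
export PGPASSWORD="$CLOUDSQL_PASSWORD"
psql -v ON_ERROR_STOP=1 "host=10.0.0.5 dbname=mydb user=datastream_user sslmode=require" <<'SQL'
ALTER USER CURRENT_USER WITH REPLICATION;
CREATE PUBLICATION datastream_pub FOR ALL TABLES;
SELECT PG_CREATE_LOGICAL_REPLICATION_SLOT('datastream_slot', 'pgoutput');
SQL

Note that a local-exec provisioner only runs when the null_resource is created, so a change to these statements means tainting or recreating that resource.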

Pool Cloud SQL PostgreSQL connections using PgBouncer

We are new to Google Cloud SQL and have been trying to integrate PgBouncer with Google Cloud SQL Postgres, authenticating database users with a SECURITY DEFINER function (which queries pg_shadow).
Our configuration:
Server -> PgBouncer + Cloud SQL Proxy (sidecar) -> Cloud SQL Postgres
Problem:
Cloud SQL Postgres does not allow pg_shadow to be read even by a privileged user (the postgres user is not a superuser). This makes it impossible to set up PgBouncer with a SECURITY DEFINER function (see the sketch below).
Cloud SQL does not let customers use the superuser (cloudsqladmin).
We've read through many articles (mostly cloud-sql-proxy issues) that suggest using PgBouncer but do not elaborate on the problem above.
Options not applicable:
Application-level pooling (not feasible for us right now)
Authenticating using an auth_file, e.g. users_list.txt (not recommended; needs manual management of database users)
What we are looking for:
We intend to run a single instance of cloud-sql-proxy and PgBouncer that proxies and pools connections to the Cloud SQL Postgres database.
We would appreciate your help!
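For reference, this is roughly the SECURITY DEFINER auth_query setup being attempted (role and function names are illustrative). On self-managed Postgres it works because the function owner is a superuser; on Cloud SQL the SELECT from pg_shadow inside the function is exactly the part that is rejected:

# Create the lookup role and the SECURITY DEFINER function.
psql "host=127.0.0.1 dbname=mydb user=postgres" <<'SQL'
CREATE ROLE pgbouncer LOGIN PASSWORD 'change_me';
CREATE FUNCTION public.pgbouncer_get_auth(p_usename TEXT)
RETURNS TABLE(username TEXT, password TEXT) AS $$
  SELECT usename::text, passwd
  FROM pg_catalog.pg_shadow
  WHERE usename = p_usename;
$$ LANGUAGE sql SECURITY DEFINER;
REVOKE ALL ON FUNCTION public.pgbouncer_get_auth(TEXT) FROM PUBLIC;
GRANT EXECUTE ON FUNCTION public.pgbouncer_get_auth(TEXT) TO pgbouncer;
SQL

# pgbouncer.ini would then reference it:
#   auth_user  = pgbouncer
#   auth_query = SELECT username, password FROM public.pgbouncer_get_auth($1)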

PostgreSQL data migration from one database to other database in AWS

I have an old Postgres database which is not cloud based, and I want to migrate its data to a new database in AWS.
Can this be done via dblink, or what are the other best practices for doing this?
You can migrate databases to AWS via AWS Database Migration Service (DMS). It's a fully managed tool that helps you move your data from on premises to AWS. You can read more about it here: https://aws.amazon.com/dms/?nc=sn&loc=1.
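If you go the DMS route, the flow can also be scripted with the AWS CLI. A rough sketch, assuming a replication instance already exists; the identifiers, hosts, and credentials are placeholders:

# Source endpoint: the old on-premises Postgres.
aws dms create-endpoint --endpoint-identifier src-pg --endpoint-type source \
  --engine-name postgres --server-name old-db.example.com --port 5432 \
  --username dbuser --password 'REPLACE_ME' --database-name mydb

# Target endpoint: the new Postgres in AWS (e.g. RDS).
aws dms create-endpoint --endpoint-identifier tgt-pg --endpoint-type target \
  --engine-name postgres --server-name new-db.example.rds.amazonaws.com --port 5432 \
  --username dbuser --password 'REPLACE_ME' --database-name mydb

# Full-load task; table-mappings.json selects the schemas/tables to copy.
aws dms create-replication-task --replication-task-identifier pg-migration \
  --source-endpoint-arn "$SRC_ARN" --target-endpoint-arn "$TGT_ARN" \
  --replication-instance-arn "$RI_ARN" --migration-type full-load \
  --table-mappings file://table-mappings.json

aws dms start-replication-task --replication-task-arn "$TASK_ARN" \
  --start-replication-task-type start-replication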

Backup of dashDB on Bluemix - what options are available

I tried taking a backup of dashDB on Bluemix using Data Studio and got the error 'Remote target is unreachable.'
Since this is an admin activity, I assume it should be done on the server. As this is a cloud server, I am trying to understand how this can be done.
Are there any tools which support SSH to the server, and how do I take a backup of the DB? Is there any documentation in this regard?
Thanks
Which plan are you using?
Have you read the details of backups in the FAQ? This section might help:
Is my data backed up? Encrypted backups on the full Db2 managed service database are done daily. For the Db2 Warehouse on Cloud Flex Performance plan, the last 7 daily backups are retained. For all other Db2 Warehouse on Cloud plans, the last 2 daily backups are retained. For Db2 on Cloud, the last 14 daily backups are retained. In the Db2 Warehouse on Cloud Flex Performance plan, you can restore your database from any of your retained backups at any time that you choose. In the case of all of the other Db2 Warehouse on Cloud plans, the retained backups are used exclusively by IBM for system recovery purposes in the event of a disaster or system loss. A request to restore your database from a backup is not supported. You can export your data using Db2 tools such as IBM Data Studio or by using the db2 export command.
For Db2 on Cloud, backups can be stored off site in a different data center or region upon request to IBM Support. These backups are also used exclusively by IBM to recover from disaster or system loss events. A request to restore your database from a backup is not supported.
Are there any tools which support SSH to the server, and how do I take a backup of the DB? Is there any documentation in this regard?
SSH is not supported, as this is a managed service. This is documented in the FAQ:
How do I access the Db2 managed service database?
You can access your Db2 managed service database through the web console from a browser, or you can connect to the database with a client connection such as JDBC, ODBC, CLI, CLP, or CLPPlus. A direct login to the server with Telnet or a Secure Shell (ssh) is not supported.
If you want to take your own backups at the interval of your choice, exporting your data is your best option. That can be done from the web console or from a database client.
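For example, a per-table export from the Db2 command line (the database name is typically BLUDB for this service; the table and file names here are illustrative):

# Connect with the credentials from your service, then export table by table.
db2 connect to BLUDB user "username" using "password"
db2 "EXPORT TO orders.csv OF DEL SELECT * FROM MYSCHEMA.ORDERS"
db2 connect reset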
Alternatively, if what you're after is access to historical data, you can use time travel query.

How to migrate off SQLDB service on IBM Bluemix?

IBM is pulling the SQLDB service on Bluemix. Does anybody know how I can move my stored data, and what my options are in terms of services?
Migration Options and Information
The migration options we suggest are Compose PostgreSQL or DB2 on Cloud. DB2 on Cloud is a self-managed offering. Compose PostgreSQL Enterprise is offered as fully managed or self-managed, while the multi-tenancy version is only offered as fully managed. Compose will soon be delivered as an IBM Branded Service, meaning that you will not need a separate account on Compose.io.
What are the plans for a free SQL Database service? We are moving away from offering free SQL Database services. The Compose PostgreSQL multi-tenancy offering is a metered service, so you pay for what you use. If you have minimal usage you will find the charges are nominal.
What tools do you recommend for data migration? We suggest looking at DataWorks Forge and DataWorks Lift as the tools to use for migration.
Steps to Migrate
Export DDL from SQLDB
Apply the DDL from SQLDB to the target without triggers, stored procedures, and UDFs. If you are using a tool like DataWorks Lift or DataWorks Forge, the DDL file will be the input to the tool.
Migrate data from SQLDB to target.
Exporting DDL from SQLDB
Sign in with your IBM ID and download the free version of the Db2 client
- URL http://www-01.ibm.com/support/docview.wss?uid=swg21385217
Get the VCAP information for SQL Database from Bluemix. Note the host name and database name.
On the command line within the Db2 client, execute the following commands:
- db2 catalog tcpip node "any_name_you_want" remote "publicipaddress" server 50000
- db2 catalog database "databasename" at node "the name from above"
- db2look -d "database name" -i "user name from VCAP" -w "password from VCAP" -e -o "output file"
The output file will contain the DDL from SQLDB.
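If you migrate by hand rather than with DataWorks, one possible continuation is to apply that DDL to the target and then move the data table by table with Db2 EXPORT/IMPORT. A sketch with placeholder names:

# Apply the generated DDL to the target (minus triggers, stored procedures, and UDFs).
db2 connect to "target database" user "user" using "password"
db2 -tvf "output file"

# Move the data: export from SQLDB in IXF format, then import into the target.
db2 connect to "sqldb database" user "user name from VCAP" using "password from VCAP"
db2 "EXPORT TO mytable.ixf OF IXF SELECT * FROM MYSCHEMA.MYTABLE"
db2 connect to "target database" user "user" using "password"
db2 "IMPORT FROM mytable.ixf OF IXF INSERT INTO MYSCHEMA.MYTABLE"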