Is it possible to import postgresql custom-format dump file in cloud sql without using pg_restore? - postgresql

I downloaded a PostgreSQL .dmp file from the ChEMBL database and want to import it into GCP Cloud SQL.
When I try the import through the console or with the gcloud command, I get the following error:
Importing data into Cloud SQL instance...failed.
ERROR: (gcloud.sql.import.sql) [ERROR_RDBMS] exit status 1
The input is a PostgreSQL custom-format dump.
Use the pg_restore command-line client to restore this dump to a database.
Can I import custom-format dmp files without using the pg_restore command?
https://cloud.google.com/sql/docs/postgres/import-export/importing
That page describes pg_restore, but I couldn't get it to work.
For custom-format files, do I have to upload them to Cloud Shell and run pg_restore from there?

According to the CloudSQL docs:
Only plain SQL format is supported by the Cloud SQL Admin API.
The custom format is allowed if the dump file is intended for use with pg_restore.
If you cannot use pg_restore for some reason, I would spin up a local Postgres instance (e.g., on your laptop) and use pg_restore to restore the dump there.
After loading your local database, you can use pg_dump to dump it to a file in plain SQL format, then load that into Cloud SQL with the console or the gcloud command.
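A rough sketch of that round trip, assuming a local Postgres is already running and that the file, bucket, instance, and database names below are placeholders:

# restore the custom-format dump into a scratch local database
createdb chembl_local
pg_restore --dbname chembl_local --no-owner --no-privileges your_dump.dmp

# re-dump it as plain SQL, which the Cloud SQL import accepts
pg_dump --format=plain --no-owner --no-privileges chembl_local > chembl_plain.sql

# upload to Cloud Storage and import into the Cloud SQL instance
gsutil cp chembl_plain.sql gs://your-bucket/chembl_plain.sql
gcloud sql import sql your-instance gs://your-bucket/chembl_plain.sql --database=your_database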

Related

restore db postgres - format .gz

I am trying to restore a database on my local machine. I downloaded this database from the server as file.gz; inside the archive is a file named file.out. How can I restore the entire structure on my computer through pgAdmin or via the console?
If I understand correctly, I need to use the pg_restore command, but all my attempts end with a syntax error.
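A rough sketch of the usual approach, assuming file.out turns out to be a custom-format dump (pg_restore will refuse it otherwise) and that the database name is a placeholder:

# unpack the archive (the exact command depends on how it was created)
gunzip file.gz

# restore the custom-format dump into a fresh database
createdb restored_db
pg_restore --dbname restored_db --no-owner file.out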

How to automate using a production postgres database backup in local Flask environment

We use Postgres and Flask for our website, and we use the production database dump locally pretty often. To get a fresh dump, I use a remote desktop connection (RDC) to connect to pgAdmin, then use RDC again to copy the .bak file from the server and save it locally. Likewise, I use a local instance of pgAdmin to restore the database state from the backup.
My manager asked me to automate this process to use production database each time when a local Flask instance is launched. How can I do that?
You could write a shell script that dumps the database to a local file using pg_dump, then use pg_restore to build a new local database from that dump. You could probably even just pipe the output of pg_dump straight into pg_restore... something like
pg_dump --format=custom --host <remote-database-host> --dbname <remote-database-name> --username <remote-username> | pg_restore --clean --if-exists --host <local-database-host> --dbname <local-database-name> --username <local-username>
To get your password into pg_dump / pg_restore you'll probably want to use a .pgpass file, as described here: How to pass in password to pg_dump?
If you want this to happen automatically when you launch a Flask instance locally, you could call the shell script from your initialization code with a subprocess call whenever a LOCAL_INSTANCE environment variable is set, or something along those lines.
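A minimal sketch of such a script, assuming the passwords live in a .pgpass file and that every host, database, and user name below is a placeholder:

#!/bin/sh
# refresh_local_db.sh -- hypothetical helper; only runs when LOCAL_INSTANCE is set
if [ -n "$LOCAL_INSTANCE" ]; then
  pg_dump --format=custom --host prod-db.example.com --dbname myapp --username app_user \
    | pg_restore --clean --if-exists --host localhost --dbname myapp_dev --username dev_user
fi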

How to export result set as CSV from Aurora Postgres DB to AWS-S3?

As part of my Flask and Celery application, I'm trying to move data from AWS-Aurora Postgres DB to Redshift.
I'll be running this application in Kubernetes.
My approach is to query the Aurora Postgres database, write the result set to a CSV file saved on an attached volume, upload that file to S3, and then import it into Redshift.
However, I came across an article that describes uploading the result set directly to S3 as a CSV file, without the intermediate volume.
https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html
It describes the OUTFILE command, but only for MySQL; nothing is said about Postgres. Is it even possible to use that command on an Aurora Postgres DB to export to S3?
If you can connect to the database with psql, you can use the \copy command to export the output of any SELECT statement to a CSV:
https://codeburst.io/two-handy-examples-of-the-psql-copy-meta-command-2feaefd5dd90
https://dba.stackexchange.com/questions/7651/postgres-client-copy-copy-command-doesnt-have-access-to-a-temporary-table
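For example, a rough sketch with placeholder connection details, query, and file name:

# export the result of an arbitrary query from Aurora Postgres to a local CSV
psql -h your-aurora-endpoint -U your_user -d your_database \
  -c "\copy (your_query) TO 'output.csv' WITH CSV HEADER"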
The SELECT ... INTO OUTFILE S3 integration described on that page is specific to Aurora MySQL, so it won't work on Aurora Postgres.

Export Database from Google Cloud Sql to external Database

I'm trying to export my database created in Google Cloud SQL and import it into a new external server.
I tried to create a SQL backup through the Google console, downloaded it, copied it to the new server via FileZilla, and then launched the following command:
psql -U postgres -d ciclods-db -1 -f Backup-db_Cloud_SQL_Export_2019-03-23\ \(17_01_19\)
but I get this output:
ERROR: role "cloudsqladmin" does not exist
REVOKE
ERROR: role
"cloudsqlsuperuser" does not exist GRANT
What is the right procedure to follow in these cases?
I resolved the same problem by locating and deleting the two lines in the exported SQL file that mention "cloudsqladmin". My app does not use that role anyway.
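A hypothetical sketch of the same clean-up done with sed (the export file name is a placeholder, and -i.bak keeps a backup of the original):

# drop every statement that references the Cloud SQL-specific roles
sed -i.bak '/cloudsqladmin/d; /cloudsqlsuperuser/d' your_export.sql
psql -U postgres -d ciclods-db -f your_export.sql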
To do this task you can follow the official GCP guide on how to export data from Cloud SQL [1]; that document gives you the option to export the data into a dump file or into CSV files that can be used by other tools.
https://cloud.google.com/sql/docs/mysql/import-export/exporting
To create the export file you have to do it from the command line and use additional flags. The documentation's "Exporting data to a SQL dump file" section also covers exporting data from an externally-managed database server.
There you can also find the option to export the data into a CSV file.
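For the export itself, a rough sketch of the gcloud command, with placeholder instance and bucket names:

# export the Cloud SQL database to a SQL dump file in Cloud Storage, then copy it down
gcloud sql export sql your-instance gs://your-bucket/ciclods-export.sql --database=ciclods-db
gsutil cp gs://your-bucket/ciclods-export.sql .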

Dump Contents of RDS Postgres Query

Short Version of this Question:
I'd like to dump the contents of a Postgres query from a db instance hosted in RDS inside of a shell script.
Complete Version:
Right now I'm writing a shell script that should dump the contents of a query from a source database into a .dump file and then run that dump file against a destination database instance. Both DB instances are hosted in RDS.
MySQL allows you to do this using the mysqldump tool, but the recommended answer to this problem in Postgres seems to be the COPY command. However, the COPY command isn't available on RDS instances. The recommended solution in that case seems to be the '\copy' command, which does the same thing locally using the psql tool. However, it doesn't seem like this is a supported option inside of a shell script.
What's the best way to accomplish this?
Thank you!
I am not familiar with shell scripting, but I have used a batch file on Windows to dump the output of a query to a file and to import that file on another instance.
Here is what I used to export from Postgres RDS to a file on Windows.
SET PGPASSWORD=your_password
cd "C:\Program Files (x86)\pgAdmin 4\v3\runtime"
psql -h your_host -U your_username -d your_databasename -c "\copy (your_query) TO path\file_name.sql"
All of the above commands are in one batch file.
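Since the original question asked for a shell script, here is a rough bash equivalent under the same assumptions (hosts, credentials, query, and table names are all placeholders); it exports the query result from the source RDS instance and loads it into an existing table on the destination instance:

#!/bin/bash
# export the query result from the source instance to a local CSV
PGPASSWORD=your_source_password psql -h source-host.rds.amazonaws.com -U your_user -d source_db \
  -c "\copy (your_query) TO 'result.csv' WITH CSV HEADER"

# load the CSV into an existing table on the destination instance
PGPASSWORD=your_dest_password psql -h dest-host.rds.amazonaws.com -U your_user -d dest_db \
  -c "\copy destination_table FROM 'result.csv' WITH CSV HEADER"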