How to export and import data from a Postgres database with Hasura?

I have the Hasura GraphQL engine running in a Docker container. My goal is to export all the data from my Postgres database so that my coworker can import it and work with the same data. What is the correct way to do this with Hasura?

You can treat it exactly like a normal Postgres database running in Docker; nothing is different, and you can use any kind of dump or import/export.
It helps to understand the way Hasura works: it simply takes a database as input. Hasura is not responsible for the database installation itself.
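For example, here is a minimal sketch of that idea, assuming the database container is named hasura-postgres and uses the default postgres user and database (both names are hypothetical; adjust them to your setup). Plain docker exec on the command line works just as well:

```python
import subprocess

CONTAINER = "hasura-postgres"  # hypothetical container name

# Dump schema + data from the container into a local custom-format file.
with open("hasura_db.dump", "wb") as out:
    subprocess.run(
        ["docker", "exec", CONTAINER,
         "pg_dump", "-U", "postgres", "-d", "postgres", "--format=custom"],
        stdout=out,
        check=True,
    )

# Your coworker restores the file into their own Postgres container.
with open("hasura_db.dump", "rb") as dump:
    subprocess.run(
        ["docker", "exec", "-i", CONTAINER,
         "pg_restore", "-U", "postgres", "-d", "postgres", "--clean"],
        stdin=dump,
        check=True,
    )
```

Depending on your Hasura version and configuration, Hasura's own metadata (the hdb_catalog schema) may live in the same database, in which case it travels with the dump and your coworker gets the tracked tables and permissions too.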

Related

Copy data from Postgres DB (GCP Project A) to another Postgres DB (GCP Project B)

I would be happy to get your help and feedback regarding this data load.
Goal:
Load source data from a Postgres database located in GCP project A into another Postgres database located in GCP project B.
Challenge:
Get a connection to the Postgres DB in GCP project A (I have an IAM account with sufficient rights to run a COPY TO / COPY FROM command) and copy the table either to a CSV file or to a dump that can then be inserted into the other Postgres DB in GCP project B.
How do I connect to the database with this IAM email account (e.g. if I create a key, where should I store the JSON key file, and would that approach even be feasible)?
Another way I've researched is psycopg2, so that I could use the function cursor.copy_expert, which needs neither superuser rights nor Postgres user credentials to copy the data; however, I didn't succeed in connecting to the database with psycopg2 due to challenges with the Cloud SQL Proxy.
Another idea was to use pg_dump or gcloud sql export csv.
I would be curious whether some of you have faced a similar challenge, how you solved it, and what the best way or practice might be.
You can try the Database Migration Service: set up a continuous migration configuration with Cloud SQL for PostgreSQL.
Hello, after a lot of searching I've come to these solutions:
If you need a continuous copy, use the Database Migration Service; check its documentation.
If you need a one-shot copy:
you can restore your instance into the other project; see the bottom of that documentation page
you can create a bucket, export a backup of your instance to it, and then import that file from the other project
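As for the psycopg2 route mentioned in the question: once a connection through the Cloud SQL Auth Proxy works, cursor.copy_expert can move a table without superuser rights. Here is a minimal sketch; the DSNs, local ports, credentials, and table name are all hypothetical:

```python
import io
import psycopg2

# Hypothetical DSNs: assumes the Cloud SQL Auth Proxy exposes project A's
# instance on local port 5431 and project B's instance on port 5432.
SRC_DSN = "host=127.0.0.1 port=5431 dbname=db_a user=copy_user password=secret"
DST_DSN = "host=127.0.0.1 port=5432 dbname=db_b user=copy_user password=secret"

buf = io.StringIO()  # fine for modest tables; use a temp file for big ones

src = psycopg2.connect(SRC_DSN)
with src, src.cursor() as cur:
    # COPY ... TO STDOUT needs no superuser rights, only SELECT on the table.
    cur.copy_expert("COPY my_table TO STDOUT WITH CSV", buf)
src.close()

buf.seek(0)

dst = psycopg2.connect(DST_DSN)
with dst, dst.cursor() as cur:
    # The target table must already exist with a matching column layout;
    # the with-block commits the transaction on exit.
    cur.copy_expert("COPY my_table FROM STDIN WITH CSV", buf)
dst.close()
```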

Is there any way to get a backup of a Postgres DB in a Docker container without generating an SQL file?

I am new to Docker, and I want to get a backup of my Postgres database running in Docker. All the solutions I have seen suggest generating a dump SQL script and restoring the DB by running that script, but I don't want to do this. Is it possible to back up and restore by migrating the binary files of the DB?
You can build a Postgres image from the plain, empty Postgres base image. In the Dockerfile you add an SQL script that runs on database initialization (docker-entrypoint-initdb.d). The SQL script contains a dblink to the database you backed up and commands like create table my_table as select * from my_table#remotedb (pseudo-syntax; in Postgres, dblink is actually a function call, as sketched below). After docker build you have an image containing a backup of your original database tables.
I do something similar with Oracle, with more complexity (copying only a subset of the original database, preserving indexes, etc.). The Oracle Docker image differs from the PG one in some properties, but I believe the rough idea is applicable. It has been a while since I worked with PG, so I won't advise you on how to migrate binary files (though I believe that would be possible too).
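For Postgres specifically, the core of such an init script is a dblink-based CREATE TABLE ... AS SELECT. Here is a rough sketch of the equivalent statements, driven from Python for illustration; the connection string, table name, and column list are hypothetical and must match the real remote table. The same two statements could also live in a .sql file under docker-entrypoint-initdb.d:

```python
import psycopg2

# Hypothetical connection details for the database being backed up.
REMOTE = "host=old-db-host port=5432 dbname=prod user=backup password=secret"

conn = psycopg2.connect("host=localhost dbname=postgres user=postgres")
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS dblink")
    # dblink() returns SETOF record, so the column list (AS t(...)) is
    # mandatory and must match the remote table's definition.
    cur.execute(
        """
        CREATE TABLE my_table AS
        SELECT * FROM dblink(%s, 'SELECT id, payload FROM my_table')
                      AS t(id integer, payload text)
        """,
        (REMOTE,),
    )
conn.close()
```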

Is there a way to access my PostgreSQL database online, like MongoDB?

I had to migrate my database from MongoDB to PostgreSQL.
The inconvenience is that every time I modify my local database, I have to create a backup file and import it into my Heroku PostgreSQL.
I would like to reduce these steps.
Ideally, I would like to have an online database so that I won't have to worry about data migration, as it was with MongoDB.
Is there any solution like this?

Dumping a DB without pg_dump

Is there any way to dump a Postgres DB using psql only (without pg_dump)?
Thanks.
Theoretically you have access to all the data you need. In practice you're more likely to be able to dump/save some data using the COPY command, but not the database schema, etc.
Note that pg_dump does not have to be on the same machine as your database server if the server listens on the network. But well, I don't know why you even ask :)
In theory you could run queries to extract the schema and then use those results to extract the data, but it wouldn't be easy to turn all of that into something usable for a restore using just psql. A COPY-based sketch for the data part follows below.
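To illustrate the data half, here is a minimal sketch that drives psql's \copy from Python; the connection flags and the hand-maintained table list are assumptions, and the schema is not covered:

```python
import subprocess

TABLES = ["users", "orders"]  # hypothetical: list the tables you need

for table in TABLES:
    # \copy runs COPY on the server but writes the file client-side,
    # so no superuser rights or server filesystem access are needed.
    subprocess.run(
        ["psql", "-h", "dbhost", "-U", "appuser", "-d", "appdb",
         "-c", rf"\copy {table} TO '{table}.csv' WITH CSV HEADER"],
        check=True,
    )
```

The schema would still have to be reconstructed separately, e.g. from queries against pg_catalog or information_schema, which is exactly the hard part noted above.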

Can we get a Postgres DB dump using SQLAlchemy?

Is it possible to create a Postgres database dump (pg_dump) using SQLAlchemy? I can get the dump using pg_dump, but I am doing all other DB operations using SQLAlchemy, and thus want to know whether this dump operation is also possible using SQLAlchemy. Any suggestion or link would be of great help.
Thanks,
Tara Singh
pg_dump is a system command, so I do not think you can produce a Postgres database dump using SQLAlchemy.
SQLAlchemy does not offer any sort of pg_dump. You could probably mimic it with a bunch of queries, but it would be painful.
The easier way is to run pg_dump itself from a Python script with os.system or subprocess.call; see the sketch below.
If it's for regular backups, also have a look at the safekeep project, which talks to your databases on your behalf.
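A minimal sketch of that subprocess approach; the host, user, database, and password handling are all hypothetical:

```python
import os
import subprocess

# pg_dump is an external binary, so we shell out to it from Python.
subprocess.run(
    ["pg_dump",
     "--host", "localhost",
     "--username", "postgres",
     "--dbname", "mydb",
     "--format", "custom",
     "--file", "mydb.dump"],
    env={**os.environ, "PGPASSWORD": "secret"},  # or rely on ~/.pgpass
    check=True,
)
```

If you want to keep it tied to your SQLAlchemy configuration, the connection details can be read from engine.url (host, username, database) instead of being hard-coded.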