Dump just the table and reimport it - postgresql

Just as the title states, I have a table with well over 25 million records, and I need all of them. It will only grow over time, so I need to export this one table from a psql database and import it into another psql database, which is used for development.
Ideas?
I know you can dump a whole database, but can you dump a single table? (Sorry if that's a dumb question.)

You can use:
$ pg_dump -d db -t big_table > big_table.sql
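To load the resulting file into the other database, feed it to psql. A minimal sketch; `dev_db` below is a placeholder for the development database's actual name:

```shell
# Restore the plain-SQL dump into the development database
# (dev_db is a placeholder database name)
psql -d dev_db -f big_table.sql
```

Note that the plain-SQL dump includes the CREATE TABLE statement, so this assumes big_table does not already exist in the target database.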

Related

How to import rows from only one table from pgAdmin db to table in Heroku Postgres db?

I have a table Films with data in a pgAdmin db (local Postgres db), and I want to import this data from the Films table into the same Films table in a Heroku Postgres db. I am using SQLAlchemy and Flask. I have read here (https://devcenter.heroku.com/articles/heroku-postgres-import-export) that it can be done through the console. I even tried it, but without any success. I am going to do this by writing all the data from Films into a CSV file and then copying it from the CSV file into Films on the Heroku Postgres db, but is there a better way to do what I want? If so, please give me an understandable example. I would be very appreciative.
P.S. I tried to create a table dump: pg_dump -Fc --no-acl --no-owner -h localhost -U oleksiy -t films --data-only fm_bot > table.dump
But I don't understand the next step: Generate a signed URL using the aws console - aws s3 presign s3://your-bucket-address/your-object
What is "your-bucket-address", what is "your-object"? And the main question: is this what I need?
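The CSV route described in the question can be done entirely from the command line with psql's \copy, which runs client-side and needs no S3 bucket at all. A sketch, assuming the Heroku connection string is available in the DATABASE_URL environment variable (table and database names mirror the question):

```shell
# Export the local films table to a CSV file (client-side copy)
psql -d fm_bot -c "\copy films TO 'films.csv' WITH (FORMAT csv, HEADER)"

# Import that CSV into the films table on Heroku Postgres
# ($DATABASE_URL is assumed to hold the Heroku connection string,
#  e.g. as printed by `heroku config:get DATABASE_URL`)
psql "$DATABASE_URL" -c "\copy films FROM 'films.csv' WITH (FORMAT csv, HEADER)"
```

This only moves rows, so the films table must already exist on the Heroku side with a matching column layout.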

Update Postgres table from a pg_dump dump?

I have 2 Postgres tables (same schema) in different servers.
One table has the latest data, and the other has old data that must be updated with the latest data.
So I dump the table with the latest data:
pg_dump --table=table00 --data-only db00 > table00.sql
or
pg_dump --table=table00 --data-only --column-inserts db00 > table00.sql
But now when I want to read in the SQL dump, I get errors about duplicate keys.
psql postgresql://<username>:<password>@localhost:5432/db00 --file=table00.sql
The error goes away if I drop the table with the old data first, but not only is this undesirable, it is plain silly.
How can I update a Postgres table from a SQL dump then?
Eg. it'd be nice if pg_dump had a --column-updates option, where, instead of INSERT statements, you got INSERT ON CONFLICT (column) DO UPDATE SET statements...
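pg_dump has no such option, but the usual workaround is to load the dump into an empty staging copy of the table and then merge it with INSERT ... ON CONFLICT (PostgreSQL 9.5+). A sketch, assuming table00 lives in the public schema and has a primary key column id plus one data column val (both placeholder names):

```shell
# Merge table00.sql into the existing table via a staging copy:
# swap the real table aside, load the dump into an empty clone,
# then upsert the rows back. "id" and "val" are placeholder columns.
psql -d db00 <<'SQL'
BEGIN;
ALTER TABLE public.table00 RENAME TO table00_real;
CREATE TABLE public.table00 (LIKE public.table00_real INCLUDING ALL);
\i table00.sql
INSERT INTO public.table00_real
  SELECT * FROM public.table00
  ON CONFLICT (id) DO UPDATE SET val = EXCLUDED.val;
DROP TABLE public.table00;
ALTER TABLE public.table00_real RENAME TO table00;
COMMIT;
SQL
```

Everything runs in one transaction, so readers of the database never see the table missing. The schema qualification matters because pg_dump output resets search_path.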

How can I copy my whole postgres database to another postgres one?

I have two postgres databases on the same server, a live one and another one which I use as a test database.
Periodically I need to copy my live database (both the table structure and the data) into the test one, but everything I've found online only copies table by table. Since I often create new tables in my live db, that won't work; I'd have to update the job every time.
Anybody knows how can I pull the whole live postgres db into the test postgres one?
Thanks!
You want to use pg_dump and pg_restore.
Example usage would be:
$ pg_dump -Fc <database_name> > dumpfile
$ pg_restore -d <another_database_name> dumpfile
If you want to do this for all databases, you might want to consider pg_dumpall.
A bit more can be found at https://www.postgresql.org/docs/current/backup-dump.html
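Put together, a periodic refresh of the test database might look like the following sketch; `live_db` and `test_db` are placeholder names, and it assumes dropping and recreating the test database each time is acceptable:

```shell
# Dump the live database in custom format, then rebuild the
# test database from scratch and restore the dump into it.
# live_db and test_db are placeholder database names.
pg_dump -Fc live_db > dumpfile
dropdb --if-exists test_db
createdb test_db
pg_restore -d test_db dumpfile
```

Because the whole database is dumped, any tables created in the live db since the last run are picked up automatically, which is exactly what the table-by-table approaches can't do.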

PostgreSQL - copy data from one table, database, server to another table, another database, server

What would be the best way to copy data from a table in one database on one server to a table in another database on another server in PostgreSQL?
pg_dump allows the dumping of only select tables:
pg_dump -Fc -f output.dump -t tablename databasename
(dump 'tablename' from database 'databasename' into file 'output.dump' in pg_dump's binary custom format)
You can restore that dump on your other server with pg_restore:
pg_restore -d databasename output.dump
If the table itself already exists in your target database, you can import only the rows by adding the --data-only flag.
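If the two servers can reach each other, the intermediate file can be skipped by piping pg_dump straight into pg_restore on the target. A sketch; the host names, database name, and table name below are placeholders:

```shell
# Stream one table from the source server directly into the target.
# source-host, target-host, databasename, tablename are placeholders.
pg_dump -h source-host -Fc -t tablename databasename \
  | pg_restore -h target-host -d databasename
```

pg_restore reads from standard input when no dump file argument is given, so no temporary file ever touches disk.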
I shared a shell script for copying a table from one PostgreSQL server to another.
Please refer to this other Stack Overflow question:
Copying PostgreSQL database to another server

How to merge dump into database from PostgreSQL?

I'm working with the same database schema on different machines (PostgreSQL). I would like to know how to merge data from one machine into another. The schema has many tables (around 10). What do I want to achieve?
Dump data from machine A to file A.dmp
Restore data from file A.dmp to machine B
When a record already exists on machine B, I don't want to insert it again.
I tried dumping the data from machine A as plain SQL INSERT commands, but when I try to restore it, I get duplicate key errors. What's more, I would like to restore the data from the command line (I have to import 250 MB of data), because right now I'm trying to do it manually with pgAdmin.
What is the best way to do it?
I finally did it this way:
Export to dump with:
pg_dump -f dumpfile.sql --column-inserts -a -n <schema> -U <username> <dbname>
Set a skip-duplicates rule on each table (replace <table> with the table name; id is assumed to be the primary key):
CREATE OR REPLACE RULE skip_unique AS ON INSERT TO <table>
WHERE (EXISTS (SELECT 1 FROM <table> WHERE <table>.id = new.id))
DO INSTEAD NOTHING;
Import with psql
\i dumpfile.sql
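One caveat with this approach: the rule persists after the import and will keep silently discarding conflicting inserts, so it is worth dropping on every table once the merge is done. A sketch, with `mytable` and `dbname` as placeholder names:

```shell
# Remove the temporary skip-duplicates rule after the import,
# otherwise future conflicting inserts are silently dropped too.
# dbname and mytable are placeholders.
psql -d dbname -c 'DROP RULE skip_unique ON mytable;'
```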