Do local versions of postgres foreign data wrappers update automatically? - postgresql

Apologies if this is answered in the docs or somewhere else obvious, but I have just gone through the rigmarole of having to use postgres_fdw to get data from another database (quite different from the ease of MS SQL!).
I ran an IMPORT FOREIGN SCHEMA statement to get the foreign table/database data into my working database. Will the data in this schema automatically update whenever the data in the foreign database updates, or is IMPORT FOREIGN SCHEMA a copy of the data?

A foreign table does not contain any data. Querying it will yield the current data in the remote table. Think of it as something like a view.
If you are talking about metadata changes (ALTER TABLE), the analogy to views is also helpful: such changes will not be reflected in the foreign table, and you need to run an ALTER FOREIGN TABLE statement.
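For example, if a column is added to the remote table, you have to mirror that change yourself. A sketch, where the table, column, and server names are made up:

    -- Suppose the remote table gained a new column "status";
    -- the foreign table will not see it until you add it as well:
    ALTER FOREIGN TABLE remote_orders ADD COLUMN status text;

    -- Alternatively, drop the foreign table and re-import its definition:
    DROP FOREIGN TABLE remote_orders;
    IMPORT FOREIGN SCHEMA public LIMIT TO (remote_orders)
        FROM SERVER my_remote_server INTO public;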
Perhaps it would make sense to put the data for these databases in a single database in two different schemas.

Related

How to synchronise a foreign table with a local table?

I'm using the Oracle foreign data wrapper and would like to have local copies of some of my foreign tables. Is there an option other than materialized views that I refresh manually?
Not really, unless you want to add functionality in Oracle:
If you add a trigger on the Oracle table that records all data modifications in another table, you could define a foreign table on that log table. Then you can regularly run a function in PostgreSQL that takes the changes since you last checked and applies them to a PostgreSQL table (a sketch of such a function follows below).
If you understand how “materialized view logs” work in Oracle (I don't, and I don't think the documentation explains it), you could define a foreign table on one of those and use it like above. That might be cheaper.
Both of these ideas would still require you to regularly run something in PostgreSQL, but they might be cheaper than a full refresh. Perhaps (if you have the money) you could use Oracle Heterogeneous Services to modify a PostgreSQL table whenever something changes in an Oracle table.
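Here is a rough sketch of the PostgreSQL side of the first idea. Everything in it is an assumption for illustration: local_copy is the local mirror table, ora_changelog is the foreign table on the Oracle log table, and the log records an id, an operation flag and a change timestamp.

    -- Assumed shapes (hypothetical):
    --   local_copy(id, payload)                     -- the local mirror table
    --   ora_changelog(id, op, payload, changed_at)  -- foreign table on the
    --                                               -- Oracle log; op is 'I', 'U' or 'D'
    CREATE TABLE sync_state (last_sync timestamptz NOT NULL);
    INSERT INTO sync_state VALUES ('-infinity');

    CREATE OR REPLACE FUNCTION apply_oracle_changes() RETURNS void
    LANGUAGE plpgsql AS
    $$
    DECLARE
        since  timestamptz;
        cutoff timestamptz := now();
    BEGIN
        -- serialize concurrent runs and fetch the last sync point
        SELECT last_sync INTO since FROM sync_state FOR UPDATE;

        -- drop every row that was touched since the last run ...
        DELETE FROM local_copy l
        USING (SELECT DISTINCT id FROM ora_changelog
               WHERE changed_at > since AND changed_at <= cutoff) c
        WHERE l.id = c.id;

        -- ... and re-insert the latest version of each, unless it was deleted
        INSERT INTO local_copy (id, payload)
        SELECT id, payload
        FROM (SELECT DISTINCT ON (id) id, op, payload
              FROM ora_changelog
              WHERE changed_at > since AND changed_at <= cutoff
              ORDER BY id, changed_at DESC) latest
        WHERE op <> 'D';

        UPDATE sync_state SET last_sync = cutoff;
    END;
    $$;

You would then call apply_oracle_changes() from cron or a similar scheduler. The cutoff timestamp is taken once so that rows arriving mid-run are picked up by the next run rather than silently skipped (clock skew between the two systems would still need thought).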

Importing existing table data to a new table in different database (postgres)

I would like to import all the data from an existing table in one database into a new table in a different database in Postgres. Any suggestions would be helpful.
The easiest way would be to pg_dump the table and pg_restore it in the target database.
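For a single table, that could look like this (the database and table names are placeholders):

    # dump one table in custom format, then restore it into the target database
    pg_dump -Fc -t mytable -f mytable.dump source_db
    pg_restore -d target_db mytable.dump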
In case that is not an option, you should definitely take a look at postgres_fdw (foreign data wrapper), which allows you to access data from different databases - even from different machines. It is slightly more complex than the traditional export/import approach, but it creates a direct connection to the foreign table.
Take a look at this example.
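A minimal postgres_fdw setup might look like the following; the server name, connection options, credentials, and table names are all placeholders:

    -- in the target database
    CREATE EXTENSION postgres_fdw;

    CREATE SERVER src_server
        FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'localhost', dbname 'source_db', port '5432');

    CREATE USER MAPPING FOR CURRENT_USER
        SERVER src_server
        OPTIONS (user 'src_user', password 'secret');

    IMPORT FOREIGN SCHEMA public LIMIT TO (mytable)
        FROM SERVER src_server INTO public;

    -- copy the data into a new local table
    CREATE TABLE mytable_copy AS SELECT * FROM mytable;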

Enforcing Foreign Key Constraint Over Table From pg_dump With --exclude-table-data

I'm currently working on dumping one of our customer's database in a way that allows us to create new databases from this customer's basic structure, but without bringing along their private data.
So far, I've had success with pg_dump combined with the --exclude-table and --exclude-table-data options, which allowed me to bring over only the data I'll effectively need for this task.
However, there are a few tables that mix rows referencing some of the data I left behind with rows referencing data I did bring, and this is causing me a few issues during the restore operation. Specifically, when the dump tries to enforce FOREIGN KEY constraints on certain columns of these tables, it fails because some rows have keys with no matching data in the referenced table - because I chose not to bring that table's data!
I know I can log into the database after the dump is complete, delete any rows that reference data that no longer exists, and create the constraint myself, but I'd like to automate the process as much as possible. Is there a way to tell pg_dump or pg_restore (or any other program) not to bring rows from table A if they reference table B and table B's data was excluded from the backup? Or to tell Postgres that I'd like that specific foreign key to be active before importing the table's data?
For reference, I'm working with PostgreSQL 9.2 on a RHEL 7 server.
What if you disable foreign key checking when you restore your database dump, and after that remove the orphaned rows from the referencing table?
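Sketched with made-up table names: restore the data with the foreign key not yet in place, delete the orphans, then add the constraint.

    -- after restoring the data without the foreign key:
    DELETE FROM child c
    WHERE NOT EXISTS (SELECT 1 FROM parent p WHERE p.id = c.parent_id);

    -- now the constraint can be created cleanly
    ALTER TABLE child
        ADD CONSTRAINT child_parent_fk
        FOREIGN KEY (parent_id) REFERENCES parent (id);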
By the way, I recommend fixing your database schema so that there is no chance of wrong tuples being inserted into your database.

How would you connect to two postgres databases in a sql script or a stored procedure?

I need to move some old data from one database to another - both have similar schemas. After the old rows in db1.mytable are inserted into db2.mytable, the same rows should be deleted from db1.mytable.
This is to reduce the database size and archive data that is not needed very often but is still important.
You should use the foreign data wrapper for Postgres, postgres_fdw - https://www.postgresql.org/docs/current/static/postgres-fdw.html. With foreign tables, you can send a query to another database.
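Assuming a foreign table db2_mytable in db1 that points at db2.mytable, and a created_at column that identifies old rows (both names are hypothetical), the move and the delete can even be done in a single statement:

    -- move rows older than one year to db2 and remove them from db1
    WITH moved AS (
        DELETE FROM mytable
        WHERE created_at < now() - interval '1 year'
        RETURNING *
    )
    INSERT INTO db2_mytable
    SELECT * FROM moved;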

Postgresql archiving old data

I need some expert advice on Postgres.
I have a few tables in my database that can grow huge, maybe a hundred million records, and I have to implement some sort of data archiving. Say I have a subscriber table and a subscriber_logs table. The subscriber_logs table will grow huge with time, affecting performance. I wanted to create a separate table called archive_subscriber_logs and create a scheduled task which will read from subscriber_logs, insert the data into archive_subscriber_logs, then delete the copied data from subscriber_logs.
But my concern is: should I create archive_subscriber_logs in the same database or in a different database? The problem with storing it in a different db is the foreign key constraints that already exist on the main tables.
Can anyone suggest whether the same db or a different db is preferable? Or any other solutions?
Consider table partitioning, which is implemented in Postgres using table inheritance. This will improve performance on very large tables. Of course you would do measurements first to make sure it is worth implementing. The details are in the excellent Postgres documentation.
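In the inheritance scheme, that means one child table per time range plus a trigger that routes inserts to the right child. A sketch with made-up names, following the pattern shown in the partitioning chapter of the docs:

    -- the parent stays empty; children hold date ranges
    CREATE TABLE subscriber_logs_2015 (
        CHECK (logged_at >= DATE '2015-01-01' AND logged_at < DATE '2016-01-01')
    ) INHERITS (subscriber_logs);

    -- route inserts on the parent into the matching child
    CREATE OR REPLACE FUNCTION subscriber_logs_insert() RETURNS trigger
    LANGUAGE plpgsql AS
    $$
    BEGIN
        IF NEW.logged_at >= DATE '2015-01-01'
           AND NEW.logged_at < DATE '2016-01-01' THEN
            INSERT INTO subscriber_logs_2015 VALUES (NEW.*);
        ELSE
            RAISE EXCEPTION 'no partition for %', NEW.logged_at;
        END IF;
        RETURN NULL;  -- row was redirected, do not insert into the parent
    END;
    $$;

    CREATE TRIGGER subscriber_logs_route
        BEFORE INSERT ON subscriber_logs
        FOR EACH ROW EXECUTE PROCEDURE subscriber_logs_insert();

With the CHECK constraints in place and constraint_exclusion enabled, queries that filter on logged_at only scan the relevant children, and archiving a period becomes a cheap DROP TABLE of the old child instead of a mass DELETE.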
Using separate databases is not recommended because foreign key constraints cannot span databases.