pg_dump for only the latest tables - PostgreSQL

I have a service that creates a new table in a DB when a REST API is invoked. I am trying to continuously replicate this DB. I was thinking of using pg_dump in a loop to pick up each new table's schema, and pglogical for ongoing replication of the new table.
Is there an option in pg_dump that allows me to dump only the most recently created table?
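pg_dump has no built-in notion of the "latest" table, so one workaround is to diff the catalog against the list of tables already seen and dump only the new ones, schema-only, before handing them over to pglogical. A rough shell sketch, assuming a database named mydb and a state file known_tables.txt kept between runs (both names are placeholders):

#!/usr/bin/env bash
# Hypothetical sketch: dump the schema of tables that appeared since the last run.
set -euo pipefail
DB=mydb
STATE=known_tables.txt
touch "$STATE"

# Current tables in the public schema, one name per line.
psql -d "$DB" -At -c \
  "SELECT tablename FROM pg_tables WHERE schemaname = 'public' ORDER BY tablename;" \
  > current_tables.txt

# Names present now but absent from the previous snapshot are the new tables.
new_tables=$(comm -13 <(sort "$STATE") <(sort current_tables.txt))

for t in $new_tables; do
  # Schema-only dump of just this table; pglogical can take over replication afterwards.
  pg_dump -d "$DB" --schema-only -t "public.$t" > "new_table_$t.sql"
done

mv current_tables.txt "$STATE"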

Related

Migrating a partitioned table from Postgres to Redshift with pglogical

I've created a DMS task of CDC and Full Load, migrating data from Postgres 14 to Redshift. According to the documentation, when using pglogical and creating a Postgres publication with the 'publish_via_partition_root' parameter on my partitioned table, changes should be published to the parent table rather than to the child tables. However, the data is still migrated to the child tables in Redshift and not to the parent table. Am I missing something that needs to be configured, or is it just not possible in DMS?
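For reference, the Postgres side of that setup is just a publication created with the option; a minimal sketch via psql (the database, publication, and table names are made up, and whether DMS actually honours the setting is the open question here):

# Hypothetical example: publish changes of a partitioned table via its root.
psql -d sourcedb -c "
  CREATE PUBLICATION my_pub
    FOR TABLE orders
    WITH (publish_via_partition_root = true);
"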

Migrate Tables from one RDS Postgres schema to another within the same DB Instance

I have a use case where I am splitting one service into multiple services and want to migrate tables (with huge data) from one RDS Postgres schema to another within the same DB instance, with ongoing replication and ~zero downtime. I am exploring the AWS DMS service; I can see it is possible to migrate the entire DB, but is it possible to migrate only a specific schema, and how?
Using an ALTER TABLE query is not an option because I cannot move the table in one shot in production; it needs to happen gradually. An AWS DMS-like solution would fit the use case.
Thanks in advance.
Moving a table from one schema to another can be done by just altering the table's schema:
ALTER TABLE [ IF EXISTS ] name
SET SCHEMA new_schema
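A minimal worked example of that, assuming a table public.orders that should end up in an existing billing schema (the database, table, and schema names are placeholders):

# Hypothetical example: move "public.orders" into a "billing" schema.
psql -d mydb -c "CREATE SCHEMA IF NOT EXISTS billing;"
psql -d mydb -c "ALTER TABLE IF EXISTS public.orders SET SCHEMA billing;"

Note that this is only a catalog change (no data is copied), so it is effectively instantaneous, but it also moves the table all at once, which is exactly what the question is trying to avoid; hence the interest in a DMS-style gradual migration.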

Recreate SQL Commands from db

I created a db long ago using Django. Now that we are migrating the application, I need all the CREATE TABLE SQL queries which Django would have run to create the entire db for our service (which has around 70-80 tables, each with on average 30-70 columns).
Both the old and new servers use Postgres for their databases.
But the technology stack is completely different (a 3rd-party proprietary application, instead of Django, will host the service).
If I start to write all the tables again from scratch, it will take at least a week or two.
Is there any way, either from Postgres or from Django, to generate the CREATE TABLE SQL schema for the entire db, keeping all the relationships as they are?
Also, I have to make minor modifications to that schema as per customer requirements.
P.S. - pg_dump won't work, as I need the actual schema itself to get it reviewed by the client.
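For what it's worth, pg_dump with --schema-only emits a plain-text file of the literal CREATE TABLE / ALTER TABLE statements (including foreign keys and other constraints), and Django can also print the SQL behind a migration; a sketch with placeholder database and app names:

# Option 1 (hypothetical database name "servicedb"): plain-text, schema-only dump,
# i.e. the actual CREATE TABLE statements, which can then be edited and reviewed.
pg_dump -d servicedb --schema-only --no-owner --no-privileges > schema.sql

# Option 2 (hypothetical app label "myapp"): ask Django for the SQL of one migration.
python manage.py sqlmigrate myapp 0001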

Data Migration from one DB to another

I have to create an app which transfers data from Snowflake to Postgres every day. Some tables in Postgres are truncated before migration, and all data from the corresponding Snowflake table is copied over. For other tables, only data after the last timestamp in Postgres is copied from Snowflake.
This job has to run sometime at night, not during the day when customers are using the service.
What is the best way to do this?
Do you have constraints limiting your choices in:
ETL or bulk data tooling?
Development languages?
According to this site, you can create a foreign data wrapper in PostgreSQL for Snowflake.
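If the Snowflake rows can first be landed in staging tables on the Postgres side (for example via CSV export, bulk COPY, or a foreign data wrapper), the two patterns described in the question reduce to plain SQL run from a nightly job; a rough sketch with made-up table and column names:

# Hypothetical sketch, assuming the Snowflake data has already been landed in
# staging tables inside Postgres. Table and column names are placeholders.

# Pattern 1: full refresh, i.e. truncate and reload the whole table.
psql -d targetdb -c "TRUNCATE customers; INSERT INTO customers SELECT * FROM staging_customers;"

# Pattern 2: incremental, i.e. copy only rows newer than what is already loaded.
psql -d targetdb -c "
  INSERT INTO orders
  SELECT * FROM staging_orders s
  WHERE s.updated_at > (SELECT coalesce(max(updated_at), '-infinity') FROM orders);"

Running it at night could then be as simple as a cron entry on whatever machine hosts the job.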

Importing data from an MS Access db to a PostgreSQL db

I have a table in an MS Access db that I want to export to a PostgreSQL database. Every 2 or so months, I want to move all records from the Access table to a table in Postgres.
Right now, I am using the Export to ODBC option in Access to do this, but each time it exports into an entirely new table in Postgres. Is there a way for me to routinely append the records in the Access table to an existing table in my Postgres database? I have come across the option of an FDW, but I am not familiar with how to install or use it.
I am new to using PostgreSQL, and have little to no experience working with databases other than Access, so any input/advice would be greatly appreciated.
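One approach that avoids installing anything extra on the Postgres side: export the Access table to a CSV file (Access can export a table to a delimited text file from its External Data tools), then append the rows into the existing Postgres table with psql's \copy, which loads into a table that already exists rather than creating a new one. A sketch with placeholder file and table names:

# Hypothetical example: append rows from an Access CSV export into an existing table.
# "records.csv" and "my_table" are placeholders; the column order must match the file.
psql -d mydb -c "\copy my_table FROM 'records.csv' CSV HEADER"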