Indexes are auto-deleted after migration - PostgreSQL

Migrating data from IBM DB2 z/OS to PostgreSQL in AWS RDS, using Attunity Replicate.
Following these steps:
Create table DDL
Create Indexes
Migrate data
Create foreign keys
Before the migration, all indexes are created in the PostgreSQL DB and visible, but after the migration all indexes are auto-deleted and we have to recreate those indexes.
Why is this happening? Any other suggestions?
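One common cause is a target table preparation mode in the replication task that drops and recreates the target tables during the full load, which takes any pre-created indexes with them. Whatever the cause, one way to make the recreation step cheap is to dump the index definitions from the catalog before the load and replay them afterwards. A minimal sketch, assuming the tables live in the public schema:

SELECT indexdef || ';' AS recreate_statement
FROM   pg_indexes                      -- built-in view of all index definitions
WHERE  schemaname = 'public'           -- adjust to your target schema
ORDER  BY tablename, indexname;

Saving that output to a script lets you recreate every index (and then the foreign keys) once the data load has finished.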

Related

Update Postgres RDS table with AWS Glue script

I would like to transfer data from Postgres RDS database tables to a new reporting database, also created as Postgres RDS. I created data catalogs and a script that joins two tables together and then saves the data to the reporting database. It works as intended only when run for the first time - it saves the current data to the new database. I need to update the reporting database daily with records newly added to the source database, but it is not saving new data for me. Is there any way to insert only new data into a DB with AWS Glue?
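If the Glue job can land the joined result in a staging table on the reporting instance, the "only new rows" part can be expressed in plain SQL. A minimal sketch, where the table and column names (reporting.target, staging.joined_result, id, col_a, col_b) are hypothetical and the target table needs a primary or unique key on id for the conflict clause to work:

-- Append only rows whose key is not already present in the reporting table.
INSERT INTO reporting.target (id, col_a, col_b)
SELECT id, col_a, col_b
FROM   staging.joined_result
ON CONFLICT (id) DO NOTHING;

Alternatively, Glue job bookmarks can limit each run to data not yet processed on the read side; the upsert above simply makes the write side idempotent as well.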

Is there any way to sync on-premise Oracle table data to a table in Aurora Postgres RDS?

I've migrated an on-premise Oracle database to Aurora Postgres on AWS RDS; my migration process is described in the image below.
I'm looking for an option to sync data from the on-premise Oracle to Aurora Postgres (something like: if a row is inserted or updated in a particular table, that data should be inserted or updated in the Aurora Postgres table).
Is it possible?
Yes, you'll want to use AWS DMS (Database Migration Service), which can do continuous data replication from source to target databases:
https://aws.amazon.com/dms/

Can't drop an index in Postgres on Heroku - receive ERROR: index "ix_public_jobs_next_run_time" does not exist

I have a Heroku Postgres database (Hobby tier) with a "jobs" table. I am using pgAdmin to view and work with the database.
If I view the Dependents tab for the "jobs" table, I can see that an index "public.ix_public_jobs_next_run_time" exists.
From the query tool, I run the query "DROP INDEX public.ix_public_jobs_next_run_time;" and get the following error:
ERROR: index "ix_public_jobs_next_run_time" does not exist
SQL state: 42704
Why can't I drop this index?
Background: I am using the SQLAlchemy ORM and a db upgrade to modify some tables in my Postgres database. The db upgrade command fails when it tries to drop the index. I used the steps above to reproduce this error.
Perhaps you set a search_path that does not include public. Then you have to name the schema explicitly:
DROP INDEX public.ix_public_jobs_next_run_time;
Another option is of course that you connected to the wrong database when you tried to drop the index.
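Two quick diagnostic queries against standard catalog views can tell you which case you are in, i.e. what the session's search_path is and which schema, if any, actually contains an index with that name:

-- What does this session resolve unqualified names against?
SHOW search_path;

-- Where does the index actually live (no rows = wrong database)?
SELECT schemaname, tablename, indexname
FROM   pg_indexes
WHERE  indexname = 'ix_public_jobs_next_run_time';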

Inserting differential data into a PostgreSQL table from a Db2 table through a trigger

I am looking for a differential data load from Db2 tables to PostgreSQL. Is there any free tool? Or how can I write a trigger to connect the two databases?
Will db2_fdw in PostgreSQL help me do this?
You can't write to nicknames in triggers in Db2.
You may use SQL Replication to a federated data source (PostgreSQL in your case), depending on your Db2 version.
11.5 Federation for PG
11.1 Federation for PG
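Regarding db2_fdw: it works in the other direction, i.e. PostgreSQL pulls from Db2 through a foreign table, so the differential-load logic runs on the PostgreSQL side rather than in a Db2 trigger. A rough sketch of that alternative, assuming the Db2 table has a change-timestamp column; the option names (dbserver, schema, table) and all object names here are assumptions, so check the db2_fdw documentation for your build:

CREATE EXTENSION db2_fdw;

CREATE SERVER db2_src FOREIGN DATA WRAPPER db2_fdw
  OPTIONS (dbserver 'SAMPLEDB');

CREATE USER MAPPING FOR CURRENT_USER SERVER db2_src
  OPTIONS (user 'db2user', password 'secret');

-- Foreign table mirroring the Db2 source table.
CREATE FOREIGN TABLE db2_orders (
  order_id   integer,
  amount     numeric,
  updated_at timestamp
) SERVER db2_src OPTIONS (schema 'DB2SCHEMA', table 'ORDERS');

-- Differential load: copy only rows newer than what is already in PostgreSQL.
INSERT INTO orders (order_id, amount, updated_at)
SELECT order_id, amount, updated_at
FROM   db2_orders d
WHERE  d.updated_at > (SELECT COALESCE(max(updated_at), timestamp 'epoch') FROM orders);

The final statement can be scheduled (cron, pg_cron) as a pull-based incremental load; it is not change capture on the Db2 side, so it only sees rows whose timestamp moves forward.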

Database connection through Spring Boot in PostgreSQL

I am connecting a Spring Boot application to a PostgreSQL database. I created the table using the PostgreSQL "create query tool". It has been put in the default public schema. But when I try to insert values into the table, it does not accept the values.
I have used the following command:
Insert into User values(1,"ABC","pass_1");
What is the proper way to insert?
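That statement fails for two likely reasons: in PostgreSQL, string literals use single quotes (double quotes denote identifiers, so "ABC" is read as a column name), and user is a reserved word, so a table created as "User" through a GUI keeps its mixed-case name and must stay quoted. A minimal sketch of the corrected insert; the column names (id, name, password) are assumptions, since the question does not show the table definition:

-- Single quotes for values, double quotes only for the case-sensitive identifier.
INSERT INTO "User" (id, name, password)
VALUES (1, 'ABC', 'pass_1');

Listing the column names explicitly also keeps the statement valid if the table definition changes later.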