Debezium Connector for RDS Aurora

I'm trying to use Debezium with RDS Aurora and I'm wondering which connector to use: the MySQL connector, or is there a separate connector for Aurora? Also, how can I connect a Debezium connector (running on localhost) to a remote AWS Aurora DB? And if anyone is using Debezium with Aurora, please share your experience.
Also, how can I configure Debezium to write the different tables we are monitoring to different Kafka topics?
Regards

When creating an AWS Aurora instance you must have chosen between
Amazon Aurora with MySQL compatibility
Amazon Aurora with PostgreSQL compatibility
Based on that choice, use either the MySQL or the PostgreSQL connector for Debezium. There is no separate connector for Aurora.
How can I connect the Debezium connector (localhost) to a remote AWS Aurora DB?
You need to configure the connector with database.hostname and database.port set to the Aurora endpoint and port.
Refer to https://debezium.io/documentation/reference/1.0/connectors/index.html
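On the last question: by default Debezium already writes each monitored table to its own topic, named <database.server.name>.<databaseName>.<tableName>, so no extra configuration is needed for per-table topics. For the remote-Aurora part, here is a minimal sketch of registering a Debezium MySQL connector through the Kafka Connect REST API; the Aurora endpoint, credentials, and server name are placeholders, and it assumes a Connect worker on localhost:8083.

```python
import requests

# All endpoint/credential values below are placeholders.
connector = {
    "name": "aurora-mysql-connector",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        # Remote Aurora cluster endpoint and port
        "database.hostname": "my-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",
        "database.port": "3306",
        "database.user": "debezium",
        "database.password": "secret",
        "database.server.id": "184054",
        # Logical name used as the topic prefix; every captured table gets
        # its own topic: <server.name>.<database>.<table>
        "database.server.name": "aurora",
        "database.history.kafka.bootstrap.servers": "localhost:9092",
        "database.history.kafka.topic": "schema-changes.aurora",
    },
}

# Register the connector with the locally running Kafka Connect worker
resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()
print(resp.json())
```

Note that Aurora MySQL also needs binlog_format set to ROW in its DB cluster parameter group, otherwise the connector has no row-level binlog to read.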

Related

Connect Azure PostgreSQL DB from HDInsight

Is there a way to connect to an Azure PostgreSQL DB from an HDInsight cluster?
I can see there is an option to use a custom metastore when creating an HDInsight cluster, but I would like to know whether there is a way to connect to an Azure PostgreSQL DB from the HDInsight cluster (apart from the PostgreSQL JAR) to load some data using Spark.
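The question rules out the PostgreSQL JAR, but for reference the standard route from Spark on HDInsight remains the JDBC data source with the PostgreSQL driver on the classpath. A minimal sketch with hypothetical server, database, and credentials:

```python
from pyspark.sql import SparkSession

# Assumes the PostgreSQL JDBC driver is on the cluster classpath,
# e.g. submitted via --packages org.postgresql:postgresql:42.2.9
spark = SparkSession.builder.appName("azure-pg-read").getOrCreate()

# Hypothetical Azure Database for PostgreSQL endpoint and credentials;
# Azure enforces SSL by default, hence sslmode=require
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://myserver.postgres.database.azure.com:5432/mydb?sslmode=require")
    .option("dbtable", "public.my_table")
    .option("user", "myadmin@myserver")
    .option("password", "secret")
    .option("driver", "org.postgresql.Driver")
    .load()
)
df.show()
```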

DebeziumIO read with PostgreSQL to connect to RDS is not streaming data in Apache Beam

I have created an Apache Beam pipeline which uses the DebeziumIO PostgreSQL connector class to connect to PostgreSQL on RDS. The setup works for a local PostgreSQL database with replication set to logical. On AWS, replication is set to logical and the rds_replication role is granted to the user.
But the data stream is not captured by the pipeline.
I have checked with embedded Debezium using the PostgreSQL connector class, and it does capture the change data.
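Not a confirmed diagnosis, but one thing worth verifying here: on RDS, logical decoding is controlled by the rds.logical_replication parameter in the DB parameter group, and once it is applied wal_level should report logical. A quick check with psycopg2, using placeholder connection details:

```python
import psycopg2

# Placeholder RDS endpoint and credentials
conn = psycopg2.connect(
    host="mydb.abc123.us-east-1.rds.amazonaws.com",
    port=5432,
    dbname="mydb",
    user="debezium",
    password="secret",
)
with conn, conn.cursor() as cur:
    # Should print 'logical' once rds.logical_replication = 1 is in effect
    cur.execute("SHOW wal_level;")
    print("wal_level:", cur.fetchone()[0])

    # Replication slots used by logical decoding; a slot that exists but is
    # not active suggests the pipeline never attached to the stream
    cur.execute("SELECT slot_name, plugin, active FROM pg_replication_slots;")
    for row in cur.fetchall():
        print(row)
```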

How can I set up CDC replication from AWS RDS PostgreSQL to Kafka using pglogical?

Is it possible to set up replication from PostgreSQL to Kafka using pglogical?
Currently we are using Attunity as our replication tool, which has been causing issues on our instances.
We are using RDS PostgreSQL.
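pglogical itself only replicates between PostgreSQL servers and has no Kafka sink, so on its own it cannot feed Kafka. A commonly used alternative is Debezium's PostgreSQL connector, which is built on the same logical decoding infrastructure that pglogical uses. A minimal sketch of its configuration with placeholder values; it would be registered through the Kafka Connect REST API just like the Aurora example above:

```python
# Placeholder endpoint and credentials; assumes rds.logical_replication
# is enabled in the RDS parameter group
connector = {
    "name": "rds-postgres-connector",
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "mydb.abc123.us-east-1.rds.amazonaws.com",
        "database.port": "5432",
        "database.user": "debezium",
        "database.password": "secret",
        "database.dbname": "mydb",
        "database.server.name": "rds",
        # Built-in logical decoding plugin on PostgreSQL 10+, so nothing
        # extra has to be installed on the RDS instance
        "plugin.name": "pgoutput",
    },
}
```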

Sending data from MS SQL Server to MySQL using Kafka Connect

I need to deploy Apache Kafka Connect in Kubernetes. I have the following questions:
Do I need to create the MS SQL Server tables in the MySQL database, or will Kafka Connect create them?
Is there any reference on how to implement Kafka Connect without the Avro Schema Registry?
How do I configure the key and value converters?
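A hedged sketch touching all three questions, assuming the Confluent JDBC sink connector is installed on the Connect workers: auto.create lets the sink create the target MySQL tables itself (first question), and the plain JSON converters work without any Schema Registry as long as schemas.enable is on, so each message carries its own schema (second and third questions). All connection values are placeholders:

```python
# Hypothetical JDBC sink writing SQL Server-sourced topics into MySQL;
# register via the Kafka Connect REST API as usual.
sink = {
    "name": "mysql-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "topics": "sqlserver.dbo.orders",
        "connection.url": "jdbc:mysql://mysql:3306/targetdb",
        "connection.user": "sink",
        "connection.password": "secret",
        # First question: the sink creates (and evolves) missing tables
        "auto.create": "true",
        "auto.evolve": "true",
        # Second and third questions: JSON converters instead of Avro +
        # Schema Registry; schemas.enable embeds the schema in every record
        "key.converter": "org.apache.kafka.connect.json.JsonConverter",
        "key.converter.schemas.enable": "true",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "true",
        "insert.mode": "upsert",
        "pk.mode": "record_key",
    },
}
```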

AWS DMS to migrate FROM RDS Postgres

Can I migrate from AWS RDS to a standalone Postgres instance using AWS DMS?
I RTFM, but it does not state anywhere clearly whether I can or not. In theory the migration should be the same: create the supporting schema in RDS and move on. But has anyone done it?
Well, from the AWS DMS manual:
AWS Database Migration Service (AWS DMS) can migrate your data to and from most widely used commercial and open-source databases such as Oracle, PostgreSQL, Microsoft SQL Server, Amazon Redshift, Amazon Aurora, MariaDB, and MySQL. The service supports homogeneous migrations such as Oracle to Oracle, and also heterogeneous migrations between different database platforms, such as Oracle to MySQL or MySQL to Amazon Aurora. The source or target database must be on an AWS service.
In your case the source is on an AWS service, and if by "standalone" you mean a PostgreSQL instance on an EC2 machine, then your target is as well. So, based on that, the answer should be "yes".
Yes, that is possible. We have moved data from RDS Postgres to Postgres installed on an EC2 instance using DMS and it works great so far. Make sure to create endpoints for the source/target and create a replication instance, and you are good to go.
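To make those steps concrete, a minimal boto3 sketch of what the answer describes: one replication instance, source/target endpoints, and a full-load task. Every identifier, hostname, and credential below is a placeholder:

```python
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Replication instance that performs the migration
instance = dms.create_replication_instance(
    ReplicationInstanceIdentifier="rds-to-ec2",
    ReplicationInstanceClass="dms.t3.medium",
)

# Wait until the instance is available before creating the task
dms.get_waiter("replication_instance_available").wait(
    Filters=[{"Name": "replication-instance-id", "Values": ["rds-to-ec2"]}]
)

# Source: the RDS PostgreSQL instance
source = dms.create_endpoint(
    EndpointIdentifier="rds-source",
    EndpointType="source",
    EngineName="postgres",
    ServerName="mydb.abc123.us-east-1.rds.amazonaws.com",
    Port=5432,
    DatabaseName="mydb",
    Username="dms_user",
    Password="secret",
)

# Target: the standalone PostgreSQL instance on EC2
target = dms.create_endpoint(
    EndpointIdentifier="ec2-target",
    EndpointType="target",
    EngineName="postgres",
    ServerName="ec2-1-2-3-4.compute-1.amazonaws.com",
    Port=5432,
    DatabaseName="mydb",
    Username="dms_user",
    Password="secret",
)

# Full-load task that copies every table in the public schema
dms.create_replication_task(
    ReplicationTaskIdentifier="rds-to-ec2-full-load",
    SourceEndpointArn=source["Endpoint"]["EndpointArn"],
    TargetEndpointArn=target["Endpoint"]["EndpointArn"],
    ReplicationInstanceArn=instance["ReplicationInstance"]["ReplicationInstanceArn"],
    MigrationType="full-load",
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-public",
            "object-locator": {"schema-name": "public", "table-name": "%"},
            "rule-action": "include",
        }]
    }),
)
```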