Debezium: "contains no connector type" - apache-kafka

I am trying to use Debezium to connect to a MySQL database on my local machine.
I am calling Kafka Connect with the following command:
sudo kafka/bin/connect-standalone.sh kafka/config/connect-standalone.properties kafka/config/connector.properties
Here is the config in connector.properties:
{
  "name": "inventory-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "127.0.0.1",
    "tasks.max": "1",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "Password#123",
    "database.server.id": "1",
    "database.server.name": "fullfillment",
    "database.whitelist": "inventory",
    "database.history.kafka.bootstrap.servers": "localhost:9092",
    "database.history.kafka.topic": "dbhistory.fullfillment",
    "include.schema.changes": "true",
    "type": "null"
  }
}
Getting the following error while running the mentioned command:
[2018-12-07 10:58:17,102] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector config {"config"={, "type"="null", "database.user"="debezium",, "database.port"="3306",, "include.schema.changes"="true",, "database.server.name"="fullfillment",, "connector.class"="io.debezium.connector.mysql.MySqlConnector",, "tasks.max"="1",, "database.history.kafka.topic"="dbhistory.fullfillment",, "database.server.id"="1",, "database.whitelist"="inventory",, "name"="inventory-connector",, "database.hostname"="127.0.0.1",, {=, "database.password"="Password#123",, }=, "database.history.kafka.bootstrap.servers"="localhost:9092",} contains no connector type
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector config {"config"={, "type"="null", "database.user"="debezium",, "database.port"="3306",, "include.schema.changes"="true",, "database.server.name"="fullfillment",, "connector.class"="io.debezium.connector.mysql.MySqlConnector",, "tasks.max"="1",, "database.history.kafka.topic"="dbhistory.fullfillment",, "database.server.id"="1",, "database.whitelist"="inventory",, "name"="inventory-connector",, "database.hostname"="127.0.0.1",, {=, "database.password"="Password#123",, }=, "database.history.kafka.bootstrap.servers"="localhost:9092",} contains no connector type
at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:259)
at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
Any help will be highly appreciated.

connector.properties for standalone mode must be in Java properties file format. So take the config section and rewrite it like:
connector.class=io.debezium.connector.mysql.MySqlConnector
.
.
.
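For reference, here is a sketch of the full conversion using the values from the JSON above. Each entry under "config" becomes a key=value line, "name" becomes a property as well, and the "type": "null" entry can be dropped since it is not a connector property:
name=inventory-connector
connector.class=io.debezium.connector.mysql.MySqlConnector
tasks.max=1
database.hostname=127.0.0.1
database.port=3306
database.user=debezium
database.password=Password#123
database.server.id=1
database.server.name=fullfillment
database.whitelist=inventory
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=dbhistory.fullfillment
include.schema.changes=true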

You have a JSON file, not a properties file.
A JSON config like this is meant for connect-distributed mode, where it is POSTed via HTTP to the Kafka Connect REST API, not passed as a CLI argument.
For connect-standalone, you provide both the Connect worker properties file and the connector properties file on the command line, each in Java .properties format.
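If you want to keep the JSON instead, here is a minimal sketch of the distributed-mode submission, assuming the worker's REST API is on the default port 8083 and the JSON is saved as connector.json:
curl -X POST -H "Content-Type: application/json" --data @connector.json http://localhost:8083/connectors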

The connector.properties file format must be Java properties, not JSON.

Related

Unable to trigger Custom Producer Interceptor via JDBC Source Connector

I have created a custom producer interceptor (AuditProducerInterceptor) which accepts some custom configs (application_id, type, etc.). I have generated a jar from the AuditProducerInterceptor project and placed the jar inside Kafka Connect at /usr/share/java/monitoring-interceptors. When I try to POST a JDBC source connector with the configuration below, my audit interceptor is not triggered.
{
  "name": "jdbc-source-xx-xxxx-xxx-xxx",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:sqlserver://{{ip}}:1433;databaseName=XX;useNTLMv2=true",
    "connection.user": "SA",
    "connection.password": "Admin1234",
    "producer.interceptor.classes": "com.optum.payer.common.kafka.audit.interceptor.AuditProducerInterceptor",
    "topic.prefix": "MyTestTopic",
    "query": "SELECT ID, chart_id, request_id, UpdatedDate FROM xxx.xxx WITH (NOLOCK)",
    "mode": "timestamp",
    "timestamp.column.name": "UpdatedDate",
    "producer.audit.application.id": "HelloApplication",
    "producer.audit.type": "test type",
    "poll.interval.ms": "10",
    "tasks.max": "1",
    "batch.max.rows": "100",
    "validate.non.null": "false",
    "numeric.mapping": "best_fit",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://{{ip}}:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://{{ip}}:8081"
  }
}
As you can see in the configuration, I have added the props below to the connector config to trigger the custom interceptor, but I don't see any logs in Kafka Connect related to AuditProducerInterceptor.
"producer.interceptor.classes": "com.optum.payer.common.kafka.audit.interceptor.AuditProducerInterceptor"
"producer.audit.application.id": "HelloApplication",
"producer.audit.type": "test type"
I tried adding these three configs to the Kafka Connect worker config, and that way I am able to trigger the interceptor. But I want to trigger the interceptor via the JDBC source connector, so that I can pass the custom props (application_id, type, etc.) per connector.
Please help me solve this issue.
If you have allowed client overrides (enabled by default) in the Connect worker, you'll want to use the producer.override. prefix.
From the docs:
Starting with 2.3.0, client configuration overrides can be configured individually per connector by using the prefixes producer.override. and consumer.override. for Kafka sources or Kafka sinks respectively.
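Applied to the connector above, the three properties would look roughly like this (a sketch: it assumes the interceptor reads audit.application.id and audit.type from the producer config, matching how the worker-level producer.audit.* keys are stripped of their producer. prefix):
"producer.override.interceptor.classes": "com.optum.payer.common.kafka.audit.interceptor.AuditProducerInterceptor",
"producer.override.audit.application.id": "HelloApplication",
"producer.override.audit.type": "test type"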

Debezium Postgres Connector "After applying the include/exclude list filters, no changes will be captured"

I am using Debezium (a Kafka connector) to capture Postgres database changes, and I am getting an error from Debezium. Does anyone know what the error below means, and can perhaps offer a suggestion to fix it?
A bit more debugging info:
I tried both "schema.include.list": "banking" and "database.include.list": "banking"; neither works.
I tried debezium/connect:1.4 and it works, but not debezium/connect:1.5+ (1.9 is the highest version available, and it fails with the same error as below).
Postgres|dbserver1|snapshot After applying the include/exclude list filters, no changes will be captured. Please check your configuration! [io.debezium.relational.RelationalDatabaseSchema]
I have verified (in the logs) that Kafka (and the schema registry etc.) is running properly, the Debezium connector seems to have started, Postgres is working properly, and the database and tables are created.
Below is the Debezium connector configuration:
{
  "name": "banking-postgres-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "postgres",
    "database.dbname": "banking",
    "database.server.name": "dbserver1",
    "database.include.list": "banking",
    "tasks.max": "1",
    "table.include.list": "public.x_account,public.x_party,public.x_product,public.x_transaction"
  }
}
After many hours of debugging (plus some advice from @OneCricketeer which pointed me in the right direction), I managed to get a configuration that works with debezium/connect:1.9. The solution was to use the defaults by eliminating these configuration items:
database.include.list
schema.include.list
This final working Debezium configuration is as follows:
{
  "name": "banking-postgres-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "postgres",
    "database.dbname": "banking",
    "database.server.name": "dbserver1",
    "tasks.max": "1"
  }
}
This does point to a minor gap in the Debezium documentation and code:
The documentation should provide valid example values for "schema.include.list" and "database.include.list", since adding the database name alone for either does not seem to work for Postgres.
It would be great to get more information from the logs; the warning message was both hard to find and hard to understand, and left one with little recourse. Since this is a situation where no data is captured, it may warrant a higher severity (log an error?).
NOTE: I offer the above as humble suggestions because I find Debezium to be an OUTSTANDING product!
I had a similar issue myself. The structure of the database server when viewed in pgAdmin is:
Servers
  Server Name (Local)
    Databases (9)
      database_1
      ...
        Schemas (1)
          public
          ...
            Tables (56)
              table_1
              ...
I have these config values set:
debezium.source.database.dbname=database_1
debezium.source.schema.include.list=public
Then I tried each of the below separately:
debezium.source.table.include.list=table_1
debezium.source.table.include.list=public.table_1
But with no success; same warning log.

No CDC generated by Kafka Debezium connector for Postgres

I succeeded in generating CDC from one Postgres DB.
Today, I used the same steps to try to set up a Kafka Debezium connector for another Postgres DB.
First I ran
POST http://localhost:8083/connectors
with body:
{
  "name": "postgres-kafkaconnector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "tasks.max": "1",
    "database.hostname": "example.com",
    "database.port": "5432",
    "database.dbname": "my_db",
    "database.user": "xxx",
    "database.password": "xxx",
    "database.server.name": "postgres_server",
    "table.include.list": "public.products",
    "plugin.name": "pgoutput"
  }
}
which succeeded without error.
Then I ran
GET http://localhost:8083/connectors/postgres-kafkaconnector/status
to check the status. It returned this result without any error:
{
  "name": "postgres-kafkaconnector",
  "connector": {
    "state": "RUNNING",
    "worker_id": "10.xx.xx.xx:8083"
  },
  "tasks": [
    {
      "id": 0,
      "state": "RUNNING",
      "worker_id": "10.xx.xx.xx:8083"
    }
  ],
  "type": "source"
}
However, this time, when I update anything in the products table, no CDC gets generated.
Any idea? Any suggestion to help debug further would be appreciated. Thanks!
Found the issue! My Kafka connector postgres-kafkaconnector was initially pointing to one DB (stage1), and then I switched it to another DB (stage2) by updating:
"database.hostname": "example.com",
"database.port": "5432",
"database.dbname": "my_db",
"database.user": "xxx",
"database.password": "xxx",
However, both used the same configuration properties in the Kafka Connect deployment from the very beginning:
config.storage.topic
offset.storage.topic
status.storage.topic
Since the connector with the different DB config shared the same Kafka configuration properties above, and the database table schemas are the same, it became a mess due to sharing the same Kafka offsets.
One simple way to fix this when deploying a Kafka connector to test against different DBs is to use different connector names, such as postgres-kafkaconnector-stage1 and postgres-kafkaconnector-stage2, to avoid the Kafka topic offset mess.
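For example, here is a sketch of registering the stage2 environment as its own connector (same endpoint and body shape as the POST in the question; giving database.server.name a distinct value as well is my assumption, to keep the generated topic names separate):
POST http://localhost:8083/connectors
with body:
{
  "name": "postgres-kafkaconnector-stage2",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "tasks.max": "1",
    "database.hostname": "example.com",
    "database.port": "5432",
    "database.dbname": "my_db",
    "database.user": "xxx",
    "database.password": "xxx",
    "database.server.name": "postgres_server_stage2",
    "table.include.list": "public.products",
    "plugin.name": "pgoutput"
  }
}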

Kafka Connect: streaming changes from Postgres to topics using debezium

I'm pretty new to the Kafka and Kafka Connect world. I am trying to implement CDC using Kafka (on MSK), Kafka Connect (using the Debezium connector for PostgreSQL) and an RDS Postgres instance. Kafka Connect runs in a K8s pod in our cluster deployed in AWS.
Before diving into the details of the configuration used, I'll try to summarise the problem:
Once the connector starts, it sends messages to the topic as expected (the snapshot).
Once we make any change to a table (create, update, delete), no messages are sent to the topic. We would expect to see messages about the changes made to the table.
My connector config looks like:
{
  "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
  "database.user": "root",
  "database.dbname": "insights",
  "slot.name": "cdc_organization",
  "tasks.max": "1",
  "column.blacklist": "password, access_key, reset_token",
  "database.server.name": "insights",
  "database.port": "5432",
  "plugin.name": "wal2json_rds_streaming",
  "schema.whitelist": "public",
  "table.whitelist": "public.kafka_connect_cdc_test",
  "key.converter.schemas.enable": "false",
  "database.hostname": "de-test-sre-12373.cbplqnioxomr.eu-west-1.rds.amazonaws.com",
  "database.password": "MYSECRETPWD",
  "value.converter.schemas.enable": "false",
  "name": "source-postgres",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "snapshot.mode": "initial"
}
We have tried different values for the plugin.name property: wal2json, wal2json_streaming and wal2json_rds_streaming.
There's no connection problem between the connector and the DB, as we already see messages flowing through as soon as the connector starts.
Is there a configuration issue with the connector described above that prevents us from seeing messages for new changes in the topic?
Thanks
Your connector config looks a bit confusing. I'm pretty new to Kafka as well, so I don't really know the issue, but this is the connector config that works for me.
{
  "name": "<connector_name>",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.server.name": "<server>",
    "database.port": "5432",
    "database.hostname": "<host>",
    "database.user": "<user>",
    "database.dbname": "<dbname>",
    "tasks.max": "1",
    "database.history.kafka.bootstrap.servers": "localhost:9092",
    "database.history.kafka.topic": "<kafka_topic_name>",
    "plugin.name": "pgoutput",
    "include.schema.changes": "true"
  }
}
If this configuration doesn't work either, try looking through the console log; sometimes the relevant error isn't the last thing written to the console.
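As a quick way to surface such errors, the status endpoint shown in an earlier question also returns a failed task's stack trace (a sketch, assuming the default REST port):
curl http://localhost:8083/connectors/<connector_name>/status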

Kafka cannot stream database activity

I'm streaming my PostgreSQL changes using Confluent (./kafka-avro-console-consumer). Streaming works, but Kafka only shows INSERT activity; other operations like DELETE, UPDATE and CREATE TABLE don't show up in my Kafka consumer.
I did the following:
Install PostgreSQL, including creating roles and tables
Install the Debezium CDC connector for PostgreSQL (I didn't use JSON/JDBC at all)
Install wal2json
Create the connector in Confluent
Topics were created automatically after the connector deployed successfully
Stream the topics: ./kafka-avro-console-consumer --bootstrap-server localhost:9092 --topic debezium.public.emp_bio --from-beginning
This is my connector
{
  "name": "postgres-connector-1",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "tasks.max": "1",
    "database.hostname": "localhost",
    "database.port": "5432",
    "database.user": "dbuser1",
    "database.password": "password",
    "database.dbname": "testdb",
    "database.server.name": "debezium",
    "database.whitelist": "testdb",
    "plugin.name": "wal2json",
    "database.history.kafka.bootstrap.servers": "localhost:9092",
    "database.history.kafka.topic": "postgres-hist-test",
    "include.schema.changes": "true"
  }
}
Expected: I can stream other activities like DELETE, DROP, UPDATE and CREATE from my PostgreSQL.
Error: Sorry, there is no error message anywhere!