How can I delete a Debezium connector? I am following this tutorial,
https://debezium.io/documentation/reference/tutorial.html, and I see how to register a connector, but I couldn't figure out how to delete or update one.
curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '{
  "name": "inventory-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "database.hostname": "mysql",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "dbz",
    "database.server.id": "184054",
    "database.server.name": "dbserver1",
    "database.include.list": "inventory",
    "database.history.kafka.bootstrap.servers": "kafka:9092",
    "database.history.kafka.topic": "dbhistory.inventory"
  }
}'
Can you also please point me to the documentation page where deleting and updating a connector is described?
The Debezium connector is a standard connector that you plug into the Kafka Connect framework. The Kafka Connect framework supports several REST commands for interacting with it.
To delete a connector, submit a DELETE request:
curl -i -X DELETE localhost:8083/connectors/inventory-connector/
To update the configuration, submit a PUT request with the new configuration:
curl -i -X PUT -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/inventory-connector/config -d '{
  "connector.class": "io.debezium.connector.mysql.MySqlConnector",
  "tasks.max": "1",
  "database.hostname": "mysql",
  "database.port": "3306",
  "database.user": "debezium",
  "database.password": "dbz",
  "database.server.id": "184054",
  "database.server.name": "dbserver1",
  "database.include.list": "inventory",
  "database.history.kafka.bootstrap.servers": "kafka:9092",
  "database.history.kafka.topic": "dbhistory.inventory"
}'
Further REST API instructions
https://docs.confluent.io/platform/current/connect/references/restapi.html
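The same REST API also covers inspection and lifecycle operations; for example (a minimal sketch, assuming the same Connect worker at localhost:8083):
# List all connectors registered on the worker
curl -i -X GET localhost:8083/connectors
# Check the status of the connector and its tasks
curl -i -X GET localhost:8083/connectors/inventory-connector/status
# Restart the connector, e.g. after updating its configuration
curl -i -X POST localhost:8083/connectors/inventory-connector/restart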
Related
Creating the source connector:
curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "jdbc_source_mysql_01",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://fulfillmentdbhost:3306/fulfillmentdb",
    "connection.user": "fullfilment_user",
    "connection.password": "<password>",
    "topic.prefix": "order-status-update-",
    "mode": "timestamp",
    "table.whitelist": "fulfullmentdb.status",
    "timestamp.column.name": "LAST_UPDATED",
    "validate.non.null": false
  }
}'
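With timestamp mode and the topic.prefix above, rows from the status table should land on the order-status-update-status topic. A quick way to check is the console consumer (a sketch, assuming the Apache Kafka CLI tools are on the path and a broker is reachable at localhost:9092):
# Read a few records from the topic produced by the source connector
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic order-status-update-status \
  --from-beginning --max-messages 5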
Creating the sink connector:
curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "jdbc_sink_mysql_01",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:mysql://crmdbhost:3306/crmdb",
    "connection.user": "crm_user",
    "connection.password": "<password>",
    "topics": "order-status-update-status",
    "table.name.format": "crmdb.order_status"
  }
}'
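To confirm the sink is writing, you can query the target table directly (a sketch, assuming the mysql client can reach crmdbhost with the crm_user credentials from the config):
# Check that the sink connector has inserted rows into the target table
mysql -h crmdbhost -u crm_user -p -e "SELECT * FROM crmdb.order_status LIMIT 10;"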
The connector.class here comes from the Confluent community distribution, but I want one from Apache Kafka itself,
which is open source.
We were searching for how to replace that line with an Apache Kafka equivalent.
There is no JDBC connector provided by Apache Kafka itself.
The Confluent one is open source.
There are also JDBC connectors from IBM and Aiven.
Confluent (among other companies, as shown) simply writes plugins for Kafka Connect, which you need to download, install, and upgrade on your own Apache Kafka Connect workers.
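A minimal sketch of installing such a plugin on a plain Apache Kafka Connect worker (the ZIP name, plugin directory, and properties file path are assumptions; adjust to your installation):
# Unpack the downloaded connector archive into a plugins directory
mkdir -p /opt/kafka/plugins
unzip confluentinc-kafka-connect-jdbc-*.zip -d /opt/kafka/plugins/
# Point the Connect worker at that directory in its worker properties
echo "plugin.path=/opt/kafka/plugins" >> config/connect-distributed.properties
# Restart the worker so it picks up the new plugin
bin/connect-distributed.sh config/connect-distributed.properties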
I'm following the Confluent tutorial to produce messages (Produce and Consume Avro Messages), but when I post messages with an embedded schema through the REST proxy, it returns the following error and I don't know how to continue. I've looked in several places and none of them cover this error.
{
"error_code": 40801,
"message": "Error when registering schema. format = AVRO, subject = teste-value, schema = {\"type\":\"record\",\"name\":\"teste\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}"
}
curl -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
-H "Accept: application/vnd.kafka.v2+json" \
--data '{"value_schema": "{\"type\": \"record\", \"name\": \"teste\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "teste"}}]}' \
"http://localhost:38082/topics/teste"
Can you try formatting your request like the following and see if it works?
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"Payment\",\"namespace\":\"io.confluent.examples.clients.basicavro\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"amount\",\"type\":\"double\"}]}"}' http://localhost:8081/subjects/test-value/versions
I set up my MongoDB cluster with TLS authentication.
I can successfully connect to a mongos instance using:
/opt/cluster/stacks/mongoDB/bin/mongosh --tls --host $(hostname).domain.name --tlsCAFile /opt/cluster/security/ssl/cert.pem --port 27017
I have a Kafka Connect MongoDB sink with the following configuration:
{
  "name": "client-order-request-mongodb-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "database": "Trading",
    "collection": "ClientOrderRequest",
    "topics": "ClientOrderRequest",
    "connection.uri": "mongodb://hostname1.domain.name:27017,pre-hostname2.domain.name:27017",
    "mongo.errors.tolerance": "all",
    "mongo.errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "writemodel.strategy": "com.mongodb.kafka.connect.sink.writemodel.strategy.ReplaceOneBusinessKeyStrategy",
    "document.id.strategy": "com.mongodb.kafka.connect.sink.processor.id.strategy.PartialValueStrategy",
    "document.id.strategy.overwrite.existing": "true",
    "document.id.strategy.partial.value.projection.type": "allowlist",
    "document.id.strategy.partial.value.projection.list": "localReceiveTime,clientId,orderId"
  }
}
It works fine if I redeploy MongoDB without authentication, but now, when I try to instantiate it with the following curl command:
curl -X POST -H "Content-Type: application/json" --data '@connect-task-sink-mongodb-client-order-request.json' $KAFKA_CONNECT_LEADER_NODE/connectors/
I get the following error:
{"error_code":400,"message":"Connector configuration is invalid and contains the following 1 error(s):\nUnable to connect to the server.\nYou can also find the above list of errors at the endpoint /connector-plugins/{connectorType}/config/validate"}
From the MongoDB Kafka Connect sink documentation I found that I needed to set the KAFKA_OPTS environment variable, so before starting the distributed Connect server I do:
export KAFKA_OPTS="\
-Djavax.net.ssl.trustStore=/opt/cluster/security/ssl/keystore.jks \
-Djavax.net.ssl.trustStorePassword=\"\" \
-Djavax.net.ssl.keyStore=/opt/cluster/security/ssl/keystore.jks \
-Djavax.net.ssl.keyStorePassword=\"\""
Notice that I put an empty password, because when I list the entries of my keystore with:
keytool -v -list -keystore key.jks
I just press Enter when the password is prompted.
So the issue was that the TLS/SSL connection wasn't enabled on the client side.
If you want to enable it with the MongoDB Kafka Connect plugin, you need to state it in the connection.uri config parameter, for example:
"connection.uri":"mongodb://hostname1.domain.name:27017,pre-hostname2.domain.name:27017/?ssl=true"
I am having trouble producing to a Kafka topic through the REST API proxy.
I have a running Confluent Kafka cluster in which I'd like to create a topic and produce a message to it using the REST API.
For that purpose I have followed the documentation and created an API key and secret.
I managed to create the topic:
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
-d "{\"topic_name\":\"test1\",\"partitions_count\":6,\"configs\":[]}" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics" | jq
-------------------
returns:
{
  "kind": "KafkaTopic",
  "metadata": {
    "self": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1",
    "resource_name": "crn:///kafka=lkc-1zoqz/topic=test1"
  },
  "cluster_id": "lkc-1zoqz",
  "topic_name": "test1",
  "is_internal": false,
  "replication_factor": 0,
  "partitions_count": 0,
  "partitions": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/partitions"
  },
  "configs": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/configs"
  },
  "partition_reassignments": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/partitions/-/reassignment"
  },
  "authorized_operations": []
}
MY PROBLEM:
I can't produce to that topic (I can't produce to ANY topic through the Kafka REST API):
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
--data "{"records":[{"value":"S2Fma2E="}]}" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics/test1"
-----------------
returns:
{"error_code":405,"message":"HTTP 405 Method Not Allowed"}
ALSO TRIED LIKE THIS:
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
-d "{ \"key\": {\"value\": \"S2Fma2E=\"} }" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics/test1/records"
----------------
Returns the same:
{"error_code":405,"message":"HTTP 405 Method Not Allowed"}
I don't know if this has something to do with ACL management? I'm looking into that right now...
Are you using Confluent Platform (hosted) or the Confluent Cloud API?
If the latter, this API is not supported; here's the documentation on what the Cloud API supports: https://docs.confluent.io/cloud/current/api.html
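As an alternative, a standard Kafka client can produce to the Confluent Cloud cluster using the API key and secret as SASL credentials. A sketch (it assumes the bootstrap address is the same pkc- host on port 9092, and the key/secret placeholders must be filled in):
# client.properties with the API key/secret as SASL PLAIN credentials
cat > client.properties <<'EOF'
bootstrap.servers=pkc-xmzwx.europe-central2.gcp.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";
EOF
# Produce a test record to the topic created above
echo 'Kafka' | kafka-console-producer.sh \
  --bootstrap-server pkc-xmzwx.europe-central2.gcp.confluent.cloud:9092 \
  --topic test1 --producer.config client.properties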
I need to implement Kafka Connect, so I have referred to this video
https://www.youtube.com/watch?v=r7LUbtOFcQI
and implemented the steps accordingly.
While starting Kafka Connect, I need to run a curl command, but when running the command below I am facing an issue:
curl -X POST -H "Content-Type: application/json" --data '{
"name": "jdbc_source_connector",
"config": {
"connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
"tasks.max":2,
"connection.url": "jdbc:mysql://69.61.32.102:3306/ssptest",
"connection.user": "sspuser",
"connection.password": "xxx",
"topic.prefix": "",
"poll.interval.ms" : 3600000,
"table.whitelist" : "Persons",
"mode":"incrementing"
}
}'http://localhost:8083/connectors
I get the following error:
curl: no URL specified
Can anyone please suggest a solution to fix this error?
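The likely cause is the missing space between the closing quote of the --data payload and the URL: the shell concatenates them into a single --data argument, so curl never receives a URL. A corrected invocation (same payload, with a space before the URL) would look like this:
curl -X POST -H "Content-Type: application/json" --data '{
  "name": "jdbc_source_connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": 2,
    "connection.url": "jdbc:mysql://69.61.32.102:3306/ssptest",
    "connection.user": "sspuser",
    "connection.password": "xxx",
    "topic.prefix": "",
    "poll.interval.ms": 3600000,
    "table.whitelist": "Persons",
    "mode": "incrementing"
  }
}' http://localhost:8083/connectors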