I've installed Confluent 3.3.0 and started ZooKeeper, the Schema Registry, and a Kafka broker. I have also downloaded the MongoDB connector from this link.
Description: I'm running the sink connector using the following command:
./bin/connect-standalone etc/kafka/connect-standalone.properties /home/username/mongo-connect-test/kafka-connect-mongodb/quickstart-couchbase-sink.properties
Problem: I'm getting the following error:
ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:91)
java.lang.IllegalAccessError: tried to access field org.apache.kafka.common.config.ConfigDef.NO_DEFAULT_VALUE from class org.radarcns.mongodb.MongoDbSinkConnector
Thanks for reading!
At its latest version, this connector uses an old version of the kafka-clients API. Specifically, it depends on a constructor of the class org.apache.kafka.common.config.AbstractConfig that no longer exists in Apache Kafka versions >= 0.11.0.0.
Confluent Platform version 3.3.0 uses Apache Kafka 0.11.0.0.
To fix this issue, the recommended approach is to update the connector code to use the most recent versions of the Apache Kafka APIs.
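As a sketch of what that update would involve, the connector's build could be pointed at the same Kafka version that Confluent 3.3.0 ships and then recompiled against it. The fragment below is illustrative only; the connector's actual build file and module layout may differ:

```xml
<!-- Hypothetical pom.xml fragment: depend on the connect-api matching
     the broker that Confluent Platform 3.3.0 ships (Apache Kafka 0.11.0.0),
     then fix any compile errors against the updated API -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>connect-api</artifactId>
    <version>0.11.0.0</version>
    <scope>provided</scope>
</dependency>
```

The `provided` scope keeps the API classes out of the packaged connector, since the Connect runtime supplies them at run time.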
Related
We are planning to upgrade our existing Apache Kafka cluster to Confluent Kafka. Is there any risk of data loss in the topics while upgrading? Also, the main reason we are upgrading is to use the S3 sink connector; is there any such connector available in Apache Kafka itself?
Unless you want to migrate to Confluent Server, there is nothing you need to migrate; Confluent Platform includes Apache Kafka.
Kafka Connect, on the other hand, is a pluggable environment and doesn't require any Confluent tools/systems other than the specific JAR file(s) for the S3 connector.
You can use the S3 sink connector from Apache Camel:
https://camel.apache.org/camel-kafka-connector/next/reference/connectors/camel-aws-s3-sink-kafka-sink-connector.html
You just need to download the S3 sink connector JAR file from this link:
https://camel.apache.org/camel-kafka-connector/next/reference/index.html
Copy the JAR file into the connector plugin path. The path depends on your worker properties: by default it is the relative path plugins/connectors, or whatever you set in the plugin.path property.
So you don't need to restart the cluster or lose any data.
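For reference, a sink configuration for the Camel connector might look roughly like the sketch below. The connector class and property names here are assumptions and should be verified against the linked Camel reference page:

```properties
# Illustrative sketch only -- verify the class and property names against
# the camel-aws-s3-sink reference page linked above
name=camel-s3-sink
connector.class=org.apache.camel.kafkaconnector.awss3sink.CamelAwss3sinkSinkConnector
topics=my-topic
camel.kamelet.aws-s3-sink.bucketNameOrArn=my-bucket
camel.kamelet.aws-s3-sink.region=us-east-1
camel.kamelet.aws-s3-sink.accessKey=...
camel.kamelet.aws-s3-sink.secretKey=...
```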
What is the difference between using the Apache Kafka Connect API and the Confluent Kafka Connect API?
As Mike mentioned, there is no core difference between Apache Kafka Connect and Confluent Kafka Connect.
As an example, here is how to use the JDBC connector plugin from Confluent with a MySQL database, reading data from MySQL and sending it to a Kafka topic.
For a quick demo on your laptop, follow these main steps:
Download Apache Kafka (from either Apache site, or you can download Confluent Platform)
Run a single Kafka broker using the kafka-server-start script.
Download kafka-connect-jdbc from Confluent Hub
Edit plugin.path in connect-standalone.properties to include the full path to the extracted kafka-connect-jdbc files.
Download the MySQL driver and copy it into the kafka-connect-jdbc folder with the other JARs (you should see the SQLite JAR is already there).
Create a JDBC source connector configuration file.
Run Kafka Connect in standalone mode with the JDBC source connector configuration:
$KAFKA_HOME/bin/connect-standalone.sh ../config/connect-standalone.properties ../config/jdbc-connector-config.properties
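For the connector configuration file mentioned in the steps above, a minimal jdbc-connector-config.properties could look like this sketch; the database name, credentials, and topic prefix are made-up values for illustration:

```properties
# Sketch of a JDBC source connector config; connection details and topic
# prefix are example values -- replace them with your own
name=mysql-jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://localhost:3306/testdb
connection.user=root
connection.password=secret
mode=incrementing
incrementing.column.name=id
topic.prefix=mysql-
```

With mode=incrementing, the connector polls each table for rows whose id column is larger than the last one it has seen, and writes them to topics named after the table with the mysql- prefix.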
Useful links
https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/
https://docs.confluent.io/current/connect/kafka-connect-jdbc/index.html
If you want to write code instead, then learn to use the Kafka producer API.
https://docs.confluent.io/current/clients/producer.html
https://kafka.apache.org/documentation/#producerapi
I am using the latest Greenwich.SR1, which includes spring-cloud-stream (version Fishtown.SR2), and locally I am starting Kafka version 2.2.0 using kafka_2.12-2.2.0.jar.
I want to use the latest Kafka client (2.1 or higher) with spring-cloud-stream because it contains some important bug fixes. But when I run my Spring app, its logs say:
INFO 37812 --- [main] o.a.kafka.common.utils.AppInfoParser : Kafka version : 2.0.1
How can I use spring-cloud-stream with the latest kafka client?
I want to use max.task.idle.ms from the latest StreamsConfig of the Kafka client, but it seems the latest spring-cloud-stream Kafka Streams binder doesn't support Kafka client 2.1.0 or higher?
Spring versioning rules don't allow us to upgrade to a new version of the kafka clients in a point release. Since Fishtown uses Spring for Apache Kafka 2.2, it uses the 2.0.x kafka-clients.
The next version of Spring for Apache Kafka will use the 2.2.0 clients (or 2.3.0 if it's available by then), so the next version of the binder will be based on the newer clients.
The message channel binder works when the kafka-clients is overridden but, unfortunately, the streams binder does not due to some internal API changes.
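For the message channel binder case, the override can be done through the kafka.version property that Spring Boot's dependency management exposes. A hypothetical pom.xml sketch (the exact version number is an assumption):

```xml
<!-- Sketch: override the managed kafka-clients version in a Spring Boot
     build; this helps the message channel binder but, as noted, not the
     streams binder -->
<properties>
    <kafka.version>2.1.0</kafka.version>
</properties>
```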
While starting a topology with a Kafka spout using the new Kafka version 2.1.0 and Storm version 1.2.2, I get a java.lang.ClassNotFoundException: kafka.api.OffsetRequest. I don't get this when I use Kafka version 0.10.0.1. Can you please help, as I want to be on the latest Kafka version?
I have tried all the latest Kafka versions starting with 2.*, but none of them work. Caused by: java.lang.ClassNotFoundException: kafka.api.OffsetRequest
kafka.api contains the old Scala classes. Many of these were removed in Kafka 2.x.
The majority of those classes were moved to org.apache.kafka.common.requests; there are ListOffsetRequest and OffsetFetchRequest there, so I'm not sure which one you're trying to use.
If Storm itself depends on these old APIs, then you are bound to them; your own processor cannot use the new APIs.
Plus, the Kafka server version itself only supports certain API calls from these new request classes.
Adding to this answer, I suspect you're using the storm-kafka library for Kafka integration. You need to migrate to storm-kafka-client, which is based on the new Kafka APIs. The documentation for the new module can be found here.
If you need to migrate committed offsets from storm-kafka, you can use the utility at https://github.com/apache/storm/tree/master/external/storm-kafka-migration. It will let you migrate without having to start over on your Kafka partitions.
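As a sketch, the migration to storm-kafka-client amounts to swapping the build dependency; the versions below are assumptions and should match your Storm and Kafka releases:

```xml
<!-- Hypothetical pom.xml fragment: replace the old storm-kafka dependency
     with storm-kafka-client, which uses the new Kafka consumer API -->
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-kafka-client</artifactId>
    <version>1.2.2</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.1.0</version>
</dependency>
```

The spout class also changes: topologies built on the old library configure their spouts differently from org.apache.storm.kafka.spout.KafkaSpout in the new module, so expect code changes alongside the dependency swap.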
Is there any open-source tool to monitor Confluent Kafka? Most of the open-source tools available are specific to Apache Kafka, not Confluent Kafka.
We want to monitor at least the connectors, streams, and cluster health.
The Kafka that is distributed in the Confluent Platform is Apache Kafka. There really is no such thing as "Confluent Kafka". Any tools that work with the latest version of Apache Kafka (including Kafka Connect and Kafka Streams) will work with the same versions of Kafka included with Confluent Open Source.
Confluent 3.3 includes Apache Kafka 0.11
Confluent 3.2 includes Apache Kafka 0.10.2
Confluent 3.1 includes Apache Kafka 0.10.1
Confluent 3.0 includes Apache Kafka 0.10.0
Confluent 2.0 includes Apache Kafka 0.9
Confluent 1.0 includes Apache Kafka 0.8.2
Note: Confluent Enterprise includes its own monitoring and management GUI called Control Center. Control Center is a separate process, so Apache Kafka itself is still the same as the open-source version.
You can use an updated version of KafkaOffsetMonitor. It supports SSL/TLS and Kerberos, and it uses the Kafka 1.1.0 library.
You should be able to use kafka-monitor for monitoring your cluster's health, as well as Burrow and KafkaOffsetMonitor for monitoring your consumer application lag. Also, you should definitely use something like jmxtrans for collecting your Kafka broker metrics.