AWS MSK with Confluent Schema Registry - apache-kafka

Is it possible to use AWS MSK with the Confluent Schema Registry running as a Docker instance? I don't need any Kafka source or sink connectors. I want to serialize messages with Avro against the Confluent Schema Registry before publishing, and deserialize them the same way during consumption. What are all the properties I need to set on the Confluent schema-registry Docker container? When I try to run it I get this error: java.lang.RuntimeException: No endpoints found for security protocol [PLAINTEXT]. Endpoints found in ZK. Any pointers are greatly appreciated.
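A minimal Schema Registry configuration sketch for this setup, assuming the MSK cluster's default TLS listener on port 9094 (the broker hostname is a placeholder; with the Confluent Docker image, each key becomes an environment variable prefixed with SCHEMA_REGISTRY_ and with dots replaced by underscores):

# schema-registry.properties (placeholder hostname; MSK's TLS listener is 9094)
host.name=schema-registry
listeners=http://0.0.0.0:8081
kafkastore.bootstrap.servers=SSL://b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9094
# the "No endpoints found for security protocol [PLAINTEXT]" error usually means
# this was left at its PLAINTEXT default while the brokers only expose TLS
kafkastore.security.protocol=SSL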

Related

Snowflake Kafka Connect JMX configuration to get Snowflake connector metrics

Could someone please help me with a sample JMX config file to get metrics from a Kafka Connect cluster that is running the Snowflake Kafka connector? I am able to get most of the Kafka Connect-specific metrics; however, I couldn't find a way to extract metrics for the snowflake.kafka.connector MBean. I have gone through the Snowflake and Confluent documentation but couldn't find any solution for this issue.
MBean object name:
snowflake.kafka.connector:connector=connector_name,pipe=pipe_name,category=category_name,name=metric_name
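Not a definitive fix, but a sketch of a Prometheus jmx_exporter rule that would surface that domain. One common cause of missing metrics is an agent config that restricts scraped MBeans via whitelistObjectNames, in which case the snowflake.kafka.connector domain must be added alongside the existing entries. The attribute capture ($5) below is an assumption; verify the actual attribute names with jconsole first:

# jmx_exporter agent config (YAML) - a sketch, not verified against the connector
whitelistObjectNames:
  - "snowflake.kafka.connector:*"
rules:
  - pattern: 'snowflake.kafka.connector<connector=(.+), pipe=(.+), category=(.+), name=(.+)><>(.+)'
    name: snowflake_kafka_connector_$4_$5
    labels:
      connector: "$1"
      pipe: "$2"
      category: "$3"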

How to run the schema registry for Apache Kafka without the Confluent env

I cannot find any documentation on setting up a schema registry for a plain Apache Kafka environment without the help of Confluent; all the docs I find assume a Confluent environment.
See if this approach makes sense for your use case:
Apicurio Registry
You can host Apicurio in Kubernetes and have clients verify a message's schema against the registry before publishing to Kafka; a producer configuration sketch follows the links below.
Some good resources outlining how to use it:
https://itnext.io/stream-processing-with-apache-spark-kafka-avro-and-apicurio-registry-on-amazon-emr-and-amazon-13080defa3be
https://www.youtube.com/watch?v=xthbYl7xC74
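A producer configuration sketch, assuming the Apicurio Registry 2.x Avro serdes (the registry URL is a placeholder for your Kubernetes service):

# producer.properties - sketch using Apicurio's Avro serializer
value.serializer=io.apicurio.registry.serde.avro.AvroKafkaSerializer
# registry endpoint exposed by the Kubernetes service (placeholder)
apicurio.registry.url=http://apicurio-registry.registry.svc:8080/apis/registry/v2
# auto-register the schema on first publish
apicurio.registry.auto-register=true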

Kafka Connect Mongo on Kafka MSK

I am using Kafka MSK in AWS, so we don't have a native Kafka Connect with all the required connectors as on Confluent.
I work with the Kafka MongoDB connector, and I want to find a way to push the connector jar to an instance of the Kafka MSK cluster.
The path the jar should be pushed to is the plugin.path defined in the worker properties used by the connector.
Any way to make this work, please?
MSK doesn't give you a hosted Kafka Connect worker. You'd need to provision and run one yourself, e.g. on EC2; that worker would then connect to your Kafka cluster (MSK in this case). A worker configuration sketch is below.
To be clear: MSK is only the hosted Kafka brokers (and ZooKeeper). It does not include Kafka Connect, which is what you need in order to run connectors.
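A minimal worker configuration sketch (the broker hostname and plugin directory are placeholders; MSK's TLS listener is on port 9094):

# connect-distributed.properties on the self-managed EC2 worker
bootstrap.servers=b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9094
security.protocol=SSL
group.id=mongo-connect-cluster
# copy the MongoDB connector jar(s) into this directory
plugin.path=/opt/connectors
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# internal topics the distributed worker needs
offset.storage.topic=connect-offsets
config.storage.topic=connect-configs
status.storage.topic=connect-status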

Apache Kafka alongside Confluent Schema Registry

I'm using Azure HDInsight's managed Apache Kafka solution, since unfortunately there's no managed Confluent Kafka solution on Azure. Is it possible to run the Confluent Schema Registry and connect it to the HDInsight Apache Kafka cluster's brokers?
I'm hoping to install just the Schema Registry on a single VM, then using this line in the schema-registry.properties file, point it to the HDInsight cluster's list of brokers:
kafkastore.bootstrap.servers=PLAINTEXT://localhost:9092
Will this work? Or do the brokers need to be Confluent installations and not Apache?
The Apache Kafka brokers in Confluent Platform are Apache Kafka, so yes, you can self-host the Schema Registry and connect it to Apache Kafka from another distribution. Just point kafkastore.bootstrap.servers at the actual HDInsight broker addresses rather than localhost.
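A sketch of the relevant schema-registry.properties, with placeholder HDInsight worker-node hostnames in place of localhost:

# schema-registry.properties on the standalone VM
listeners=http://0.0.0.0:8081
# list the actual HDInsight brokers here, not localhost
kafkastore.bootstrap.servers=PLAINTEXT://wn0-kafka.example.internal.cloudapp.net:9092,PLAINTEXT://wn1-kafka.example.internal.cloudapp.net:9092
# topic the registry uses to store schemas (default shown)
kafkastore.topic=_schemas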

How to use Spring Cloud Stream to produce & consume Avro messages using Confluent Schema Registry?

I can run the example by following Start Streaming with Kafka and Spring Cloud, but unfortunately it doesn't use the Confluent Schema Registry. I read the Confluent Schema Registry part of the Spring Cloud Stream reference guide, but it didn't work with my Confluent 3.0.0, and the guide doesn't mention how to produce an Avro message using the Confluent Schema Registry. Can anyone guide me on how to achieve this? Thanks!
Spring Cloud Stream is not yet compatible with the Confluent Schema Registry; see the discussion in this thread: https://github.com/spring-cloud/spring-cloud-stream/issues/850
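In the meantime, a common workaround is to skip the binder's schema-registry support and configure the Confluent Avro serializer on the producer directly; a sketch (URLs are placeholders):

# plain Kafka producer configuration using Confluent's Avro serializer
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
# endpoint where the serializer registers and fetches Avro schemas
schema.registry.url=http://localhost:8081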