Is it possible to send a message from a Java producer into a Kafka topic and consume that same message from the same topic through a Python consumer?
I'm only able to produce and consume data from Python, but I want to produce data from Java into a Kafka topic and consume it through a Python consumer.
Yes. A producer written in any language can send a message to a Kafka topic, and a consumer written in any other language can then consume the same message from the same topic; in your case, Java and Python respectively.
I want producer data from java
That's exactly what the built-in bin/kafka-console-producer script does. It is a shell command wrapped around Java producer code.
If you can consume those records from your existing Python consumer, that's exactly what you are asking for.
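If you want to write the Java side yourself rather than use the console script, a minimal producer looks roughly like this (the broker address `localhost:9092` and topic name `demo-topic` are placeholders; adjust for your cluster):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class DemoProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // broker address is an assumption; point it at your cluster
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // try-with-resources flushes and closes the producer on exit
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "key", "hello from Java"));
        }
    }
}
```

Any Python client (e.g. kafka-python or confluent-kafka) subscribed to `demo-topic` will then receive this record, as long as its deserializers match the serializers used on the Java side (plain UTF-8 strings in this sketch).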
I have to publish messages from a Kafka topic to Lambda, process them, and store them in a database using a Spring Boot application. I did some research and found something to consume messages from Kafka:
public Function<KStream<String, String>, KStream<String, String>> process(){} However, I'm not sure if this is only used to publish the consumed messages to another Kafka topic, or whether it can be used as an event source for Lambda. I need some guidance on consuming Kafka messages and converting them into a Lambda event source.
Brokers do not push. Consumers always poll.
The code shown is for the Kafka Streams API, which primarily writes to new Kafka topics. While you could fire HTTP events from a Streams processor to invoke a Lambda, that's not recommended.
Alternatively, Kafka is already supported as an event source. You don't need to write any consumer code.
https://aws.amazon.com/about-aws/whats-new/2020/12/aws-lambda-now-supports-self-managed-apache-kafka-as-an-event-source/
This is possible from MSK or a self-managed Kafka cluster.
process them and store in a database
Your Lambda could process the data and send it to a new Kafka topic using a producer. You can then use MSK Connect, or run your own Kafka Connect cluster elsewhere, to dump records into a database. No Spring/Java code would be necessary.
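For the topic-to-database step, a Kafka Connect JDBC Sink configuration along these lines would work (this assumes the Confluent JDBC connector; the connector name, topic, and connection details are all placeholders):

```json
{
  "name": "processed-orders-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "processed-orders",
    "connection.url": "jdbc:postgresql://db-host:5432/mydb",
    "connection.user": "kafka",
    "connection.password": "secret",
    "insert.mode": "insert",
    "auto.create": "true"
  }
}
```

POSTing this to the Connect REST API (or creating it through MSK Connect) continuously writes records from the topic into the database table without any custom consumer code.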
I have an ActiveMQ Artemis JMS queue and a Kafka Source Connector. I want to write messages from this queue to multiple topics in parallel. I found that a Single Message Transform (SMT) could be the solution. I tried to configure RegexRouter, but it can only change the name of the topic.
I also tried to create 3 connector instances, but only one receives each message. I guess that's because a message is deleted from the queue at the first read.
How do I list the producers writing to a certain Kafka topic using the Kafka CLI?
There is no command-line tool that can list all producers for a certain topic.
That would require Kafka to have a central place where all producers and their metadata are stored, which is not the case (as opposed to consumers and their consumer groups).
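By contrast, consumer groups can be listed precisely because the brokers do track them (assuming a broker reachable on localhost:9092):

```shell
# lists all consumer groups known to the cluster; there is no producer equivalent
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --list
```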
What is the difference between the KafkaSpout and KafkaBolt objects? Usually a KafkaSpout is used for reading data from Kafka producers, but why do we use a KafkaBolt?
In Storm, bolts write data and spouts read it; a KafkaSpout is a Kafka consumer. Note that you read from the broker directly, not from producers.
For example, you can use a KafkaSpout to read anything, transform that data within the topology, then set up a KafkaBolt to produce data into Kafka.
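A sketch of that read/transform/write flow with the storm-kafka-client API might look like the following (the broker address, topic names, and `MyTransformBolt` are all assumptions, not part of the question):

```java
import java.util.Properties;

import org.apache.storm.kafka.bolt.KafkaBolt;
import org.apache.storm.kafka.bolt.selector.DefaultTopicSelector;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.TopologyBuilder;

public class KafkaTopologySketch {
    public static void main(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();

        // Spout: consumes records from an input topic
        builder.setSpout("kafka-spout", new KafkaSpout<>(
                KafkaSpoutConfig.builder("localhost:9092", "input-topic").build()));

        // Bolt: transforms tuples; MyTransformBolt is a hypothetical user-defined bolt
        builder.setBolt("transform-bolt", new MyTransformBolt())
               .shuffleGrouping("kafka-spout");

        // Bolt: produces the transformed tuples back into Kafka
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        builder.setBolt("kafka-bolt", new KafkaBolt<String, String>()
                        .withProducerProperties(producerProps)
                        .withTopicSelector(new DefaultTopicSelector("output-topic")))
               .shuffleGrouping("transform-bolt");
    }
}
```

So the spout is the consumer side and the bolt is the producer side of the same topology.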
I am using Kafka 2, and it looks like exactly-once is possible with:
Kafka Streams
a Kafka read/transform/write transactional producer
Kafka Connect
All of the above work between topics (both source and destination are topics).
Is it possible to have exactly-once with other destinations?
Sources and destinations (sinks) of Connect are not limited to topics, but the connector you use determines the delivery semantics; not all are exactly-once.
For example, a JDBC Source Connector polling a database might miss some records.
Sink Connectors reading out of Kafka will send every message from a topic, but it's up to the downstream system to acknowledge that delivery.
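The read/transform/write transactional producer mentioned in the question works roughly like this sketch (topic names and the `transform` helper are placeholders; `producerProps` must set `transactional.id` and `consumerProps` must set `isolation.level=read_committed`, and the `groupMetadata()` overload needs Kafka clients 2.5+):

```java
import java.time.Duration;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.TopicPartition;

KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);

producer.initTransactions();
while (true) {
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
    producer.beginTransaction();

    for (ConsumerRecord<String, String> record : records) {
        // transform() is a hypothetical user-defined function
        producer.send(new ProducerRecord<>("output-topic", record.key(), transform(record.value())));
    }

    // Commit the consumed offsets inside the same transaction,
    // so the read and the write succeed or fail atomically.
    Map<TopicPartition, OffsetAndMetadata> offsets = new HashMap<>();
    for (TopicPartition tp : records.partitions()) {
        List<ConsumerRecord<String, String>> partRecords = records.records(tp);
        long lastOffset = partRecords.get(partRecords.size() - 1).offset();
        offsets.put(tp, new OffsetAndMetadata(lastOffset + 1));
    }
    producer.sendOffsetsToTransaction(offsets, consumer.groupMetadata());
    producer.commitTransaction();
}
```

This atomicity is exactly why the pattern only gives exactly-once between topics: the transaction covers the Kafka offsets and the Kafka writes, not any external system.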