How to make fanout in Apache Kafka?

I need to send a message to all consumers, but first determine who should get this message. How can I do that with Kafka?
Should I use Kafka Streams to filter the data and then send it to consumers?
As I understand it, each consumer should be added to a unique consumer group, but how do I determine in real time who must receive a message?

Kafka decouples consumers and producers: when you write into a topic, you don't know which consumers might read the data.
Thus, in Kafka you never "send a message to a consumer", you just write the message into a topic and that's it.
Consumers just read from topics.
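Fan-out therefore falls out of the consumer-group model: every group that subscribes to a topic receives its own full copy of the stream. A minimal in-memory sketch of that semantics (plain Python, no broker; the `Topic` class is purely illustrative, not a client API):

```python
# Illustrative in-memory model of Kafka's fan-out semantics:
# each consumer *group* tracks its own offset, so every group
# sees every message independently of the others.
class Topic:
    def __init__(self):
        self.log = []            # append-only log of messages
        self.offsets = {}        # committed offset per consumer group

    def produce(self, message):
        self.log.append(message)

    def poll(self, group):
        """Return all messages the given group has not seen yet."""
        start = self.offsets.get(group, 0)
        records = self.log[start:]
        self.offsets[group] = len(self.log)  # "commit" the new offset
        return records

topic = Topic()
topic.produce("order-created")
topic.produce("order-paid")

# Two independent consumer groups: each gets the full stream.
print(topic.poll("billing"))    # ['order-created', 'order-paid']
print(topic.poll("shipping"))   # ['order-created', 'order-paid']
```

With a real client the same effect comes from giving each consumer a distinct `group.id`; the broker then delivers every message to every group.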

Related

Kafka Consumer writing to the same queue from where it is reading

I have a use case where I want to consume from a Kafka topic and, depending on some logic, if I cannot process a message right now, re-enqueue it to the same topic from which it was read.
Something like this
Topic1 ---> Consumer ---> Can't process now
  ^                             |
  |_________Re-enqueues_________|
Is it possible ?
Yes, this is possible.
However, be aware that depending on your retention settings the re-ingested message might exist in the topic multiple times. Also, the consumer will consume all messages as long as it is running, which could lead to a situation where it has consumed all valid messages but keeps re-ingesting the remaining ones over and over again.
The typical pattern to deal with messages that should be re-ingested into your pipeline is to send them to a dedicated Kafka topic. Once your consumer is fixed to be able to process those messages you can then have your consumer read that dedicated topic just once.
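The retry-topic pattern can be sketched like this (in-memory lists stand in for topics; the function names and the "broken" marker are illustrative, not a real client API):

```python
# Hypothetical sketch of the retry-topic pattern: messages that
# cannot be processed yet are produced to a dedicated retry topic
# instead of being re-enqueued into the source topic.
main_topic = ["msg-1", "msg-2 (broken)", "msg-3"]
retry_topic = []
processed = []

def can_process(msg):
    # Stand-in for whatever business check applies.
    return "broken" not in msg

for msg in main_topic:
    if can_process(msg):
        processed.append(msg)
    else:
        retry_topic.append(msg)       # park it in the retry topic

# Later, once the consumer is fixed, drain the retry topic once:
def process_fixed(msg):
    return msg.replace(" (broken)", " (repaired)")

processed.extend(process_fixed(m) for m in retry_topic)
print(processed)  # ['msg-1', 'msg-3', 'msg-2 (repaired)']
```

The key property: the source topic is read strictly forward, so valid messages are never re-consumed, and the retry topic is drained exactly once when the fix ships.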

Removing one message from a topic in Kafka

I'm new at using Kafka and I have one question. Can I delete only ONE message from a topic if I know the topic, the offset and the partition? And if not is there any alternative?
It is not possible to remove a single message from a Kafka topic, even if you know its partition and offset.
Keep in mind that Kafka is not a key/value store; a topic is rather an append-only(!) log that represents a stream of data.
If you are looking for alternatives to remove a single message you may
Have your consumer clients ignore that message
Enable log compaction and send a tombstone message
Write a simple job (e.g., with Kafka Streams) to consume the data, filter out that one message, and produce all remaining messages to a new topic.
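The third option can be sketched as a simple copy job. Here the topic is modeled as a list of (partition, offset, value) records and `copy_without` is an illustrative helper, not a Kafka API:

```python
# Sketch of the filter-and-copy approach: copy a topic to a new one,
# dropping exactly one record identified by its (partition, offset).
source_topic = [
    (0, 0, "good"),
    (0, 1, "poison pill"),   # the record we want gone
    (1, 0, "also good"),
]

def copy_without(records, partition, offset):
    """Keep every record except the one at the given coordinate."""
    return [r for r in records if (r[0], r[1]) != (partition, offset)]

new_topic = copy_without(source_topic, partition=0, offset=1)
print(new_topic)  # [(0, 0, 'good'), (1, 0, 'also good')]
```

In a real pipeline the same shape appears as a Kafka Streams topology (or a plain consume-filter-produce loop) writing to a fresh topic, after which clients are switched over to it.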

How do I archive Kafka message?

How can we archive Kafka messages? For example, if we want to send a particular message to some topic, can we archive that message and send it to that topic or some other topic?
Can we replay that message to the topic?
Can we replay based on particular offset?
send a particular message to some topic
That is a regular producer send request.
we archive that message
Kafka persists data for a configurable amount of time on its own. The default retention is one week (`log.retention.hours=168`).
send to that topic or some other topic
Again, a producer request can send to a specific topic. Kafka Streams or MirrorMaker can send to other topics, if needed
replay that message to the topic
Not clear... Replay from where? Generally, read the message and produce to a topic
replay based on particular offset
Yes, you can consume from a given TopicPartition + Offset coordinate within Kafka
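Seek-and-replay can be sketched in memory: a partition is an append-only list, and "seeking" just means reading from a chosen offset onward (the helper name is illustrative):

```python
# In-memory sketch of replaying from an offset: each record's
# position in the partition log *is* its offset.
partition_log = ["a", "b", "c", "d", "e"]

def replay_from(log, offset):
    """Return every record at or after the given offset."""
    return log[offset:]

print(replay_from(partition_log, 3))  # ['d', 'e']
print(replay_from(partition_log, 0))  # the whole partition again
```

With a real client such as kafka-python, the equivalent is assigning the partition explicitly and seeking, roughly `consumer.assign([tp])` followed by `consumer.seek(tp, offset)` where `tp` is a `TopicPartition`.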

Can we listen to specific keyed messages by Kafka Consumer

I am very new to Kafka. I have a requirement where I need to read only some specific messages.
For this, I will push a message to Kafka with some key and value. Is there a way to subscribe a consumer to a specific list of keys, so that when a message is pushed to Kafka, the consumer consumes it only if the message has a key the consumer is listening for?
The only way I can think of is to assign your consumer to the partitions to which the publisher sends messages with keys you're interested in. You'd also need to write your own partitioner that sends a message with key X always to the same partition.
The easier way: write the consumer so that it processes only the messages you want.
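Both ideas can be sketched together: a key-based partitioner so that interesting keys land in a known partition, plus the simpler consumer-side filter. This is a pure in-memory model; the hashing scheme is illustrative, not Kafka's actual default (which is murmur2 over the key bytes):

```python
# Illustrative model: route messages to partitions by key, then
# either read only the partition holding the keys you care about,
# or simply filter by key after consuming everything.
NUM_PARTITIONS = 3

def partition_for(key):
    # Stand-in for a custom partitioner: deterministic, so the
    # same key always maps to the same partition.
    return sum(key.encode()) % NUM_PARTITIONS

partitions = {p: [] for p in range(NUM_PARTITIONS)}
for key, value in [("alerts", "disk full"), ("metrics", "cpu 40%"),
                   ("alerts", "oom killed")]:
    partitions[partition_for(key)].append((key, value))

# Easier approach: consume everything and keep only wanted keys.
wanted = {"alerts"}
filtered = [v for part in partitions.values()
            for k, v in part if k in wanted]
print(filtered)  # ['disk full', 'oom killed']
```

Note that even with a custom partitioner, a partition may still carry other keys that hash to the same value, so a client-side key check is needed either way.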

Can i consume based on specific condition in Kafka?

I'm writing a message to Kafka and consuming it at the other end.
I do some processing on it and write it back to another Kafka topic.
I want to know which response message corresponds to which request.
Currently I've decided to capture the offset on the consumer side, write it into the response, and then read the response payload to match the two.
With this approach we need to read each message; is there any other way we can consume based on a condition in the consumer config?
Consumers can only read the whole topic. You can skip messages via seek(), but there are no conditions that you can evaluate on the broker to filter messages.
You will need to consume the whole topic and process/filter in the client.
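A common client-side alternative to matching on offsets is carrying a correlation id in the message itself. A minimal in-memory sketch (lists stand in for topics; the field names are illustrative):

```python
# Sketch: tag each request with a correlation id, have the
# processing side copy it into the response, and match
# request/response pairs on the consumer side.
import uuid

request_topic = []
response_topic = []

def send_request(payload):
    corr_id = str(uuid.uuid4())
    request_topic.append({"correlation_id": corr_id, "payload": payload})
    return corr_id

def process_requests():
    # The processing service echoes the correlation id back.
    for req in request_topic:
        response_topic.append({
            "correlation_id": req["correlation_id"],
            "payload": req["payload"].upper(),
        })

def response_for(corr_id):
    # The requester still reads every response and filters
    # client-side; the broker itself does no filtering.
    return next(r for r in response_topic
                if r["correlation_id"] == corr_id)

cid = send_request("hello")
process_requests()
print(response_for(cid)["payload"])  # HELLO
```

In real Kafka the correlation id would typically travel in the message key or a record header, but the matching still happens in the client, consistent with the answer above.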