I have installed Apache Kafka on a Windows system and have tried a console producer-consumer example. When I add new consumers to the topic, only the messages produced after the consumer was added are printed to the console. Is there any way to get all the messages of a particular topic?
You need to add the --from-beginning flag to your console consumer command to get all messages.
Example command:
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning
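Note that the --zookeeper flag only works on older Kafka releases; it was removed from the console consumer in Kafka 2.0. On newer versions the equivalent command connects directly to a broker (assuming the default broker port 9092):

```shell
# Newer Kafka versions: point the consumer at the broker instead of ZooKeeper
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
```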
Try this command in Windows PowerShell:
.\bin\windows\kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic cities --from-beginning
I am running Kafka in windows.
I am creating a Kafka console producer with below command
C:\kafka_2.11-2.4.0\bin\windows>kafka-console-producer.bat --broker-list localhost:9092 --topic input-topic
and creating kafka console consumer with below command
C:\kafka_2.11-2.4.0\bin\windows>kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic input-topic
The problem is that I am receiving messages continuously. Yesterday it showed only the messages I typed into the console producer.
How do I stop it from sending messages automatically?
It should only send a message when I type one; otherwise it should not send blank messages.
By default, the console consumer starts from the latest offset of the topic (only new messages)
If you want to read existing data, you must add --from-beginning
If you want to preserve offsets between runs, you need a --group
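As a sketch combining both flags (the group name my-group is an arbitrary example):

```shell
# --from-beginning reads all existing data on the first run; because a
# --group is given, offsets are committed, so a later run with the same
# group resumes from where the previous run stopped instead of rereading.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic input-topic --group my-group --from-beginning
```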
Need help with the steps below: how to consume the messages from a Kafka topic and store them in a directory /tmp/kafka-messages
Problem statement :
Create a kafka consumer to consume messages from topic 'Multibrokerapplication' and store them in '/tmp/kafka-messages'
Step 1: I'm able to consume the messages published to the topic 'Multibrokerapplication' as shown below.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic Multibrokerapplication --from-beginning
But how do I achieve the second step, storing them in the folder /tmp/kafka-messages, from the command line?
Could you please suggest?
Thanks
Output redirection
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic Multibrokerapplication --from-beginning >> /tmp/kafka-messages
But I would suggest you try the connect-standalone command with the FileStreamSink connector class instead
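A sketch of that alternative, assuming the stock Kafka distribution (the properties file path and name are placeholders; the connector class ships with Kafka):

```shell
# Write a hypothetical file-sink connector config
cat > /tmp/connect-file-sink.properties <<'EOF'
name=local-file-sink
connector.class=org.apache.kafka.connect.file.FileStreamSinkConnector
tasks.max=1
topics=Multibrokerapplication
file=/tmp/kafka-messages
EOF

# Run a standalone Connect worker that continuously appends
# each record's value to /tmp/kafka-messages
bin/connect-standalone.sh config/connect-standalone.properties /tmp/connect-file-sink.properties
```

Unlike shell redirection, the Connect worker tracks its own offsets, so it survives restarts without rereading the topic.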
I have Kafka topics, the server name and port, and also the broker list.
How can I see the data from a consumer with these details?
Try running this from your kafka directory:
bin/kafka-console-consumer.sh --bootstrap-server <BROKER_IP>:<PORT> --topic <TOPIC_NAME> --from-beginning
It's the quickest way to test consuming from a topic.
I've spent some hours to figure out what was going on but didn't manage to find the solution.
Here is my set up on a single machine:
1 zookeeper running
3 broker running (on port 9092/9093/9094)
1 topic with 3 partitions and a replication factor of 3 (each partition is properly assigned between brokers)
I'm using the kafka console producer to insert messages. If I check the replication offset (cat replication-offset-checkpoint), I see that my messages are properly ingested by Kafka.
Now I use the kafka console consumer (new):
sudo bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --from-beginning --topic testTopicPartitionned2
I don't see anything consumed. I tried deleting my logs folders (/tmp/kafka-logs-[1,2,3]) and creating new topics; still nothing.
However when I use the old kafka consumer:
sudo bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic testTopicPartitionned2
I can see my messages.
Am I missing something big here to make this new consumer work?
Thanks in advance.
Check what setting the consumer is using for the auto.offset.reset property.
This determines where a consumer group without a previously committed offset will start reading messages from a partition.
Check the Kafka docs for more on this.
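With the console consumer you can pass this property explicitly, for example:

```shell
# earliest = start from the oldest available offset when the group has
# no committed offset (the default for a fresh group is latest)
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic testTopicPartitionned2 \
  --consumer-property auto.offset.reset=earliest
```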
Try providing all of your brokers in the --bootstrap-server argument to see if you notice any difference:
sudo bin/kafka-console-consumer.sh --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --from-beginning --topic testTopicPartitionned2
Also, your topic name is rather long. I assume you've already made sure you provide the correct topic name.