A UI console to check the depth of IBM Message Hub

I have connected a producer to IBM Message Hub in Bluemix. How can I get a view of a topic and its depth in Message Hub? Is there a web console where I can see the message count?
Thanks
Raj

There isn't really a concept of topic depth in Kafka, because Kafka is a commit log. You're not reading messages off a queue; you're reading messages from a point in the log. You can specify where in the log you start, and reading a message doesn't remove it from the log. So the number of messages available is not affected by a read operation; only an individual consumer's position in the log moves.
The number of partitions per topic, along with the retention policy for each topic, can be found in the Message Hub UI that lists the topic names.
Input/output rates for each topic are available in the Grafana tool.
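That said, you can approximate a topic's "depth" as the number of records currently retained: the gap between each partition's beginning and end offsets. A minimal sketch with the plain Apache Kafka Java client (the bootstrap address and topic name below are placeholders, and the SASL settings Message Hub requires are omitted):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class TopicDepth {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder; add SASL config for Message Hub
            props.put("key.deserializer", ByteArrayDeserializer.class.getName());
            props.put("value.deserializer", ByteArrayDeserializer.class.getName());

            String topic = "my-topic"; // placeholder topic name
            try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
                List<TopicPartition> parts = new ArrayList<>();
                consumer.partitionsFor(topic)
                        .forEach(p -> parts.add(new TopicPartition(topic, p.partition())));

                // Retained count per partition = end offset minus beginning offset.
                Map<TopicPartition, Long> begin = consumer.beginningOffsets(parts);
                Map<TopicPartition, Long> end = consumer.endOffsets(parts);

                long total = 0;
                for (TopicPartition tp : parts) {
                    long retained = end.get(tp) - begin.get(tp);
                    total += retained;
                    System.out.printf("%s: %d records retained%n", tp, retained);
                }
                System.out.println("Total retained (approximate): " + total);
            }
        }
    }

Note this counts retained records, not unread ones; to measure a specific consumer group's backlog (lag), you would instead compare that group's committed offsets against the end offsets.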

Related

Kafka consumer stopped consuming topics from third party system

My consumers stopped consuming topics from a third-party system, but they still work with internal topics. The topics from the third-party system appear in the Kafka web view but are not consumed. The log shows:
Skipping fetch for partition because previous request to some-cluster has not been processed
I did some research and increased the heartbeat interval and max.poll.records, without success.
See: Kafka consumer does not fetch new records when using topic pattern and large messages and Kafka Consumer stopped consuming messages from topic.
We are using the SmallRye Reactive Messaging connector to fetch records.
How can I further debug or fix this problem?
I would take a look at the lowest level, i.e. the network connectivity provided by org.apache.kafka.clients.NetworkClient. Logging that at TRACE level will show you outbound requests and received responses (or timeouts). This should help you identify whether it's something on your side or some kind of backend misconfiguration.
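For example, assuming a Quarkus app (suggested by the SmallRye connector mentioned above), the category can be raised in application.properties; a plain Log4j setup has an equivalent one-liner:

    # application.properties (Quarkus / SmallRye)
    quarkus.log.category."org.apache.kafka.clients.NetworkClient".level=TRACE

    # log4j.properties (plain Log4j setups)
    log4j.logger.org.apache.kafka.clients.NetworkClient=TRACE

Expect very verbose output; enable this only while reproducing the problem.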

Kafka consumer group file not found at default location

We have observed that when an external user sends data through the API to this real-time portal, they get a success response, but when we try to see the data in reports, no data is found. I am trying to identify this issue.
In the infrastructure, there is a Kafka server with only one broker. When I tried to see the list of consumers and producers, I couldn't find the consumer group file. Can anyone suggest where to look for it, or offer any other suggestion?
Consumer groups are stored in the __consumer_offsets topic.
Topics have files in the Kafka data location, but they aren't directly readable from there.
There's nothing built into Kafka that'll allow you to see active producers.
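You can inspect consumer groups through the broker's admin API, which is what the bundled kafka-consumer-groups.sh tool talks to as well. A minimal sketch with the Java AdminClient (the broker address is a placeholder):

    import java.util.Properties;
    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.ConsumerGroupListing;

    public class ListGroups {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

            try (Admin admin = Admin.create(props)) {
                // Groups live in the broker (backed by __consumer_offsets),
                // not in a file you can read from disk.
                for (ConsumerGroupListing g : admin.listConsumerGroups().all().get()) {
                    System.out.println("Group: " + g.groupId());
                    admin.listConsumerGroupOffsets(g.groupId())
                         .partitionsToOffsetAndMetadata().get()
                         .forEach((tp, oam) ->
                             System.out.printf("  %s -> offset %d%n", tp, oam.offset()));
                }
            }
        }
    }

The command-line equivalent is kafka-consumer-groups.sh --bootstrap-server localhost:9092 --list (and --describe --group <name> for the committed offsets).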

Kafka - redriving events in an error topic

We've implemented some resilience in our kafka consumer by having a main topic, a retry topic and an error topic as outlined in this blog.
I'm wondering what patterns teams are using out there to redrive events in the error topic back into the retry topic for reprocessing. Do you use some kind of GUI to help do this redrive? I foresee a need to potentially append all events from the error topic into the retry topic, but also to selectively skip certain events in the error topic if they can't be reprocessed.
Two patterns I've seen:
1. Redeploy the app with a new topic config (via environment variables or other external config).
2. Use a scheduled task within the code that checks the upstream DLQ topic(s); see the sketch after this answer.
If you want to use a GUI, that's fine, but it seems like more work for little gain, as there's no tooling already built around that.
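As a concrete example of the second pattern, a one-shot redrive pass with the plain Kafka Java client might look like the sketch below; the topic names and the "poison" header used as a skip rule are hypothetical:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.*;
    import org.apache.kafka.clients.producer.*;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class Redrive {
        public static void main(String[] args) {
            String errorTopic = "orders.error"; // hypothetical
            String retryTopic = "orders.retry"; // hypothetical

            Properties c = new Properties();
            c.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            c.put(ConsumerConfig.GROUP_ID_CONFIG, "redrive-tool");
            c.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            c.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            c.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            Properties p = new Properties();
            p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
                consumer.subscribe(List.of(errorTopic));
                while (true) {
                    ConsumerRecords<String, String> batch = consumer.poll(Duration.ofSeconds(5));
                    if (batch.isEmpty()) break; // caught up; end this redrive run
                    for (ConsumerRecord<String, String> rec : batch) {
                        // Selectively skip events that can't be reprocessed.
                        if (rec.headers().lastHeader("poison") != null) continue;
                        producer.send(new ProducerRecord<>(retryTopic, rec.key(), rec.value()));
                    }
                    producer.flush();
                    consumer.commitSync(); // commit only after the batch is forwarded
                }
            }
        }
    }

Committing the error-topic offsets only after the forwarded records are flushed means a crash mid-run re-sends a batch rather than losing it (at-least-once).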

How do I view the consumed messages of Kafka in Nifi?

I have started a NiFi processor (ConsumeKafka) and connected it to a topic. It is running, but I don't know where I can view the messages.
The ConsumeKafka processor runs and generates a flowfile for each message. Only when you connect it to other components, like another processor or an output port, will you be able to visualize the data moving through.
For starters you can try this:
1. Connect ConsumeKafka with LogAttribute or any other processor, for that matter.
2. Stop or disable the LogAttribute processor.
3. Now when you start ConsumeKafka, all the received messages from the configured Kafka topic will be queued up in the form of flowfiles.
4. Right click the relationship where the flowfiles are queued up, click List Queue, and you can access the queue.
5. Click any item in the queue and a context menu will come up. Click the View button and you can see the data.
This whole explanation of "viewing" the Kafka messages is just to help you debug and get started with NiFi. Ideally you would use other NiFi processors to work out your use case.
Example
Say you receive messages from Kafka and want to write them to MongoDB; the flow can be: ConsumeKafka -> PutMongo.
Note:
There are record-based processors like ConsumeKafkaRecord and PutMongoRecord, but they do basically the same thing with more enhancements. Since you're new to this, I have suggested a simple flow. You can find details about the record-based processors here and try them.
You might need to consume messages from the beginning (--from-beginning) if those messages have been consumed before (and therefore offsets have been committed).
On the GetKafka processor, there is a property Auto Offset Reset which should be set to smallest, the equivalent of --from-beginning in the Kafka console consumer.
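For reference, smallest/largest are the old consumer's value names; the modern Java consumer calls them earliest/latest. A fresh group id plus auto.offset.reset=earliest replays the topic from the start, just like --from-beginning. A minimal sketch (broker address and topic name are placeholders):

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ReadFromBeginning {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
            // A group id with no committed offsets plus "earliest" starts at
            // the beginning of the log, like --from-beginning.
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "debug-" + System.currentTimeMillis());
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("my-topic")); // placeholder topic
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> rec : records) {
                    System.out.printf("offset %d: %s%n", rec.offset(), rec.value());
                }
            }
        }
    }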