Sending Kafka metrics via OpenTelemetry - apache-kafka

I have a Spring Boot application which produces data to and consumes data from Kafka.
I am trying to send Kafka metrics to the OpenTelemetry Collector so that they can be visualised via Grafana.
Does Spring Boot support sending Kafka metrics?

You need spring-kafka or kafka-clients to get Kafka metrics, not just Boot.
Outside of using opentelemetry-javaagent.jar, you can use Spring Actuator to expose Kafka client metrics (and more) to a Prometheus scrape target, which can then be visualized in Grafana.
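For example, since Actuator's metrics are backed by Micrometer, the Kafka client metrics can be bound to the same registry that backs the /actuator/prometheus endpoint. A minimal sketch, assuming micrometer-core is on the classpath and a consumer bean exists (the bean wiring and types here are illustrative, not from the question):

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.binder.kafka.KafkaClientMetrics;
import org.apache.kafka.clients.consumer.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Bind an existing Kafka consumer's client metrics to the Micrometer registry
// that Spring Boot Actuator exposes at /actuator/prometheus.
@Configuration
public class KafkaMetricsConfig {

    @Bean
    public KafkaClientMetrics kafkaClientMetrics(Consumer<String, String> consumer,
                                                 MeterRegistry registry) {
        KafkaClientMetrics metrics = new KafkaClientMetrics(consumer);
        metrics.bindTo(registry); // kafka.* meters become scrapeable by Prometheus
        return metrics;
    }
}

In recent Spring Boot / spring-kafka versions the consumer and producer factories register similar kafka.* meters automatically when Micrometer is present, so manual binding is often unnecessary.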

Related

How to display Kafka topic messages in Prometheus's TSDB using Kafka Connect

I want to monitor Kafka topic messages in Prometheus. I will be using Kafka Connect for the same, but I want to understand how to get the message content details into the Prometheus TSDB.
You would need to use PushGateway since Prometheus scrapes endpoints, and has no Kafka consumer API. Similarly, Kafka Connect Sinks don't tend to populate any internal metrics server, other than their native JMX server. In other words, JMX Exporter won't let you "see Kafka messages".
HTTP Kafka Connect sinks do exist, and you could try using one of them to send data to the PushGateway API.
However, this is a poor use-case for Prometheus.
InfluxDB, for example, can be populated by a Telegraf Kafka consumer (or Kafka Connect), or Elasticsearch by Logstash (or Kafka Connect).
Then Grafana can use InfluxDB or Elasticsearch as a datasource to view events, or it can use a regular RDBMS.
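To make that concrete, here is a rough sketch of "your own consumer plus PushGateway" using the Prometheus Java simpleclient. The broker address, topic, job name, and metric name are all placeholders, and note that it only pushes counts, not message contents:

import io.prometheus.client.CollectorRegistry;
import io.prometheus.client.Counter;
import io.prometheus.client.exporter.PushGateway;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

// A plain consumer that counts records per topic and pushes the counter to a
// PushGateway, since Prometheus itself will not consume from Kafka.
public class TopicCountPusher {
    public static void main(String[] args) throws Exception {
        CollectorRegistry registry = new CollectorRegistry();
        Counter records = Counter.build()
                .name("kafka_records_consumed_total")
                .help("Records consumed from the topic")
                .labelNames("topic")
                .register(registry);

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "metrics-pusher");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        PushGateway gateway = new PushGateway("localhost:9091");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));
            while (true) {
                ConsumerRecords<String, String> polled = consumer.poll(Duration.ofSeconds(1));
                polled.forEach(r -> records.labels(r.topic()).inc());
                gateway.pushAdd(registry, "kafka_topic_counts"); // push after each poll
            }
        }
    }
}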

How to consume message from a topic in Prometheus

I am working on a Kafka --> Prometheus --> Grafana pipeline. I have a Java application which sends messages to a Kafka topic, but Prometheus shows only the message count of the topic. I am running an instance of JMX Exporter when I run Kafka.
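# Attach the Prometheus JMX Exporter as a Java agent so broker JMX metrics are served on port 7076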
export JMX_YAML=/home/kafka_2.12-2.3.0/prometheus/kafka-0-8-2.yml
export JMX_JAR=/home/kafka_2.12-2.3.0/prometheus/jmx_prometheus_javaagent-0.6.jar
export KAFKA_OPTS="$KAFKA_OPTS -javaagent:$JMX_JAR=7076:$JMX_YAML"
bin/kafka-server-start.sh config/server.properties
But I need to read the topic data in Prometheus. Is there any direct Kafka-to-Prometheus importer?
I have heard about the Kafka Connect framework. How do I configure it inside Prometheus?
Prometheus doesn't run Kafka Connect; you would have to configure that separately.
Also, Prometheus is pull-based, so you would at the very least have to use PushGateway, assuming such a Kafka connector did exist.
If you just want to ultimately display data in Grafana, there are existing connectors for Elasticsearch, InfluxDB, Cassandra, and most JDBC databases.
Telegraf or Logstash could be used as alternatives to Kafka Connect, as well, or you can write your own consumer.
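If you do write your own consumer and want to stay pull-based rather than go through PushGateway, the consumer process can expose its own scrape endpoint. A sketch using the simpleclient HTTP server (the port and metric name are arbitrary):

import io.prometheus.client.Counter;
import io.prometheus.client.exporter.HTTPServer;

// Expose a /metrics scrape endpoint from the consumer process itself, so
// Prometheus can pull whatever counters the poll loop updates.
public class ConsumerMetricsEndpoint {
    static final Counter RECORDS = Counter.build()
            .name("events_consumed_total")
            .help("Records consumed by this application")
            .register();

    public static void main(String[] args) throws Exception {
        HTTPServer server = new HTTPServer(9400); // serves the default registry on :9400/metrics
        // ... run the Kafka poll loop here and call RECORDS.inc() per record ...
    }
}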

How to collect JMX metrics from the Kafka metrics endpoint using a Telegraf input plugin

I am using the TICK stack.
I have to import data from the Kafka metrics endpoint into InfluxDB. Can I do it without integrating the Jolokia Telegraf plugin? I have all instances running in k8s.
Is there a way to use the metrics endpoint and put data into InfluxDB?
Kafka exposes its metrics using JMX, which you can send to InfluxDB with jmxtrans.
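For context, those broker metrics are ordinary JMX MBeans, which is what tools like jmxtrans (or Telegraf via Jolokia) read. A minimal sketch that reads one of them directly over remote JMX; the hostname, port, and chosen MBean are examples, and the broker must be started with JMX_PORT set:

import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

// Read a single broker metric (messages-in rate) over remote JMX.
public class JmxPeek {
    public static void main(String[] args) throws Exception {
        JMXServiceURL url = new JMXServiceURL(
                "service:jmx:rmi:///jndi/rmi://kafka-broker:9999/jmxrmi");
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection mbsc = connector.getMBeanServerConnection();
            ObjectName messagesIn = new ObjectName(
                    "kafka.server:type=BrokerTopicMetrics,name=MessagesInPerSec");
            Object rate = mbsc.getAttribute(messagesIn, "OneMinuteRate");
            System.out.println("MessagesInPerSec OneMinuteRate = " + rate);
        }
    }
}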

Is there a way to get Kafka Streams and Camel metrics in Telegraf using Micrometer

I have Kafka Streams and Camel in my application, and I want to fetch some metrics from them and send them to InfluxDB via Telegraf. For all the other metrics in my application we are using Micrometer. Is there a way to fetch Camel route metrics and Kafka Streams metrics using Micrometer?
For Apache Camel metrics there is the Micrometer component, which lives in the Apache Camel repo. At the time of writing there is no out-of-the-box support for Kafka Streams metrics yet, although there is a Micrometer issue for Kafka that targets 1.1.0-rc.1.
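For the Camel side, wiring the Micrometer component usually comes down to registering its route policy factory so every route reports timers and counters into a Micrometer registry. A rough sketch, assuming camel-micrometer is on the classpath (treat it as an outline rather than a tested setup):

import org.apache.camel.CamelContext;
import org.apache.camel.component.micrometer.routepolicy.MicrometerRoutePolicyFactory;
import org.apache.camel.impl.DefaultCamelContext;

// Attach Micrometer-based route policies so each Camel route records metrics
// into the Micrometer registry the application is already using.
public class CamelMicrometerSetup {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutePolicyFactory(new MicrometerRoutePolicyFactory());
        // ... add routes as usual ...
        context.start();
    }
}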

What's the difference between Spring Cloud Bus and Spring for Apache Kafka?

Using Spring for Apache Kafka, or Spring AMQP, I can achieve message pub/sub. Spring Cloud Bus uses Kafka/RabbitMQ to do approximately the same thing; what's the difference between them?
Spring Cloud Bus is an abstraction built on top of Spring Cloud Stream (and hence Kafka and RabbitMQ). It is not general purpose, but is built for sending administrative commands to multiple nodes of a service at once. For example, sending a refresh event (from Spring Cloud Commons) to all nodes of the user service. There is only one channel, whereas in Spring Cloud Stream there are many. Think of it as a distributed Spring Boot Actuator.
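To make the contrast concrete, plain Spring for Apache Kafka is general-purpose pub/sub over whatever topics you define; a minimal sketch (topic, group, and payload type are placeholders):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Ordinary application-level messaging: any topic, any payload you configure.
@Component
public class OrderEvents {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderEvents(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(String orderId) {
        kafkaTemplate.send("orders", orderId);
    }

    @KafkaListener(topics = "orders", groupId = "billing")
    public void onOrder(String orderId) {
        // business logic, not an administrative command
    }
}

Spring Cloud Bus, by contrast, hides the topic entirely and only carries administrative events such as the refresh described above.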