What's the difference between Spring Cloud Bus and Spring for Apache Kafka? - spring-cloud

Using Spring for Apache Kafka, or Spring AMQP, I can achieve message pub/sub. Spring Cloud Bus uses Kafka/RabbitMQ to do approximately the same thing, so what's the difference between them?

Spring Cloud Bus is an abstraction built on top of Spring Cloud Stream (and hence Kafka and RabbitMQ). It is not general purpose; it is built for sending administrative commands to multiple nodes of a service at once, for example sending a refresh (from Spring Cloud Commons) to all nodes of the user service. There is only one channel, whereas in Spring Cloud Stream there are many. Think of it as a distributed Spring Boot Actuator.
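For a concrete picture, here is a minimal sketch of the kind of node the bus targets, assuming spring-cloud-starter-bus-amqp (or -kafka) is on the classpath and a broker is reachable; the class and property names are illustrative. POSTing to the bus refresh actuator endpoint on any one node (/actuator/busrefresh in recent releases) broadcasts a refresh event to every node, and beans in @RefreshScope are rebuilt with the new configuration:

    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.context.config.annotation.RefreshScope;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    public class UserServiceApplication {
        public static void main(String[] args) {
            SpringApplication.run(UserServiceApplication.class, args);
        }
    }

    // Re-created with fresh configuration when a refresh event arrives over the bus
    @RefreshScope
    @RestController
    class GreetingController {

        @Value("${greeting.message:hello}") // illustrative property
        private String message;

        @GetMapping("/greeting")
        public String greeting() {
            return message; // reflects the updated value after a bus refresh
        }
    }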

Related

Sending Kafka metrics via OpenTelemetry

I have a Spring Boot application which produces data to and consumes data from Kafka.
I am trying to send Kafka metrics to an OpenTelemetry collector so that they can be visualised in Grafana.
Does Spring Boot support sending Kafka metrics?
You need spring-kafka or kafka-clients on the classpath to get Kafka metrics; Boot alone is not enough.
Outside of using opentelemetry-javaagent.jar, you can use Spring Actuator to expose Kafka client metrics (and more) as a Prometheus scrape target, which can then be visualized in Grafana.
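For Kafka clients that Boot creates itself (for example through spring-kafka's factories), these metrics are usually registered automatically. For a hand-built client, a rough sketch of binding its metrics to Micrometer yourself might look like this; the bean names are illustrative, and micrometer-registry-prometheus is assumed on the classpath so the metrics show up at /actuator/prometheus:

    import io.micrometer.core.instrument.MeterRegistry;
    import io.micrometer.core.instrument.binder.kafka.KafkaClientMetrics;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class KafkaMetricsConfig {

        // Bridges the producer's internal Kafka metrics into the Micrometer
        // registry, which Actuator exposes as a Prometheus scrape target.
        @Bean
        public KafkaClientMetrics producerMetrics(KafkaProducer<String, String> producer,
                                                  MeterRegistry registry) {
            KafkaClientMetrics metrics = new KafkaClientMetrics(producer);
            metrics.bindTo(registry);
            return metrics;
        }
    }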

Spring Cloud Bus/Stream Issues in Spring Cloud 2020.0.0

We have a Spring Boot microservice that, as well as having HTTP endpoints, uses Spring Cloud Bus to pick up refresh events (from Rabbit) and also has a Spring Cloud Stream sink that picks up custom messages from another Rabbit topic.
After updating to Spring Boot 2.4.1 and Spring Cloud 2020.0.0 everything seemed to be working, until we discovered Spring Cloud Bus was no longer picking up events.
Looking into this, it turned out some of the Spring Cloud Bus internal channels were not getting created.
This wasn't happening in another service that didn't also have the stream functionality, so we tested disabling that, and the bus functionality then started working.
So it was obviously some sort of interference between the old-style stream model and the newer Spring Cloud Bus.
After updating our sink to use the new function model I still had issues, and eventually got both to work by including the following lines in our application.yml:
    spring:
      cloud:
        stream:
          bindings.mySink-in-0.destination: mytopic
          function.definition: busConsumer;mySink
So I have the following questions:
Did I miss something, or should there be better documentation on how stream/bus can affect each other and the migration to 2020.0.0?
Does my current configuration look correct?
It doesn't seem right to have to include busConsumer here - should the auto-configuration for it not be able to 'combine it in' with any other stream config?
What's the difference between spring.cloud.stream.function.definition and spring.cloud.function.definition? I've seen both in documentation, and Spring Cloud Bus seems to also be setting spring.cloud.function.definition=busConsumer
In org.springframework.cloud.stream.function.FunctionConfiguration, it does a search for @EnableBinding:

    if (ObjectUtils.isEmpty(applicationContext.getBeanNamesForAnnotation(EnableBinding.class)))

If found, functional binding is disabled, and you will see this log message:

    logger.info("Functional binding is disabled due to the presense of #EnableBinding annotation in your configuration");
After the upgrade, we needed to convert our listener classes to the functional model in order to activate functional binding. Once that is done, the cloud bus consumer binding gets created too.
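For reference, a sketch of what that migration can look like for a sink that previously used @EnableBinding/@StreamListener; MyMessage is a hypothetical payload type. The bean name mySink is what the bindings.mySink-in-0.destination property above refers to, since the binding name is derived from the bean name:

    import java.util.function.Consumer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class SinkConfiguration {

        // Replaces the old @StreamListener method; the input binding is
        // named after the bean: mySink-in-0
        @Bean
        public Consumer<MyMessage> mySink() {
            return message -> {
                // handle the message exactly as the old listener did
            };
        }
    }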

Use Spring Cloud Function and Spring Cloud Stream together?

Has anyone tried using Spring Cloud Function and Spring Cloud Stream together? Is there any reason this shouldn't be done? We currently use Spring Cloud Function, but there are certain cases where we need synchronous Kafka publishing, and it seems as if the only way to do that is with Spring Cloud Stream.
Thanks, Anne
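For context, a rough sketch of what synchronous publishing through Spring Cloud Stream can look like, using StreamBridge together with the Kafka binder's producer sync property; the binding name and Order type are illustrative:

    import org.springframework.cloud.stream.function.StreamBridge;
    import org.springframework.stereotype.Service;

    @Service
    public class OrderPublisher {

        private final StreamBridge streamBridge;

        public OrderPublisher(StreamBridge streamBridge) {
            this.streamBridge = streamBridge;
        }

        // With spring.cloud.stream.kafka.bindings.orders-out-0.producer.sync=true
        // the Kafka binder blocks until the broker acknowledges the record.
        public void publish(Order order) {
            streamBridge.send("orders-out-0", order);
        }
    }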

KAFKA Producer API Vs JMS Producer API

High-level design of the application:
An upstream system sends a stream of data, which is received by a Java application. Kafka is used as the data store; Logstash publishes the stored data to an Elasticsearch index, and all applications use Elasticsearch queries to get the data.
Problem: to publish data from the Java application to Kafka, which API should be used: the Kafka JMS client or the Java Kafka Producer/Consumer API?
As per the Kafka documentation, if you are interested in writing new Java applications you are encouraged to use the Java Kafka Producer/Consumer APIs, as they provide advanced features not available when using the kafka-jms-client: https://docs.confluent.io/current/clients/kafka-jms-client/docs/index.html
Also, as per the Kafka documentation, Kafka is not a typical messaging broker and not all JMS concepts map 1:1 to Kafka.
Is there any benefit to using the JMS API for Kafka, since Kafka is not a typical messaging broker (and the application will still be tightly coupled to Kafka) and not all JMS concepts can be mapped to Kafka?
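To make the comparison concrete, here is a minimal sketch of the native Producer API, which is what the documentation recommends for new Java applications; the broker address and topic name are placeholders:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class NativeProducerExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // send() returns a Future<RecordMetadata>; native features such as
                // explicit partitioning and record headers are available here but
                // have no JMS equivalent.
                producer.send(new ProducerRecord<>("ingest-topic", "key", "value"));
            }
        }
    }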

Kafka and Microservices using Producer Consumer

I need to use Apache Kafka in my microservices project. I need one microservice to produce data and another to consume the same data. How can I make Kafka do this between the two services?
I would recommend you to take a look at Spring Cloud Stream as it does exactly what you need.
From docs:
Framework for building message-driven microservices. Spring Cloud Stream builds upon Spring Boot to create DevOps friendly microservice applications and Spring Integration to provide connectivity to message brokers. Spring Cloud Stream provides an opinionated configuration of message brokers, introducing the concepts of persistent pub/sub semantics, consumer groups and partitions across several middleware vendors. This opinionated configuration provides the basis to create stream processing applications.
By adding @EnableBinding to your main application, you get immediate connectivity to a message broker, and by adding @StreamListener to a method, you will receive events for stream processing.
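Note that @EnableBinding and @StreamListener are the older annotation model; recent Spring Cloud Stream versions favour the functional model instead. A minimal sketch of the two services under that model, with illustrative names and assuming spring.cloud.stream.bindings.events-out-0.destination and events-in-0.destination both point at the same topic:

    import java.util.function.Consumer;
    import java.util.function.Supplier;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    // In the producing microservice: the binder polls this Supplier and
    // publishes each value to the destination bound to events-out-0.
    @Configuration
    class ProducerConfig {
        @Bean
        public Supplier<String> events() {
            return () -> "some payload";
        }
    }

    // In the consuming microservice: invoked for every message arriving on
    // the destination bound to events-in-0.
    @Configuration
    class ConsumerConfig {
        @Bean
        public Consumer<String> events() {
            return payload -> System.out.println("received: " + payload);
        }
    }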