Spring Cloud Bus/Stream Issues in Spring Cloud 2020.0.0 - spring-cloud

We have a Spring Boot microservice that, as well as exposing HTTP endpoints, uses Spring Cloud Bus to pick up refresh events (from RabbitMQ) and also has a Spring Cloud Stream sink that picks up custom messages from another RabbitMQ topic.
After updating to Spring Boot 2.4.1 and Spring Cloud 2020.0.0 everything seemed to be working until we discovered Spring Cloud Bus was no longer picking up events.
Looking into this, it turned out that some of the Spring Cloud Bus internal channels were not getting created.
This wasn't happening in another service that lacked the stream functionality, so we tried disabling the stream sink, and the bus functionality then started working.
So there was evidently some interference between the old-style stream model and the newer Spring Cloud Bus.
After updating our sink to use the new functional model, I still had issues, and eventually got both to work by including the following lines in our application.yml:
spring:
  cloud:
    stream:
      bindings.mySink-in-0.destination: mytopic
      function.definition: busConsumer;mySink
So I have the following questions:
Did I miss something or should there be better documentation on how stream / bus can affect each other and the migration to 2020.0.0?
Does my current configuration look correct?
It doesn't seem right to have to include busConsumer here - should the auto configuration for it not be able to 'combine it in' with any other stream config?
What's the difference between spring.cloud.stream.function.definition and spring.cloud.function.definition? I've seen both in documentation and Spring Cloud Bus seems to be also setting spring.cloud.function.definition=busConsumer

In org.springframework.cloud.stream.function.FunctionConfiguration, it searches for @EnableBinding:
if (ObjectUtils.isEmpty(applicationContext.getBeanNamesForAnnotation(EnableBinding.class)))
If the annotation is found, functional binding is disabled, as this log line shows:
logger.info("Functional binding is disabled due to the presense of @EnableBinding annotation in your configuration");
After the upgrade, we needed to convert our listener classes to the functional interface model in order to activate functional binding. After that, the cloud bus consumer binding was created too.
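In the functional model described above, a sink is just a java.util.function.Consumer bean whose name matches the binding (here "mySink", to match the mySink-in-0 binding in application.yml). A minimal sketch, with the Spring wiring noted in comments and the message delivery simulated directly; the class name and payload are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class MySinkExample {

    // Collects payloads so the handler's effect is observable in this sketch.
    static final List<String> received = new ArrayList<>();

    // In a real application this would be a @Bean method inside a
    // @Configuration class; the bean name ("mySink") must match the
    // binding name mySink-in-0 configured in application.yml.
    public static Consumer<String> mySink() {
        return message -> received.add(message);
    }

    public static void main(String[] args) {
        // Simulate the binder delivering a message from "mytopic".
        mySink().accept("hello");
        System.out.println(received); // prints [hello]
    }
}
```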

Related

How to use Spring Cloud Sleuth for tracing Kafka streams

How do I trace Kafka-based events using Spring Cloud Sleuth? All the examples I have seen are for REST APIs; I am looking for the kafka-clients library.
Also, is it a good idea to use Spring Cloud Sleuth for this, or should I pass my trace IDs via headers manually?
The Spring Cloud Sleuth documentation says that integration is provided with Kafka Streams (Sleuth internally uses the Brave library for instrumentation). The property through which this can be enabled/disabled is spring.sleuth.messaging.kafka.streams.enabled (true/false):
We instrument the KafkaStreams KafkaClientSupplier so that tracing headers get injected into the Producer and Consumer's. A KafkaStreamsTracing bean allows for further instrumentation through additional TransformerSupplier and ProcessorSupplier methods.
For an example configuration/code you can take a look here.
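The toggle mentioned above can be set in application.yml; a minimal sketch (the property is on by default, so this is only needed to make the choice explicit or to disable it):

```yaml
spring:
  sleuth:
    messaging:
      kafka:
        streams:
          enabled: true   # set to false to switch off Kafka Streams instrumentation
```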

What's the difference between Spring Cloud Bus and Spring for Apache Kafka?

Using Spring for Apache Kafka or Spring AMQP, I can achieve message pub/sub. Spring Cloud Bus uses Kafka/RabbitMQ to do approximately the same thing. What's the difference between them?
Spring Cloud Bus is an abstraction built on top of Spring Cloud Stream (and hence Kafka and RabbitMQ). It is not general purpose; it is built for sending administrative commands to multiple nodes of a service at once, for example sending a refresh (from Spring Cloud Commons) to all nodes of the user service. There is only one channel, whereas in Spring Cloud Stream there are many. Think of it as a distributed Spring Boot Actuator.
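As a concrete illustration of the refresh command mentioned above, the bus exposes an actuator endpoint that has to be opted into. A minimal sketch, assuming recent Spring Cloud versions where the endpoint is named busrefresh (earlier releases used bus-refresh):

```yaml
management:
  endpoints:
    web:
      exposure:
        include: busrefresh
```

A POST to /actuator/busrefresh on any one node then publishes a RefreshRemoteApplicationEvent over the broker, causing every node listening on the bus to refresh its configuration.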

Kafka and Microservices using Producer Consumer

I need to use Apache Kafka in my microservices project. I need one microservice to produce data and another to consume the same data. How can I make Kafka do this between the two services?
I would recommend taking a look at Spring Cloud Stream, as it does exactly what you need.
From docs:
Framework for building message-driven microservices. Spring Cloud Stream builds upon Spring Boot to create DevOps friendly microservice applications and Spring Integration to provide connectivity to message brokers. Spring Cloud Stream provides an opinionated configuration of message brokers, introducing the concepts of persistent pub/sub semantics, consumer groups and partitions across several middleware vendors. This opinionated configuration provides the basis to create stream processing applications.
By adding @EnableBinding to your main application, you get immediate connectivity to a message broker, and by adding @StreamListener to a method, you will receive events for stream processing.
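A minimal sketch of the annotation model just described, using the built-in Sink contract; note that @EnableBinding and @StreamListener were deprecated in Spring Cloud Stream 3.1 and later removed in favor of the functional model, and the class name here is illustrative:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

// Binds the Sink.INPUT channel to the broker destination configured under
// spring.cloud.stream.bindings.input.destination in application.yml.
@SpringBootApplication
@EnableBinding(Sink.class)
public class ConsumerApplication {

    // Invoked for each message arriving on the bound input channel.
    @StreamListener(Sink.INPUT)
    public void handle(String payload) {
        System.out.println("Received: " + payload);
    }

    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }
}
```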

Spring Cloud Dataflow on different message broker than Redis?

From what I can see in the documentation, a Redis instance is always needed for Spring Cloud Data Flow to work.
Is it also possible to work with a different message broker, e.g. RabbitMQ?
How would one specify a different message broker during startup?
With the recent 1.0.0.M3 release, when using the Local server, we load Kafka-based OOTB applications by default.
If you'd like to switch from Kafka to RabbitMQ, you can pass --binder=rabbit as a command-line argument when starting the spring-cloud-dataflow-server-local JAR (and be sure to start the RabbitMQ server).
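Put together, the startup described above would look roughly like this (the exact JAR filename for that milestone is an assumption):

```shell
# Start the local Data Flow server against RabbitMQ instead of the
# default Kafka binder (RabbitMQ must already be running):
java -jar spring-cloud-dataflow-server-local-1.0.0.M3.jar --binder=rabbit
```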

Spring and Redis queue listener with annotation

Is there a way in Spring to configure Redis queue listeners using annotations?
I would like something like the annotation-based SQS queue listener from Spring Cloud for AWS, but using Redis as a queue.
Looking at the documentation, I can't find anything that fits my case.
Is this feature already implemented in Spring, or do I need to implement it on my own?
Spring Cloud Stream has support for Redis.