Difference between Spring Kafka lib and native Kafka Java API

For a Java/Kotlin Spring Boot app that needs to send messages to Kafka or consume messages from Kafka, would you recommend using the Spring Kafka library or just the Kafka Java API?
I'm not quite sure whether Spring provides any real benefits beyond being a wrapper. Spring relies on a lot of annotations, which feels like magic when a runtime error occurs.
I want to hear some opinions.

Full disclosure: I am the project lead for Spring for Apache Kafka.
It's entirely up to you and your colleagues.
It's somewhat comparable to writing assembly code vs. using a high-level language and a compiler.
For an existing Spring shop that is familiar with spring-messaging (JMS, RabbitMQ, etc.), it's a natural fit, and the APIs will be very familiar (POJO listeners, MessageConverters, KafkaTemplate, and so on).
When using the simplest APIs, Spring takes care of the low-level concerns such as committing offsets, transaction synchronization, and error handling.
If you have very basic requirements and/or want to write all that code yourself, then use the native APIs.
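To make the comparison concrete, here is a minimal sketch of the "simplest API" style, assuming Spring Boot auto-configuration and hypothetical topic/group names; the equivalent native-client code would also need its own poll loop, commit handling, and error handling:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
class OrderMessaging {

    private final KafkaTemplate<String, String> template; // auto-configured by Spring Boot

    OrderMessaging(KafkaTemplate<String, String> template) {
        this.template = template;
    }

    // Sending: the template manages the underlying producer's configuration and lifecycle.
    public void send(String payload) {
        template.send("orders", payload);
    }

    // Receiving: Spring runs the poll loop, deserializes the record, commits the
    // offset after the method returns, and routes failures to the error handler.
    @KafkaListener(topics = "orders", groupId = "billing")
    public void onMessage(String payload) {
        // business logic only
    }
}
```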

Related

Difference between client and driver program in Database

Context
I am looking for Postgres JDBC drivers which support reactive programming. I came across https://r2dbc.io/, which is a specification for reactive database APIs (a reactive alternative to JDBC). There are two sections on the site:
one is "Clients" and the other is "Drivers".
The client section starts with
R2DBC encourages libraries to provide a “humane” API in the form of a client library. R2DBC avoids implementing user-space features in each driver, and leaves these for specific clients to implement.
The Postgres implementation of R2DBC, https://github.com/pgjdbc/r2dbc-postgresql, starts with
This implementation is not intended to be used directly, but rather to be used as the backing implementation for a humane client library to delegate to
My Questions
What is the difference between a client and a driver, in general or at least in the above context?
What is the "humane API" being referred to here?
An example of a client with a humane API in Spring is the DatabaseClient introduced in Spring 5.3.
The R2DBC spec defines its APIs in terms of the Reactive Streams spec, whereas DatabaseClient is built on Project Reactor, which provides a much richer API for developers.
Compare the two approaches in the sketch below: with the bare connection factory I have to use Reactor APIs to wrap the R2DBC SPI to make it easier to use, while the database client handles that for me.
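To illustrate the driver/client split, here is a hedged sketch, assuming a Postgres R2DBC driver and hypothetical URL, table, and column names; the first method works at the driver (SPI) level, the second uses the humane DatabaseClient:

```java
import io.r2dbc.spi.Connection;
import io.r2dbc.spi.ConnectionFactories;
import io.r2dbc.spi.ConnectionFactory;
import org.springframework.r2dbc.core.DatabaseClient;
import reactor.core.publisher.Flux;

class DriverVsClient {

    static final ConnectionFactory FACTORY =
            ConnectionFactories.get("r2dbc:postgresql://localhost/mydb"); // placeholder URL

    // Driver level: the raw R2DBC SPI exposes only Reactive Streams types
    // and leaves connection handling entirely to the caller.
    static Flux<String> viaDriver() {
        return Flux.usingWhen(
                FACTORY.create(),                                // open a connection
                conn -> Flux.from(conn.createStatement("SELECT name FROM person").execute())
                        .flatMap(result -> result.map((row, meta) -> row.get("name", String.class))),
                Connection::close);                              // always release it
    }

    // Client level: Spring's DatabaseClient (a "humane" API) manages the
    // connection and row mapping for you.
    static Flux<String> viaClient() {
        return DatabaseClient.create(FACTORY)
                .sql("SELECT name FROM person")
                .map(row -> row.get("name", String.class))
                .all();
    }
}
```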

REST Wrapper around Kafka - Anti-Pattern?

More than a question, this is an architectural dilemma that I am facing.
Is it a good idea to put a REST wrapper around a Kafka producer and integrate with that, instead of integrating with the Kafka producer directly in my code? I could use a generic interface for my higher-level classes, instead of using the Kafka implementation directly, to keep things loosely coupled for the future.
If you have the option, I'd probably go for the pure-Kafka approach, since you'd get better throughput (the clients are very intelligent with respect to batching and futures).
I'm not sure you're decoupling your code by adding a REST wrapper; you're just adding another level of abstraction, increasing the maintenance burden, and covering over some of the benefits of Kafka.
If you really need to use REST, you can make use of Kafka-Rest - no need to reinvent the wheel!
As mentioned, the Kafka REST proxy will work.
I know plenty of people that wrap Kafka producers/consumers with Spring Kafka, Micronaut, Akka, Quarkus, and Lagom/Play, just to name a few. Spring, specifically, has the messaging binders (Spring Cloud Stream) that can provide that "generic interface" feel.
These are all web frameworks, and putting an API / RPC abstraction layer over any code is definitely necessary in 12-factor applications.
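To sketch the "generic interface" idea from the question: the MessagePublisher interface and its Spring Kafka-backed implementation below are hypothetical names, but they show how higher-level classes can stay free of Kafka types without any REST hop in between:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Higher-level code depends only on this interface, never on Kafka types.
public interface MessagePublisher {
    void publish(String topic, String payload);
}

// One possible implementation, backed by Spring Kafka's KafkaTemplate.
@Component
class KafkaMessagePublisher implements MessagePublisher {

    private final KafkaTemplate<String, String> template;

    KafkaMessagePublisher(KafkaTemplate<String, String> template) {
        this.template = template;
    }

    @Override
    public void publish(String topic, String payload) {
        template.send(topic, payload); // asynchronous; returns a future if you need the result
    }
}
```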

Spring WebFlux REST API - Message Driven

I've recently been playing around with Spring WebFlux, and it looks extremely useful and efficient. Also, reading about reactive systems, it seems like one of the defining traits of such systems is that they are message-driven.
I came across this post on the web: https://www.captechconsulting.com/blogs/annotation-driven-reactive-web-apis-with-spring-webflux
This post also mentions,
Spring WebFlux contains support for Reactive HTTP Rest API(s),
WebSocket applications, and Server-Sent Events. Spring WebFlux is
responsive, resilient, scalable, and message-driven.
My question is: if I write a simple REST API, much like the post describes, performing CRUD operations backed by MongoDB and using spring-boot-starter-data-mongodb-reactive, could I call my API service message-driven? I could also potentially add a WebClient to talk to some downstream services.
Does message driven in the context of a REST API even make sense?
No, your application is not message-driven; rather, it is reactive. Reactive applications are event-driven, non-blocking, scalable, resilient, and elastic. They use a publisher/subscriber mechanism, meaning communication between publisher and subscriber is asynchronous. Reactor supports two types of publishers:
Mono: used when we produce at most one item.
Flux: used when we produce multiple items.
To make your application message-driven, you need to use a message broker such as Kafka or RabbitMQ.
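To illustrate the distinction, here is a hedged sketch of the CRUD API from the question (Person and PersonRepository are hypothetical names). Every handler returns a Reactor publisher, so nothing blocks, but each HTTP call is still request/response rather than message-driven:

```java
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

record Person(String id, String name) {}   // hypothetical document type

interface PersonRepository extends ReactiveCrudRepository<Person, String> {}

@RestController
@RequestMapping("/persons")
class PersonController {

    private final PersonRepository repository;

    PersonController(PersonRepository repository) {
        this.repository = repository;
    }

    @GetMapping
    Flux<Person> all() {                            // Flux: a stream of zero..many items
        return repository.findAll();
    }

    @GetMapping("/{id}")
    Mono<Person> byId(@PathVariable String id) {    // Mono: at most one item
        return repository.findById(id);
    }

    @PostMapping
    Mono<Person> create(@RequestBody Person person) {
        return repository.save(person);
    }
}
```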

How to write Kafka RestProxy Server/Client for production use

I need to develop a REST API which can read published messages from a Kafka cluster into a data warehouse application.
Materials available on the internet say to use POST/GET commands, but I don't think this is for production use; rather, it's useful for testing purposes.
How do I implement it in Scala/Java?
Materials available on the internet say to use POST/GET commands, but I don't think this is for production use
Please link to where you read this... All production web services operate over (more than) these two HTTP methods, hundreds of thousands of times a day...
If you really want to use Kafka for its throughput, though, you wouldn't "hide it" behind a REST interface. You would distribute SSL certificates plus usernames and passwords to remote clients, for example.
Need to develop a REST API which can read published messages from a Kafka cluster
REST is not meant to keep an open connection, primarily because it is stateless (the server shouldn't track where you are reading from in Kafka)... It would make more sense to expose a WebSocket fed by a Kafka consumer, which is different from a REST API.
How do I implement it in Scala/Java?
The Confluent REST Proxy is already written in Java, and it is open source (and used in production at several companies, I believe). If you need inspiration, start there. Otherwise, you can find Kafka integration examples for Spring and Vert.x, for example, in their respective documentation, but you'll be re-implementing a lot of the existing functionality.
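If you do end up consuming directly instead of going through a proxy, the plain Java client is straightforward. A minimal sketch, with placeholder broker address, topic, and group id, and the warehouse write left abstract:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class WarehouseLoader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "warehouse-loader");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // try-with-resources closes the consumer and leaves the group cleanly
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // write record.value() into the warehouse here
                }
            }
        }
    }
}
```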

Difference Between Apache Kafka and Camel (Broker vs Integration)

I am trying to understand the differences between something like Kafka and something like Camel. To my understanding, Camel provides much more abstraction for developers, without having to worry about changing protocols/systems to some extent. How would Kafka not be able to handle most of what Camel can do now? I am reading through the documentation, and it seems like Kafka has been updated enough to break away slightly from being only a message broker. I guess my question really comes down to how Kafka compares to Camel with regard to future-proofing systems, and where Kafka falls short of Camel. I am under the impression that Kafka doesn't scale as well as a system grows.
Edit: This is strictly about messaging. The documentation surrounding Camel makes it very clear that it's based around Enterprise Integration Patterns, but the deeper I dive into the Kafka documentation, the more it seems the same patterns can be implemented there too. Am I missing something?
Apache Kafka: a distributed streaming platform, based on a massively scalable publish/subscribe message queue architecture. Many other platforms are built on publish/subscribe messaging models (JMS brokers, for example) and can do much the same, with some exceptions; some of the most popular are Apache ActiveMQ and RabbitMQ.
Apache Camel: a message-oriented integration middleware that implements almost all of the Enterprise Integration Patterns.
You can use Apache Camel together with Apache Kafka, or you can use Apache Kafka without Apache Camel.
They are two totally different things.
Think of Camel as an interface-definition tool where you can define endpoints or channels that messages flow through; but those are abstract. Compare Camel with Spring Integration, for instance.
Kafka can provide those messages, so it can implement those abstract channels or endpoints. But so can ActiveMQ and others.
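For example, a hedged sketch of a Camel route (broker address, topic, and output directory are placeholders) in which Kafka is just one interchangeable endpoint:

```java
import org.apache.camel.builder.RouteBuilder;

public class KafkaToFileRoute extends RouteBuilder {
    @Override
    public void configure() {
        // consume from a Kafka topic and write each message body to a directory
        from("kafka:events?brokers=broker:9092&groupId=camel-demo")
            .log("Received: ${body}")
            .to("file:/tmp/out");
    }
}
```

Swapping `kafka:` for `activemq:` or `jms:` changes the transport without touching the routing logic, which is exactly the abstraction Camel adds on top of a broker.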
Kafka is a message broker. It is comparable with other message brokers like ActiveMQ, RabbitMQ, Azure Service Bus etc. Camel is an integration middleware. It is more comparable to Apache ServiceMix.
Looking at the theory of event-driven architecture (https://www.oreilly.com/library/view/software-architecture-patterns/9781491971437/ch02.html), we can distinguish two kinds of event-driven topologies, depending on whether or not we need an event mediator.
Message broker: in this category we find Kafka, as it doesn't rely on a message mediator. Of course, as written in previous answers, we could use Kafka together with a mediator, depending on our needs.
Message mediator: in this category we find products like Camel. You may see it as a message controller.