Is it possible to consume events from Kafka in a custom Keycloak SPI?

I was exploring whether the create-realm action can be driven by an event-based flow:
a service publishes a create-realm event to a Kafka topic,
a custom Keycloak SPI provider listens to that topic,
and upon receiving the realm-creation event it creates the realm in Keycloak.
Is it possible to listen continuously to a Kafka topic from a Keycloak SPI?
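In principle, yes: a custom provider factory can start a long-running consumer thread when Keycloak boots (e.g. from its postInit()) and create realms through a fresh KeycloakSession per message. A minimal sketch, assuming kafka-clients is on the provider classpath; the broker address, group id, topic name realm-commands, and the plain-string payload (the realm name) are all made-up illustrations, not a confirmed setup:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.keycloak.models.KeycloakSession;
import org.keycloak.models.KeycloakSessionFactory;

// Started once per server from a custom ProviderFactory's postInit().
public class RealmCommandListener implements Runnable {

    private final KeycloakSessionFactory sessionFactory;
    private volatile boolean running = true;

    public RealmCommandListener(KeycloakSessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    @Override
    public void run() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");           // assumption
        props.put("group.id", "keycloak-realm-commands");       // assumption
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("realm-commands"));      // hypothetical topic
            while (running) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofSeconds(1))) {
                    createRealm(rec.value());
                }
            }
        }
    }

    // Each unit of work needs its own KeycloakSession with a transaction.
    private void createRealm(String realmName) {
        KeycloakSession session = sessionFactory.create();
        try {
            session.getTransactionManager().begin();
            session.realms().createRealm(realmName);
            session.getTransactionManager().commit();
        } catch (RuntimeException e) {
            session.getTransactionManager().rollback();
        } finally {
            session.close();
        }
    }

    public void stop() { running = false; }
}
```

The thread must be stopped cleanly when the factory's close() is called, and failures need their own retry/dead-letter story since Keycloak won't manage that for you.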

Related

Quarkus + Kafka Intercept incoming messages and read headers

Is there any way to create an interceptor for an incoming message from Kafka, using the SmallRye connector in Quarkus? I need this to read the headers and switch tenants before persisting data.
In Spring we can create a consumer interceptor and register it in the app.
For Quarkus I see there is an outgoing interceptor, but none for an incoming Kafka message.
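One workaround, short of a true incoming interceptor, is to read the Kafka headers per message through the incoming metadata API. A sketch, assuming the quarkus-smallrye-reactive-messaging-kafka extension; the channel name orders and the header name tenant-id are made up for illustration:

```java
import java.nio.charset.StandardCharsets;
import java.util.concurrent.CompletionStage;
import org.apache.kafka.common.header.Header;
import org.eclipse.microprofile.reactive.messaging.Incoming;
import org.eclipse.microprofile.reactive.messaging.Message;
import io.smallrye.reactive.messaging.kafka.api.IncomingKafkaRecordMetadata;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class OrderConsumer {

    @Incoming("orders") // hypothetical channel name
    public CompletionStage<Void> consume(Message<String> msg) {
        // The Kafka-specific metadata carries the record headers.
        msg.getMetadata(IncomingKafkaRecordMetadata.class).ifPresent(meta -> {
            Header tenant = meta.getHeaders().lastHeader("tenant-id"); // hypothetical header
            if (tenant != null) {
                switchTenant(new String(tenant.value(), StandardCharsets.UTF_8));
            }
        });
        return msg.ack();
    }

    private void switchTenant(String tenantId) {
        // e.g. set the tenant on a request-scoped context before persisting
    }
}
```

This puts the header handling inside each consumer method rather than in one central interceptor, so it is more boilerplate than the Spring approach but achieves the same tenant switch.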

Spring Cloud Data Flow Kafka Source

I am new to Spring Cloud Data Flow and need to listen for messages on a topic from an external Kafka cluster. This external Kafka topic in Confluent Cloud would be my Source, which I need to pass on to my Sink application.
I am also using Kafka as my underlying message broker; this is a separate Kafka instance deployed on Kubernetes. I'm just not sure what the best approach is to connect to this external Kafka instance. Is there an existing Kafka Source app that I can use, or do I need to create my own Source application to connect to it? Or is it just some configuration that I need to set up to get connected?
Any examples would be helpful. Thanks in advance!
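One common pattern for bridging two Kafka clusters in a Spring Cloud Stream app is multi-binder configuration: one binder for the external cluster and one for the local broker. A sketch, assuming the Kafka binder; the binder names, binding names, and broker addresses are placeholders:

```properties
# Two named binders: "external" for the Confluent Cloud cluster,
# "local" for the Kafka instance on Kubernetes (addresses are placeholders)
spring.cloud.stream.binders.external.type=kafka
spring.cloud.stream.binders.external.environment.spring.cloud.stream.kafka.binder.brokers=<confluent-bootstrap>:9092
spring.cloud.stream.binders.local.type=kafka
spring.cloud.stream.binders.local.environment.spring.cloud.stream.kafka.binder.brokers=<k8s-kafka>:9092

# Consume from the external cluster, produce to the local one
spring.cloud.stream.bindings.input.binder=external
spring.cloud.stream.bindings.output.binder=local
```

The external cluster would typically also need SASL/SSL credentials supplied under the same `environment.` prefix.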

Build a data transformation service using Kafka Connect

Kafka Streams is good, but I have to do every configuration manually. Kafka Connect, by contrast, provides an API that is very useful for handling the configuration, as well as tasks, workers, etc.
Thus, I'm thinking of using Kafka Connect for my simple data transformation service. Basically, the service will read data from a topic and send the transformed data to another topic. To do that, I would have to make a custom sink connector send the transformed data to a Kafka topic; however, those interface functions don't seem to be available in SinkConnector. If I could do it, that would be great, since I could manage tasks and workers via the REST API and run the tasks in distributed mode (multiple instances).
There are two options in my mind:
Figuring out how to send the message from SinkConnector to a kafka topic
Figuring out how to build a REST interface API like Kafka Connect which wraps up the Kafka Streams app
Any ideas?
Figuring out how to send the message from SinkConnector to a kafka topic
A sink connector consumes data/messages from a Kafka topic. If you want to send data to a Kafka topic, you are likely talking about a source connector.
Figuring out how to build a REST interface API like Kafka Connect which wraps up the Kafka Streams app.
Using the kafka-connect-archtype you can get a template to create your own Kafka connector (source or sink). In your case, where you want to run some stream processing after the connector, you are mostly talking about a connector for another stream processing engine that is not Kafka Streams. There are connectors for Kafka <-> Spark, Kafka <-> Flink, ...
But you can build your own using the kafka-connect-archtype template if you want. Use the MySourceTask List<SourceRecord> poll() method or the MySinkTask put(Collection<SinkRecord> records) method to process the records as a stream. They extend org.apache.kafka.connect.source.SourceTask / org.apache.kafka.connect.sink.SinkTask from Kafka Connect.
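As a sketch of that route: a sink task's put() receives batches of records, and since the sink API won't write back to Kafka for you, the task would have to manage its own producer for the output topic. The class name, config keys, and the toUpperCase() transformation below are illustrative only:

```java
import java.util.Collection;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

// Hypothetical sink task that transforms each record and forwards the
// result to an output topic with its own embedded producer.
public class TransformSinkTask extends SinkTask {

    private KafkaProducer<String, String> producer;
    private String outputTopic;

    @Override
    public void start(Map<String, String> config) {
        outputTopic = config.get("output.topic"); // hypothetical config key
        Properties props = new Properties();
        props.put("bootstrap.servers", config.get("output.bootstrap.servers"));
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public void put(Collection<SinkRecord> records) {
        for (SinkRecord rec : records) {
            String transformed = transform(String.valueOf(rec.value()));
            producer.send(new ProducerRecord<>(outputTopic, transformed));
        }
    }

    private String transform(String value) {
        return value.toUpperCase(); // placeholder transformation
    }

    @Override
    public void stop() {
        if (producer != null) producer.close();
    }

    @Override
    public String version() {
        return "0.1";
    }
}
```

Note the trade-off: Connect commits the sink's consumer offsets for you, but it knows nothing about this embedded producer, so delivery of the transformed records is at-most-once unless you add your own flush handling.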
a REST interface API like Kafka Connect which wraps up the Kafka Streams app
This is exactly what ksqlDB allows you to do.
Beyond creating streams and tables with SQL queries, it offers a REST API and can interact with Connect endpoints (or embed a Connect worker itself).
https://docs.ksqldb.io/en/latest/concepts/connectors/

How to call a Spring Boot service endpoint from Apache Kafka and pass all the messages from a topic?

I am new to Kafka and I have a requirement to fetch all the real-time data from a DB and pass it to a Spring Boot microservice for processing. In my analysis I found that Apache Kafka with a Kafka source connector can pull all the real-time data from the DB into Kafka topics.
Can someone tell me whether there is a way to pick this data up from Kafka topics and share it with the microservice by triggering a REST call from Kafka?
The idea is that whenever a new entry is added to the database table, Kafka can pull that data via Kafka Connect, and somehow Kafka should call the microservice and share this new entry. Is that possible with Kafka?
Database --> kafka connect --> kafka (Topic) ---> some service that call microservice ---> microservice
from kafka topics and share to microservice
Kafka doesn't push. You would add a consumer in your service to pull from Kafka, perhaps using spring-kafka or spring-cloud-stream.
Alternatively, a Kafka Connect sink could be used with an HTTP POST connector, but then you'd need to somehow deal with not committing offsets for messages whose requests have failed.
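A minimal pull-based consumer with spring-kafka might look like the following; the topic name and group id are placeholders, and the listener simply hands each record to the service's own logic:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class DbChangeConsumer {

    // spring-kafka polls the topic and delivers each record here;
    // topic and group id are hypothetical
    @KafkaListener(topics = "db-changes", groupId = "my-microservice")
    public void onMessage(String payload) {
        process(payload); // the microservice's own business logic
    }

    private void process(String payload) {
        // handle the new database entry here
    }
}
```

With this in place there is no extra "service that calls the microservice" hop: the microservice itself consumes the topic directly.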

How can I integrate Keycloak with Kafka?

I have configured a 3-node Kafka cluster. Now we want to set up security for Kafka with Keycloak. Please let me know the ways to do this.
Question 1: How to implement security for Kafka broker to Kafka broker communication with Keycloak?
Question 2: How to implement security for Kafka client to Kafka broker communication with Keycloak?
Note: We already have Keycloak set up.
You can configure Kafka to use SASL/OAUTHBEARER, which has been implemented since Kafka 2.0; you can find more information on how to configure it in the Kafka documentation.
You need to implement org.apache.kafka.common.security.auth.AuthenticateCallbackHandler to obtain a token from Keycloak (on the login side) and to validate tokens against Keycloak (on the broker side).
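As a sketch, the broker and client SASL settings for OAUTHBEARER might look like the following; the com.example callback handler classes are placeholders for your own AuthenticateCallbackHandler implementations (or ones from a library such as Strimzi's kafka-oauth), and ports/listener names are illustrative:

```properties
# Broker side: inter-broker and client traffic over SASL_SSL with OAUTHBEARER
listeners=SASL_SSL://:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=OAUTHBEARER
sasl.mechanism.inter.broker.protocol=OAUTHBEARER
listener.name.sasl_ssl.oauthbearer.sasl.jaas.config=\
  org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;
# Placeholder handler classes: fetch a token from Keycloak / validate tokens
listener.name.sasl_ssl.oauthbearer.sasl.login.callback.handler.class=com.example.KeycloakLoginCallbackHandler
listener.name.sasl_ssl.oauthbearer.sasl.server.callback.handler.class=com.example.KeycloakValidatorCallbackHandler

# Client side (producer/consumer properties)
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;
sasl.login.callback.handler.class=com.example.KeycloakLoginCallbackHandler
```

The login handler answers with a token obtained from Keycloak's token endpoint; the server handler validates the bearer token presented by clients and brokers.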