Quarkus + Kafka: intercept incoming messages and read headers

Is there any way to create an interceptor for an INCOMING message from Kafka, using the SmallRye connector in Quarkus? I need this to read the headers and switch tenants before persisting data.
In Spring we can create a consumer interceptor and register it in the app.
For Quarkus I see there is an OUTGOING interceptor, but none for an incoming Kafka message.
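One workaround that should cover this is the plain Kafka client's ConsumerInterceptor: the SmallRye connector forwards unknown channel attributes to the underlying consumer, so the interceptor can be registered through the channel's interceptor.classes property. A minimal sketch, assuming a hypothetical tenant-id header and a hypothetical TenantContext holder:

    import java.nio.charset.StandardCharsets;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerInterceptor;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.common.header.Header;

    public class TenantInterceptor implements ConsumerInterceptor<String, String> {

        @Override
        public ConsumerRecords<String, String> onConsume(ConsumerRecords<String, String> records) {
            for (ConsumerRecord<String, String> record : records) {
                Header tenant = record.headers().lastHeader("tenant-id"); // hypothetical header name
                if (tenant != null) {
                    TenantContext.set(new String(tenant.value(), StandardCharsets.UTF_8));
                }
            }
            return records; // interceptors must return the (possibly modified) records
        }

        @Override
        public void onCommit(Map<TopicPartition, OffsetAndMetadata> offsets) { }

        @Override
        public void configure(Map<String, ?> configs) { }

        @Override
        public void close() { }
    }

    // Hypothetical ThreadLocal-backed holder; replace with your own tenant resolver.
    class TenantContext {
        private static final ThreadLocal<String> TENANT = new ThreadLocal<>();
        static void set(String tenant) { TENANT.set(tenant); }
        static String get() { return TENANT.get(); }
    }

Registration in application.properties (the channel name is hypothetical):

    mp.messaging.incoming.my-channel.interceptor.classes=com.example.TenantInterceptor

Note that the interceptor runs on the consumer's polling thread, so a ThreadLocal set there may not be visible where the message is actually processed; depending on your SmallRye version it may be simpler to read the header inside the @Incoming method from the record's metadata (e.g. IncomingKafkaRecordMetadata) instead.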

Related

Wrapping Kafka's async mechanism with a sync HTTP proxy (Envoy)

Is it possible to set up Envoy to receive an HTTP request, send it to a Kafka topic, wait for a response related to that message from another topic (or topics), and build the HTTP response from the Kafka response?
I found this implementation for bridging sync HTTP to async Kafka, but I think there is a more enterprise-grade solution for such a generic problem:
kafka - http bridge
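Whatever proxy sits in front, the usual building block is a request-reply correlation layer: tag each produced record with a correlation ID, park the HTTP request on a future, and complete it when a reply carrying the same ID shows up on the response topic. A minimal sketch, assuming hypothetical topic names "requests" and "replies":

    import java.nio.charset.StandardCharsets;
    import java.time.Duration;
    import java.util.List;
    import java.util.Map;
    import java.util.UUID;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ConcurrentHashMap;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.header.Header;

    public class KafkaRequestReply {
        private final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();
        private final KafkaProducer<String, String> producer;

        public KafkaRequestReply(KafkaProducer<String, String> producer) {
            this.producer = producer;
        }

        // Called from the HTTP handler: publish the request, return a future for the reply.
        public CompletableFuture<String> send(String body) {
            String correlationId = UUID.randomUUID().toString();
            CompletableFuture<String> reply = new CompletableFuture<>();
            pending.put(correlationId, reply);
            ProducerRecord<String, String> record = new ProducerRecord<>("requests", body);
            record.headers().add("correlation-id", correlationId.getBytes(StandardCharsets.UTF_8));
            producer.send(record);
            return reply; // complete the HTTP exchange when this resolves (add a timeout in practice)
        }

        // Run on a dedicated thread: match replies back to waiting HTTP requests.
        public void pollReplies(KafkaConsumer<String, String> consumer) {
            consumer.subscribe(List.of("replies"));
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(200))) {
                    Header h = record.headers().lastHeader("correlation-id");
                    if (h == null) continue;
                    CompletableFuture<String> f = pending.remove(new String(h.value(), StandardCharsets.UTF_8));
                    if (f != null) f.complete(record.value());
                }
            }
        }
    }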

Is it possible to consume Kafka events from a custom Keycloak SPI

I was exploring whether the create-realm action can be done via an event-driven flow.
I.e., if a service publishes a create-realm event to a Kafka topic,
a custom Keycloak SPI provider should listen to that topic.
Upon receiving the realm-creation event, it needs to create a realm in Keycloak.
Is it possible to listen continuously to a Kafka topic from a Keycloak SPI?
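Keycloak will not poll Kafka for you, but a provider factory can start its own consumer thread (typically from postInit) and open a Keycloak transaction per event. A rough sketch of such a polling loop, with hypothetical topic and connection settings (this is not an official integration):

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.keycloak.models.KeycloakSessionFactory;
    import org.keycloak.models.utils.KeycloakModelUtils;

    public class RealmEventPoller implements Runnable {
        private final KeycloakSessionFactory sessionFactory;

        public RealmEventPoller(KeycloakSessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        // Start this from your ProviderFactory's postInit(...) on a daemon thread.
        @Override
        public void run() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // hypothetical address
            props.put("group.id", "keycloak-realm-creator");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("realm-events")); // hypothetical topic
                while (!Thread.currentThread().isInterrupted()) {
                    for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                        String realmName = record.value(); // assume the payload is just the realm name
                        // Model changes must run inside a Keycloak transaction.
                        KeycloakModelUtils.runJobInTransaction(sessionFactory,
                                session -> session.realms().createRealm(realmName));
                    }
                }
            }
        }
    }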

Regarding a WSO2 ESB Kafka Consumer project

I am creating two WSO2 ESB Integration projects, one for a Kafka message producer and a second for a Kafka message consumer. I created the Kafka message producer successfully: on invocation of a REST API, the message is dropped on the topic. When creating the Kafka message consumer, however, there is no Kafka transport for a proxy to listen on, and as per the documentation we need to use an inbound endpoint instead. My requirement is that Kafka messages should be received at the consumer automatically from a topic (similar to a JMS consumer) whenever a message is available. So how can I achieve that using an inbound endpoint with the Kafka details?
Any inputs?
If you want to consume messages from a Kafka topic using an EI server, you need to use the Kafka inbound endpoint. The inbound endpoint periodically polls messages from the topic, and the polling interval is configurable. Refer to the documentation [1] for more information.
[1] https://docs.wso2.com/display/EI660/Kafka+Inbound+Protocol
I resolved the issue with a separate WSO2 consumer Integration project; the problems were occurring while creating the inbound endpoint with type Kafka.

REST Producer Connector for Kafka Streams

I have a web server which receives some events as POST requests, and I want to process them using Kafka Streams. Which source connector can I use to achieve this?
A source connector reads data from a source (your server, in this case), which would mean making GET requests...
If you want POST requests from Connect (consuming from Kafka and then sending requests), that would be a sink connector, and it has nothing to do with Kafka Streams.
relevant: https://github.com/llofberg/kafka-connect-rest
If that doesn't meet your needs, you can write your own Connector
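For the ingest side, you may not need Connect at all: the web server can produce straight to a topic, and a Kafka Streams app can pick it up from there. A minimal sketch using the JDK's built-in HTTP server (the port and topic name are hypothetical):

    import com.sun.net.httpserver.HttpServer;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class EventIngest {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            KafkaProducer<String, String> producer = new KafkaProducer<>(props);

            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/events", exchange -> {
                // Forward each POST body to the topic the Streams app reads from.
                String body = new String(exchange.getRequestBody().readAllBytes(), StandardCharsets.UTF_8);
                producer.send(new ProducerRecord<>("events", body)); // "events" is a hypothetical topic
                exchange.sendResponseHeaders(202, -1); // 202 Accepted, empty body
                exchange.close();
            });
            server.start();
        }
    }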

How to call a Spring Boot service endpoint from Apache Kafka and pass all the messages from a topic?

I am new to Kafka and I have a requirement to fetch all the real-time data from a DB and pass it to a Spring Boot microservice for processing. In my analysis I found that Apache Kafka with a Kafka source connector can pull all the real-time data from the DB into Kafka topics.
Can someone tell me whether there is any way to pick this data from the Kafka topics and share it with the microservice by triggering a REST call from Kafka?
The idea is that whenever a new entry is added to the database table, Kafka can pull that data via Kafka Connect, and somehow Kafka should call the microservice and share this new entry. Is that possible with Kafka?
Database --> Kafka Connect --> Kafka (topic) --> some service that calls the microservice --> microservice
from Kafka topics and share to the microservice
Kafka doesn't push. You would add a consumer in your service to pull from Kafka, perhaps using spring-kafka or Spring Cloud Stream.
Alternatively, a Kafka Connect sink could be used with an HTTP POST connector, but then you'd need to somehow deal with not committing offsets for messages whose requests failed.
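For the consumer-side option, a minimal spring-kafka sketch (the topic name, group ID, and endpoint URL are hypothetical):

    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;
    import org.springframework.web.client.RestTemplate;

    @Component
    public class DbChangeForwarder {
        private final RestTemplate rest = new RestTemplate();

        // Pulls each record from the topic and forwards it to the microservice.
        @KafkaListener(topics = "db-changes", groupId = "forwarder")
        public void forward(String message) {
            // Throwing here lets the listener container's error handler retry,
            // instead of silently committing the offset for a failed request.
            rest.postForEntity("http://localhost:8081/api/process", message, String.class);
        }
    }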