Regarding WSO2 ESB Kafka Consumer Project - apache-kafka

I am creating two WSO2 ESB integration projects: one for a Kafka message producer and a second for a Kafka message consumer. I created the Kafka message producer successfully; on invocation of a REST API, the message is dropped on the topic. When creating the Kafka message consumer, however, there is no Kafka transport for a proxy service to listen on, and as per the documentation we need to use an inbound endpoint instead. My requirement is that a Kafka message should be received at the consumer automatically from a topic (similar to a JMS consumer) whenever a message is available on the topic. How can I achieve that using an inbound endpoint with the Kafka details?
Any inputs?

If you want to consume messages from a Kafka topic using an EI server, you need to use the Kafka inbound endpoint. This inbound endpoint periodically polls messages from the topic, and the polling interval is configurable. Refer to the documentation [1] for more information.
[1] https://docs.wso2.com/display/EI660/Kafka+Inbound+Protocol
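For reference, a minimal inbound endpoint configuration is sketched below, roughly following the sample in the referenced documentation. The sequence names, connection values, topic, and group ID are placeholders, and the exact parameter set varies by EI version, so check the doc page against your installation:

```xml
<inboundEndpoint xmlns="http://ws.apache.org/ns/synapse"
                 name="KafkaListenerEP"
                 protocol="kafka"
                 sequence="requestHandlerSeq"
                 onError="faultSeq"
                 suspend="false">
    <parameters>
        <!-- polling interval in milliseconds; this is the configurable knob -->
        <parameter name="interval">100</parameter>
        <parameter name="sequential">true</parameter>
        <parameter name="coordination">true</parameter>
        <!-- placeholder connection, topic, and consumer group values -->
        <parameter name="zookeeper.connect">localhost:2181</parameter>
        <parameter name="consumer.type">highlevel</parameter>
        <parameter name="topics">test-topic</parameter>
        <parameter name="group.id">test-group</parameter>
        <parameter name="content.type">application/json</parameter>
    </parameters>
</inboundEndpoint>
```

Each polled message is handed to the sequence named in `sequence`, which gives you the JMS-consumer-like behavior you asked about.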

I resolved the issue by creating a separate WSO2 consumer integration project; the errors were occurring while creating an inbound endpoint of type Kafka.

Related

Kafka Connector to read from a topic and write to a topic

I want to build a Kafka connector that reads from a Kafka topic, makes a call to a gRPC service to get some data, and writes the whole result to another Kafka topic.
I have written a Kafka sink connector that reads from a topic and calls a gRPC service, but I am not sure how to redirect this data into another Kafka topic.
Kafka Streams can read from topics, call external services as necessary, then forward the result to a new topic in the same cluster.
MirrorMaker2 can be used between different clusters, but using Connect transforms with external services is generally not recommended.
Or you could make your gRPC service itself act as a Kafka producer.
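For illustration, a minimal Kafka Streams topology along those lines might look like the sketch below. The topic names and the gRPC call are assumptions, not taken from the question:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import java.util.Properties;

public class GrpcEnrichmentTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "grpc-enrichment");   // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("input-topic"); // hypothetical topic
        source.mapValues(GrpcEnrichmentTopology::enrichViaGrpc)         // one external call per record
              .to("output-topic");                                      // hypothetical topic

        new KafkaStreams(builder.build(), props).start();
    }

    // Stub standing in for the real gRPC client call
    private static String enrichViaGrpc(String value) {
        return value; // TODO: call the gRPC service and merge its response into the value
    }
}
```

Note that a blocking call inside `mapValues` stalls the stream thread for that record, so keep the gRPC timeouts tight.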

Can Kafka publish messages to AWS Lambda?

I have to publish messages from a Kafka topic to Lambda, process them, and store them in a database using a Spring Boot application. I did some research and found something to consume messages from Kafka:
public Function<KStream<String, String>, KStream<String, String>> process(){}
However, I'm not sure if this is only used to publish the consumed messages to another Kafka topic, or whether it can be used as an event source for Lambda. I need some guidance on consuming and converting the consumed Kafka message into an event source.
Brokers do not push. Consumers always poll.
The code shown is for the Kafka Streams API, which primarily writes to new Kafka topics. While you could fire HTTP events from there to start a Lambda, that's not recommended.
Alternatively, Kafka is already supported as a Lambda event source, so you don't need to write any consumer code:
https://aws.amazon.com/about-aws/whats-new/2020/12/aws-lambda-now-supports-self-managed-apache-kafka-as-an-event-source/
This is possible from MSK or a self-managed Kafka cluster.
"process them and store them in a database"
Your Lambda could process the data and send it to a new Kafka topic using a producer. You could then use MSK Connect, or run your own Kafka Connect cluster elsewhere, to dump records into a database. No Spring/Java code would be necessary.
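To illustrate the event-source route: once the mapping is configured, Lambda invokes your handler with batches of records, so the function body is all you write. A sketch assuming the aws-lambda-java-events library (the class name is hypothetical):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KafkaEvent;
import java.util.Base64;

public class KafkaToDatabaseHandler implements RequestHandler<KafkaEvent, Void> {
    @Override
    public Void handleRequest(KafkaEvent event, Context context) {
        // The event source mapping groups records by topic-partition
        event.getRecords().forEach((topicPartition, records) ->
                records.forEach(record -> {
                    // Record values arrive base64-encoded in the event payload
                    String value = new String(Base64.getDecoder().decode(record.getValue()));
                    // ... process the message and store it in the database here
                }));
        return null;
    }
}
```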

Configure a Kafka consumer with EJB? Is it possible to configure an MDB for Kafka with EJB?

I want to replace an existing JMS message consumer in EJB with an Apache Kafka message consumer, but I am not able to figure out how to configure an Apache Kafka consumer with an EJB configuration.
Kafka is not a straight messaging solution (comparable to, say, RabbitMQ), but it can be used as one.
You will need to translate your JMS concepts (topics, queues) into Kafka topics (which are closer to JMS topics).
Also, given that consumers have a configurable/storable start offset, you will need to define these policies for your consumers.
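There is no standard JCA resource adapter for Kafka bundled with EE servers, so an MDB won't drive it out of the box; a common workaround is a startup singleton that runs a plain consumer poll loop on a managed thread. A rough sketch assuming the standard Kafka client, with broker, group, and topic as placeholders:

```java
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.annotation.Resource;
import javax.ejb.Singleton;
import javax.ejb.Startup;
import javax.enterprise.concurrent.ManagedExecutorService;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import java.util.concurrent.atomic.AtomicBoolean;

@Singleton
@Startup
public class KafkaConsumerBean {

    @Resource
    private ManagedExecutorService executor; // container-managed thread

    private final AtomicBoolean running = new AtomicBoolean(true);

    @PostConstruct
    void start() {
        executor.submit(() -> {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
            props.put("group.id", "my-ejb-app");              // placeholder group
            props.put("auto.offset.reset", "earliest");       // the start-offset policy mentioned above
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
                while (running.get()) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    records.forEach(r -> onMessage(r.value()));
                }
            }
        });
    }

    @PreDestroy
    void stop() {
        running.set(false); // the loop exits on the next poll timeout
    }

    private void onMessage(String value) {
        // the business logic that used to live in the MDB's onMessage
    }
}
```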

How to connect an MQTT broker to a Knative Kafka source

Basically, I want to send messages from an MQTT (Mosquitto) broker to a Knative event source (Kafka). With a plain Kafka broker I could use Confluent's Kafka Connect, but in this case it's a Knative event source rather than a broker. The problem lies with the conversion to CloudEvents.
Since you have a current MQTT broker which can read/write to Kafka, you might use the Kafka source to convert the Kafka messages to CloudEvents and send them to your service.
If you're asking about how to connect the MQTT broker with Kafka, I'd suggest either finding or writing an MQTT source, or using something outside the Knative ecosystem.
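If you go the route outside Knative, one option is a small bridge process that forwards MQTT messages into the Kafka topic your KafkaSource watches. A sketch assuming the Eclipse Paho MQTT client; broker URLs, topic filter, and topic name are placeholders:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.eclipse.paho.client.mqttv3.MqttClient;
import java.util.Properties;

public class MqttToKafkaBridge {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder Kafka broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
        KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props);

        MqttClient mqtt = new MqttClient("tcp://localhost:1883", "kafka-bridge"); // placeholder broker/client id
        mqtt.connect();
        // Forward every message matching the filter into the topic the KafkaSource consumes;
        // the KafkaSource then wraps each record as a CloudEvent for your service
        mqtt.subscribe("sensors/#", (topic, message) ->
                producer.send(new ProducerRecord<>("knative-input", topic, message.getPayload())));
    }
}
```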

How to call a Spring Boot service endpoint from Apache Kafka and pass all the messages from a topic?

I am new to Kafka and I have a requirement to fetch all the real-time data from a DB and pass it to a Spring Boot microservice for processing. In my analysis, I found that Apache Kafka with a Kafka source connector can pull all the real-time data from the DB into Kafka topics.
Can someone tell me whether there is any way to pick this data from Kafka topics and share it with the microservice by triggering a REST call from the Kafka service?
The idea is that whenever a new entry is added to the database table, Kafka can pull that data via Kafka Connect, and somehow Kafka should call the microservice and share this new entry. Is that possible with Kafka?
Database --> Kafka Connect --> Kafka (topic) --> some service that calls the microservice --> microservice
"pick this data from Kafka topics and share it with the microservice"
Kafka doesn't push. You would add a consumer in your service to pull from Kafka, perhaps using spring-kafka or Spring Cloud Stream.
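A minimal spring-kafka consumer along those lines might look like the sketch below; the topic and group are placeholders, and `spring.kafka.bootstrap-servers` in application.properties would point at your broker:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class ConsumerApplication {
    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }
}

@Component
class DbChangeListener {
    // Invoked for every record Kafka Connect writes to the topic; no REST call needed
    @KafkaListener(topics = "db-changes", groupId = "my-service") // placeholder topic and group
    public void onMessage(String value) {
        // process the new database row here
    }
}
```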
Alternatively, a Kafka Connect sink with an HTTP POST connector could be used, but then you'd need to somehow deal with not committing offsets for messages whose requests have failed.