Is it possible to connect a CometD client with a Kafka producer? Any suggestions?
Currently I have a CometD client in Python which extracts real-time data from a Salesforce object.
Now I want to push that data into Kafka via a producer. Is it possible to do that, and how?
Solved.
By using https://github.com/dkmadigan/python-bayeux-client to extract the events from Salesforce, I was able to push them into the Kafka broker.
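For anyone looking for the Kafka half of this: a minimal sketch of pushing each event into Kafka, assuming the kafka-python package and a broker on localhost; the topic name and the event callback are placeholders, not part of the bayeux client above.

```python
import json

from kafka import KafkaProducer  # pip install kafka-python

# Assumed broker address and topic name -- adjust to your setup.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def on_event(event):
    # Register this as the callback for the Salesforce channel you subscribe to;
    # each received event is forwarded to the Kafka topic.
    producer.send("salesforce-events", value=event)

# Call producer.flush() on shutdown so buffered events reach the broker.
```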
I have a question about connecting Eclipse Ditto, MQTT and Kafka.
Is it possible to receive data in Ditto via an MQTT broker and then send this data to a Kafka topic?
I have created a source connection in Ditto to consume messages from the broker, but now I'm stuck on the Kafka side.
I think I need to create an MQTT target connection to send the data to a different topic, and then create a source and a target connection for Kafka, so that with the first type of connection (source) I can consume messages from the topic and with the target connection I can send the data to another topic.
Summary:
I need to consume data from an MQTT broker with Ditto and then send that data to a Kafka topic so it can be stored in a time-series DB (InfluxDB).
I hope that's all clear; any suggestions would be helpful, thanks!
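For illustration only, a rough sketch of what the Kafka target connection could look like when created through Ditto's connectivity API; the broker address, topic name, authorization subject and event filter below are assumptions and need to be checked against the Ditto documentation for the version in use.

```json
{
  "name": "kafka-target",
  "connectionType": "kafka",
  "connectionStatus": "open",
  "uri": "tcp://my-kafka-broker:9092",
  "specificConfig": {
    "bootstrapServers": "my-kafka-broker:9092"
  },
  "targets": [
    {
      "address": "ditto-events",
      "topics": ["_/_/things/twin/events"],
      "authorizationContext": ["nginx:ditto"]
    }
  ]
}
```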
I am new to Apache Kafka and I'm trying to build a Python app which is able to handle Kafka messages. I've set Kafka up to produce and consume messages locally. Now I also want this to work non-locally, so that I can send messages from everywhere to my Python app.
My idea was to just expose the port that Kafka is using via Localtunnel. I thought this would simply mirror the local messages, so that I could consume them via the generated URL. But surprise, it doesn't work.
I just don't receive any messages at all. Do you have an idea why this is? Do I maybe have to configure the listeners in the Kafka server.properties first?
Thanks!
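For what it's worth, the listener settings the question alludes to look roughly like this in server.properties; the tunnel hostname and port are placeholders, and whether remote clients can actually reach the broker depends on what the tunnel exposes.

```properties
# Where the broker binds locally
listeners=PLAINTEXT://0.0.0.0:9092
# What the broker tells clients to connect to; must be an address
# that remote clients can resolve and reach (placeholder below)
advertised.listeners=PLAINTEXT://your-tunnel-hostname:9092
```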
Kafka Streams is good, but I have to do all the configuration manually. Kafka Connect, on the other hand, provides an API, which is very useful for handling the configuration, as well as tasks, workers, etc.
Thus, I'm thinking of using Kafka Connect for my simple data-transforming service. Basically, the service will read data from one topic and send the transformed data to another topic. To do that, I would have to make a custom sink connector that sends the transformed data to a Kafka topic; however, it seems those interface functions aren't available in SinkConnector. If I could do it, that would be great, since I could manage tasks and workers via the REST API and run the tasks in distributed mode (multiple instances).
There are 2 options in my mind:
Figuring out how to send the message from a SinkConnector to a Kafka topic
Figuring out how to build a REST API like Kafka Connect's which wraps the Kafka Streams app
Any ideas?
Figuring out how to send the message from a SinkConnector to a Kafka topic
A sink connector consumes data/messages from a Kafka topic. If you want to send data to a Kafka topic, you are likely talking about a source connector.
Figuring out how to build a REST API like Kafka Connect's which wraps the Kafka Streams app.
Using the kafka-connect-archtype you can get a template to create your own Kafka connector (source or sink). In your case, where you want to build some stream-processing pipeline after the connector, you are mostly talking about a connector for another stream-processing engine that is not Kafka Streams. There are connectors for Kafka <-> Spark, Kafka <-> Flink, ...
But you can build yours using the kafka-connect-archtype template if you want. Use the MySourceTask List<SourceRecord> poll() method or the MySinkTask put(Collection<SinkRecord> records) method to process the records as a stream. They extend org.apache.kafka.connect.[source.SourceTask|sink.SinkTask] from Kafka Connect.
a REST API like Kafka Connect's which wraps the Kafka Streams app
This is exactly what ksqlDB allows you to do.
Besides creating streams and tables with SQL queries, it offers a REST API and can interact with Connect endpoints (or embed a Connect worker itself).
https://docs.ksqldb.io/en/latest/concepts/connectors/
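As a sketch of the kind of topic-to-topic transform described in the question, expressed as ksqlDB statements; the stream, topic and column names here are made up for illustration.

```sql
-- Expose the input topic as a stream (the schema is an assumption for the example)
CREATE STREAM input_events (id VARCHAR, payload VARCHAR)
  WITH (KAFKA_TOPIC='input-topic', VALUE_FORMAT='JSON');

-- Persistent query: write the transformed records to another topic
CREATE STREAM transformed_events
  WITH (KAFKA_TOPIC='output-topic', VALUE_FORMAT='JSON') AS
  SELECT id, UCASE(payload) AS payload
  FROM input_events
  EMIT CHANGES;
```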
I have a web server which receives some events as POST requests, and I want to process them using Kafka Streams. Which source connector can I use to achieve this?
A source connector reads data from a source (your server, in this case), which would mean Connect making GET requests...
If you want POST requests from Connect (consuming from Kafka, then sending requests), that would be a sink connector, and it has nothing to do with Kafka Streams
relevant: https://github.com/llofberg/kafka-connect-rest
If that doesn't meet your needs, you can write your own Connector
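Whichever connector you end up with, it gets registered through the Connect worker's REST API; a minimal sketch in Python, where the connector class is a placeholder for whatever plugin you install (e.g. the kafka-connect-rest project above), and the topic name is made up.

```python
import json

import requests  # pip install requests

# Register a sink connector with a Kafka Connect worker (REST API, default port 8083).
connector = {
    "name": "http-sink-example",
    "config": {
        "connector.class": "<sink connector class from the chosen plugin>",  # placeholder
        "tasks.max": "1",
        "topics": "events",
    },
}

resp = requests.post(
    "http://localhost:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
print(resp.status_code, resp.json())
```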
I am new to Kafka and I have a requirement to fetch all the real-time data from a DB and pass it to a Spring Boot microservice for processing. In my analysis I found that Apache Kafka with a Kafka source connector can pull all the real-time data from the DB into Kafka topics.
Can someone tell me whether there is any way to pick up this data from the Kafka topics and share it with the microservice by triggering a REST call from the Kafka side?
The idea is that whenever a new entry is added to the database table, Kafka can pull that data via Kafka Connect, and somehow Kafka should call the microservice and share this new entry. Is this possible with Kafka?
Database --> Kafka Connect --> Kafka (topic) --> some service that calls the microservice --> microservice
from the Kafka topics and share it with the microservice
Kafka doesn't push. You would add a consumer in your service to pull from Kafka, perhaps using spring-kafka or spring-cloud-streams
Alternatively, Kafka Connect Sink could be used with an HTTP POST connector, but then you'd need to somehow deal with not committing offsets for messages that have failed requests.
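If that intermediate "some service" in the diagram were a small Python process rather than Spring, the consume-and-forward step could look roughly like this (topic name, group id and microservice URL are made up), using kafka-python and requests.

```python
import json

import requests  # pip install requests
from kafka import KafkaConsumer  # pip install kafka-python

# Pull new records from the topic that Kafka Connect fills and forward each one
# to the microservice over HTTP. Topic, group id and URL are placeholders.
consumer = KafkaConsumer(
    "db-changes",
    bootstrap_servers="localhost:9092",
    group_id="db-change-forwarder",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for record in consumer:
    requests.post("http://localhost:8080/api/entries", json=record.value)
```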