Using Kafka and Kafka REST Proxy server

I have two ends of an application:
A Python Flask backend, which communicates with Kafka in the normal way (a native client).
A machine agent written in Go, installed in a client environment, which communicates with Kafka only via the Kafka REST Proxy.
Now the question is: can these two ends communicate? For example, can my machine agent consume messages from Kafka via the REST proxy when those messages were produced from the other end in the normal way? Or do both ends need to use the Kafka REST Proxy?

Yes: as long as data arrives in Kafka, it doesn't matter which protocol hops you use to get it there or read it back. Records produced by a native client are consumable through the REST proxy, and vice versa. I'd recommend using sarama or confluent-kafka-go instead of HTTP, though.
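To make the mixed setup concrete, here is a minimal sketch of the consume flow against the Confluent REST Proxy v2 API. It is shown in Java for brevity (the same three HTTP calls apply from Go); the proxy address, group, instance, and topic names are placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: consuming over HTTP via the Confluent REST Proxy v2 API.
// Proxy address, consumer group, instance, and topic are hypothetical.
public class RestProxyConsumer {
    private static final String PROXY = "http://rest-proxy:8082"; // assumed address
    private static final String V2_JSON = "application/vnd.kafka.v2+json";

    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();

        // 1. Create a consumer instance in group "agent-group"
        HttpRequest create = HttpRequest.newBuilder()
                .uri(URI.create(PROXY + "/consumers/agent-group"))
                .header("Content-Type", V2_JSON)
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"name\":\"agent-1\",\"format\":\"json\",\"auto.offset.reset\":\"earliest\"}"))
                .build();
        http.send(create, HttpResponse.BodyHandlers.ofString());

        String instance = PROXY + "/consumers/agent-group/instances/agent-1";

        // 2. Subscribe the instance to the topic the other end produces to
        HttpRequest subscribe = HttpRequest.newBuilder()
                .uri(URI.create(instance + "/subscription"))
                .header("Content-Type", V2_JSON)
                .POST(HttpRequest.BodyPublishers.ofString("{\"topics\":[\"events\"]}"))
                .build();
        http.send(subscribe, HttpResponse.BodyHandlers.ofString());

        // 3. Poll for records; messages produced with a native client appear here too
        HttpRequest poll = HttpRequest.newBuilder()
                .uri(URI.create(instance + "/records"))
                .header("Accept", "application/vnd.kafka.json.v2+json")
                .GET()
                .build();
        HttpResponse<String> records = http.send(poll, HttpResponse.BodyHandlers.ofString());
        System.out.println(records.body());
    }
}
```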

Related

Send Apache Kafka messages through localtunnel.me

I am new to Apache Kafka and I'm trying to build a Python app that can handle Kafka messages. I've set up Kafka to produce and consume messages locally. Now I also want this to work non-locally, so that I can send messages from anywhere to my Python app.
My idea was to just expose the specific port that Kafka is using with Localtunnel. I thought this would simply mirror the local messages, so that I could consume them via the generated URL. But surprise, it doesn't work.
I just don't receive any messages at all. Do you have an idea why this is? Do I maybe have to configure the listeners in Kafka's server.properties first?
Thanks!
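For reference, the broker setting the question hints at is advertised.listeners in server.properties. Kafka clients bootstrap against one address and then reconnect to whatever address the broker advertises, so tunneling a single local port does not mirror traffic by itself. A sketch, with placeholder values:

```properties
# server.properties (sketch; hostname and port are placeholders)
# The address a client first connects to:
listeners=PLAINTEXT://0.0.0.0:9092
# The address the broker hands back to clients for all further traffic;
# it must be reachable from wherever the clients actually run:
advertised.listeners=PLAINTEXT://your-public-hostname:9092
```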

Build a data transformation service using Kafka Connect

Kafka Streams is good, but every configuration has to be done manually. Kafka Connect, by contrast, provides a REST API, which is very useful for handling configuration, as well as tasks, workers, etc.
Thus, I'm thinking of using Kafka Connect for my simple data-transforming service. Basically, the service will read data from one topic and send the transformed data to another topic. To do that, I would have to make a custom sink connector that sends the transformed data to a Kafka topic; however, it seems those interface functions aren't available in SinkConnector. If I could do it, that would be great, since I could manage tasks and workers via the REST API and run the tasks in distributed mode (multiple instances).
There are two options in my mind:
Figuring out how to send the message from a SinkConnector to a Kafka topic
Figuring out how to build a REST interface API like Kafka Connect that wraps the Kafka Streams app
Any ideas?
Figuring out how to send the message from a SinkConnector to a Kafka topic
A sink connector consumes data/messages from a Kafka topic. If you want to send data to a Kafka topic, you are likely talking about a source connector.
Figuring out how to build a REST interface API like Kafka Connect that wraps the Kafka Streams app
Using the kafka-connect-archetype, you can have a template to create your own Kafka connector (source or sink). In your case, where you want to build some stream-processing pipeline after the connector, you are mostly talking about a connector for another stream-processing engine that is not Kafka Streams. There are connectors for Kafka <-> Spark, Kafka <-> Flink, ...
But you can build your own using the kafka-connect-archetype template if you want. Use the MySourceTask List<SourceRecord> poll() method or the MySinkTask put(Collection<SinkRecord> records) method to process the records as a stream. They extend org.apache.kafka.connect.source.SourceTask and org.apache.kafka.connect.sink.SinkTask from Kafka Connect.
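As a sketch of option 1: nothing stops a SinkTask from opening its own producer in start() and writing the transformed value to a second topic in put(). The class name, config keys, topic names, and the toy transform below are illustrative, not part of the Connect API:

```java
import java.util.Collection;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

// Sketch: a sink task that forwards transformed records to another topic
// through its own embedded producer.
public class TransformingSinkTask extends SinkTask {
    private KafkaProducer<String, String> producer;
    private String targetTopic;

    @Override
    public void start(Map<String, String> props) {
        // Config keys are hypothetical; pass them in via the connector config.
        targetTopic = props.getOrDefault("target.topic", "transformed-events");
        Properties p = new Properties();
        p.put("bootstrap.servers", props.getOrDefault("bootstrap.servers", "localhost:9092"));
        p.put("key.serializer", StringSerializer.class.getName());
        p.put("value.serializer", StringSerializer.class.getName());
        producer = new KafkaProducer<>(p);
    }

    @Override
    public void put(Collection<SinkRecord> records) {
        for (SinkRecord record : records) {
            // Placeholder transform: replace with your real logic.
            String transformed = String.valueOf(record.value()).toUpperCase();
            producer.send(new ProducerRecord<>(targetTopic,
                    String.valueOf(record.key()), transformed));
        }
    }

    @Override
    public void stop() {
        if (producer != null) producer.close();
    }

    @Override
    public String version() {
        return "0.1";
    }
}
```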
a REST interface API like Kafka Connect that wraps the Kafka Streams app
This is exactly what ksqlDB allows you to do.
Besides creating streams and tables with SQL queries, it offers a REST API and can interact with Connect endpoints (or embed a Connect worker itself):
https://docs.ksqldb.io/en/latest/concepts/connectors/
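For example, a transforming persistent query can be submitted to ksqlDB's /ksql endpoint with one HTTP POST. A sketch in Java, where the server address, topic, and stream names are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: submitting a topic-to-topic transformation to ksqlDB over REST.
public class KsqlSubmit {
    public static void main(String[] args) throws Exception {
        // Two statements: declare the input stream, then create a persistent
        // query that writes the transformed rows to another topic.
        String statement =
                "CREATE STREAM raw_events (id VARCHAR, amount DOUBLE) " +
                "WITH (KAFKA_TOPIC='input-topic', VALUE_FORMAT='JSON'); " +
                "CREATE STREAM transformed_events WITH (KAFKA_TOPIC='output-topic') AS " +
                "SELECT id, amount * 100 AS cents FROM raw_events EMIT CHANGES;";

        String body = "{\"ksql\":\"" + statement + "\",\"streamsProperties\":{}}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://ksqldb-server:8088/ksql")) // assumed address
                .header("Content-Type", "application/vnd.ksql.v1+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```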

Connecting to topics using REST proxy

I am new to Kafka. I have implemented my consumer as a normal Java Spring Boot application. I need to connect to a topic deployed on a remote broker using the Kafka REST proxy.
I am not able to understand how it will function differently if I use the Kafka REST proxy. Where should I change my code to include the REST proxy? Do I need to structure my code completely differently, since I didn't think about the REST proxy during creation?
I may be wrong with the terminologies.
Any help or guidance would be of great help.
The REST proxy would be used with any HTTP client, not a Kafka consumer (so create a WebClient bean rather than a ConsumerFactory, etc.).
You can refer to its documentation for how to consume records over HTTP, but, simply put, the code will be completely different up until you parse the data.
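A minimal sketch of that switch, assuming a Spring WebFlux WebClient and the Confluent REST Proxy v2 URL layout (the proxy address and consumer paths are placeholders):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.function.client.WebClient;

// Instead of a ConsumerFactory, expose a WebClient pointed at the REST proxy.
@Configuration
public class RestProxyClientConfig {

    @Bean
    public WebClient restProxyClient() {
        return WebClient.builder()
                .baseUrl("http://rest-proxy:8082") // assumed proxy address
                .defaultHeader("Accept", "application/vnd.kafka.json.v2+json")
                .build();
    }
}
```

Records are then fetched with plain GET calls such as restProxyClient.get().uri("/consumers/my-group/instances/my-consumer/records").retrieve().bodyToMono(String.class), after first creating and subscribing the consumer instance over HTTP as described in the proxy documentation.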

Need help on Kafka to get data from a web API GET call

As of now, we are using curl and a GET call to get data from an outside service using its endpoint URL.
I'm planning to set up a new process; is there any way to leverage Kafka here instead of curl?
Unfortunately, we don't have the Kafka Confluent version.
Kafka doesn't perform or accept HTTP calls.
You'd have to write some HTTP scraper, then combine this with a Kafka producer.
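A sketch of that pattern in plain Java: poll the endpoint with an HTTP GET, then forward the body with a Kafka producer. The endpoint URL, broker address, and topic are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Sketch: fetch one HTTP response and publish its body to a Kafka topic.
public class HttpToKafka {
    public static void main(String[] args) throws Exception {
        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder()
                        .uri(URI.create("https://api.example.com/data")) // your endpoint URL
                        .GET()
                        .build(),
                HttpResponse.BodyHandlers.ofString());

        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("api-data", response.body()));
        }
    }
}
```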
Note: You don't need Confluent Platform just to set up and run their REST proxy next to your existing brokers.

Publish message to Kafka via HTTP

I'm new to Kafka and I'm trying to publish data from an external application via HTTP, but I cannot find a way to do this.
I already created a topic in Kafka and tested producing and consuming messages, but I don't know how to insert/publish a message via HTTP. I tried to invoke the following URL to retrieve the topics, but it does not return any data: http://servername:2181/topics/
I'm using Cloudera 5.12.1.
You can access your topics, if they were already created, using the client APIs; that is the easy way (see the client list).
Or see the Kafka Connect configuration for managing connectors over REST (the rest.host.name and rest.port parameters), but that covers only connectors.
To consume or produce messages in a topic over HTTP, use a middleware; that is more feasible.
Check out the open source Kafka REST Proxy from Confluent. It does exactly what you want.
You can get it standalone, or as part of Confluent Platform.
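With the proxy running, publishing is a single HTTP POST per its v2 API. A sketch, where the proxy address and topic are placeholders; note that port 2181 from the question is ZooKeeper, which speaks no HTTP, while the proxy listens on its own port (8082 by default):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: publishing one JSON record through the Kafka REST Proxy (v2 API).
public class RestProxyProduce {
    public static void main(String[] args) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://rest-proxy:8082/topics/my-topic")) // assumed address/topic
                .header("Content-Type", "application/vnd.kafka.json.v2+json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"records\":[{\"value\":{\"greeting\":\"hello\"}}]}"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```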