I am currently using Confluent REST Proxy 5.5.1 to collect data in production, and duplicate events can arrive. I have found a de-duplication solution using the Kafka Streams API. Is it possible to get an idempotent producer guarantee in the REST Proxy?
You can configure the REST Proxy's embedded producer client by adding settings with the producer. prefix, for example:
producer.enable.idempotence=true
However, the REST Proxy does not honor this particular setting; it logs a warning and ignores it:
Idempotent producers are not supported in Kafka REST. Ignoring 'enable.idempotence'. (io.confluent.kafkarest.KafkaRestConfig:880)
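Note that the idempotence setting only de-duplicates the producer's own internal retries; duplicates re-sent by your HTTP clients still need application-level de-duplication, such as your Kafka Streams approach. If you do need the idempotent guarantee, one option (outside the REST Proxy) is to produce those events with the native Java client, where the setting is supported. A minimal sketch, assuming a local broker and a placeholder topic name:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class IdempotentProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // De-duplicates the producer's own retries on the broker side.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", "key-1", "{\"event\":\"example\"}")); // placeholder topic
            producer.flush();
        }
    }
}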
Related
As of now, we are using curl with a GET call to fetch data from an external system via their endpoint URL.
I'm planning to set up a new process; is there any way to leverage Kafka here instead of curl?
Unfortunately, we don't have the Confluent version of Kafka.
Kafka doesn't perform or accept HTTP calls.
You'd have to write some HTTP scraper, then combine this with a Kafka producer.
Note: you don't need the full Confluent Platform just to set up and run their REST Proxy next to your existing brokers.
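A minimal sketch of that scraper-plus-producer combination in Java, using the JDK 11 HttpClient and the Kafka producer client (the endpoint URL, broker address, and topic name are placeholders):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class HttpToKafka {
    public static void main(String[] args) throws Exception {
        // 1. Fetch the data over HTTP, as the existing curl GET does (placeholder URL).
        HttpClient http = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/endpoint")).GET().build();
        String body = http.send(request, HttpResponse.BodyHandlers.ofString()).body();

        // 2. Publish the response body to a Kafka topic (placeholder broker and topic).
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("scraped-data", body));
            producer.flush();
        }
    }
}

You could run something like this on a schedule in place of the existing curl job.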
I'm new to Kafka, and I'm trying to publish data from an external application via HTTP, but I can't find a way to do this.
I already created a topic in Kafka and tested it by producing and consuming messages, but I don't know how to insert/publish messages via HTTP. I tried invoking the following URL to retrieve the topics, but it doesn't return any data: http://servername:2181/topics/
I'm using Cloudera 5.12.1.
You can access your topics, if they have already been created, using the client APIs; that is the easy way (see the list of clients).
Alternatively, see the Kafka Connect configuration to manage connectors over REST (the rest.host.name and rest.port parameters), but that only manages connectors.
To consume or produce messages on a topic over HTTP, use a middleware component in between; that is more feasible.
Check out the open source Kafka REST Proxy from Confluent. It does exactly what you want.
You can get it standalone, or as part of Confluent Platform.
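Once the REST Proxy is running, producing is a plain HTTP POST against /topics/<topic>. (The URL you tried uses port 2181, which is ZooKeeper and doesn't speak HTTP; the REST Proxy listens on its own port, 8082 by default.) A minimal sketch in Java, assuming a reachable proxy host and a placeholder topic name:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestProxyProduce {
    public static void main(String[] args) throws Exception {
        // The REST Proxy v2 API wraps messages in a "records" envelope.
        String payload = "{\"records\":[{\"value\":{\"message\":\"hello\"}}]}";

        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://restproxy-host:8082/topics/my-topic")) // placeholder host and topic
            .header("Content-Type", "application/vnd.kafka.json.v2+json")
            .POST(HttpRequest.BodyPublishers.ofString(payload))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}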
I am new to the Kafka world. We are planning to set up Kafka to fulfill our data-streaming needs. The sink in our case is a REST endpoint. What connectors are available to support Kafka => REST endpoint connectivity? This is similar to how AWS simple queues or topics work.
AFAIK there is no certified HTTP sink for Apache Kafka. Why not simply create a Kafka consumer and, for every message (or message batch), make a REST call to your service?
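A minimal sketch of that consumer-plus-REST-call loop in Java (the broker address, topic, and endpoint URL are placeholders):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaToRestSink {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "rest-sink");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        HttpClient http = HttpClient.newHttpClient();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // One REST call per message; batch here if the endpoint supports it.
                    HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/api/events"))
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(record.value()))
                        .build();
                    http.send(request, HttpResponse.BodyHandlers.ofString());
                }
            }
        }
    }
}

In a real deployment you would also want error handling and retries around the HTTP call before committing offsets.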
Nothing out of the box that I'm aware of, but akka-streams feels well suited for this use case.
The source would be a Kafka message stream created with akka-stream-kafka (aka reactive-kafka), and the sink would be the akka-http client (flow-based variant):
http://doc.akka.io/docs/akka-http/10.0.6/scala/http/client-side/request-level.html#flow-based-variant
Integration patterns for akka-streams are (starting to be) documented under the name alpakka:
http://developer.lightbend.com/docs/alpakka/current/
I have a requirement where messages arrive in JSON format from ActiveMQ, and I have to expose an endpoint where I can receive them. I am using Kafka, but I don't know whether I can use a Kafka REST API to receive JSON messages and store them in a particular topic on the Kafka broker.
There is no REST API in the core Apache Kafka distribution, but there is a very good open source REST Proxy for Kafka available from Confluent. It enables both publish and subscribe via REST APIs. The code and docs are on GitHub at https://github.com/confluentinc/kafka-rest, or you can download the entire Confluent open source platform, which includes Apache Kafka, the REST Proxy, Schema Registry, and a few other useful enhancements, at http://www.confluent.io/download/
We are trying to build a data-torrent application using Kafka as a source that will consume messages from a web UI over REST.
Can someone provide good documentation or some steps on how to set up a Kafka producer using a Java REST API?
Thanks
Check out kafka-rest.
It lets you produce messages to a Kafka topic over a REST API, in JSON or Avro. It is written in Java, and you may be able to use it out of the box if you don't have special requirements.
Alternatively, if you don't want to use the Confluent REST Proxy for Kafka, you can run a web application using a web framework that calls into a Kafka client library when an HTTP request arrives from the client.
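As a rough illustration of that alternative, here is a minimal sketch using the JDK's built-in HttpServer and the Kafka producer client; in practice you would likely use your web framework of choice, and the port, path, broker address, and topic name here are placeholders:

import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class HttpProducerEndpoint {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);

        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/publish", exchange -> {
            // Read the request body and forward it to the topic as-is.
            try (InputStream in = exchange.getRequestBody()) {
                String body = new String(in.readAllBytes(), StandardCharsets.UTF_8);
                producer.send(new ProducerRecord<>("web-events", body));
            }
            exchange.sendResponseHeaders(204, -1); // no response body
            exchange.close();
        });
        server.start();
    }
}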