How to send messages to a Kafka producer using the Vert.x event bus - apache-kafka

I am learning Kafka and Vert.x and I came across the following statements:
1. The Kafka module allows receiving events published by other Vert.x verticles and sending those events to a Kafka broker.
2. The application sends messages to the Kafka module using the Vert.x event bus.
3. The Kafka module acts as a producer.
If anyone could let me know how these are programmed, that would be very helpful. Thanks.
I found the source code here, but I am looking for a simpler example: https://github.com/zanox/mod-kafka

That Kafka module only acts as a producer, not as a consumer. Its intent is to publish messages outbound from the system. The example on the referenced GitHub link is very easy to follow if you work through the Kafka quick start at the same time: http://kafka.apache.org/081/documentation.html#quickstart
You don't have to use modules; you can use the Kafka Java client directly within your verticles. Modules are intended as a mechanism for reuse and for providing common functionality.
In Vert.x 3.0 (the next release), the module system is removed anyway.
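To illustrate the "use the Kafka Java client within your verticles" point, here is a minimal sketch of such a verticle; the event-bus address "outbound.kafka", the topic "events", and the broker address are all hypothetical placeholders:

    import io.vertx.core.AbstractVerticle;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    // Bridges the Vert.x event bus to Kafka: every message sent to the
    // (hypothetical) address "outbound.kafka" is forwarded to a Kafka topic.
    public class KafkaBridgeVerticle extends AbstractVerticle {

        private KafkaProducer<String, String> producer;

        @Override
        public void start() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // adjust to your broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            producer = new KafkaProducer<>(props);

            // Forward each event-bus message body to the "events" topic.
            vertx.eventBus().<String>consumer("outbound.kafka", msg ->
                producer.send(new ProducerRecord<>("events", msg.body())));
        }

        @Override
        public void stop() {
            producer.close();
        }
    }

Any other verticle can then publish outbound data with vertx.eventBus().send("outbound.kafka", payload), which is essentially the contract the mod-kafka module implements for you.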

Related

Kafka Streams without Sink

I'm currently planning the architecture for an application that reads from a Kafka topic and, after some conversion, puts the data into RabbitMQ.
I'm kind of new to Kafka Streams, and it looks like a good choice for my task. But the problem is that the Kafka server is hosted at another vendor's site, so I can't even install the Kafka Connect RabbitMQ sink plugin.
Is it possible to write a Kafka Streams application that doesn't have any sink points but just processes the input stream? I could just push to RabbitMQ in a foreach operation, but I'm not sure whether the stream will even work without a sink point.
foreach is a sink action, so to answer your question directly: no.
However, Kafka Streams is intended to be limited to Kafka-to-Kafka communication only.
Kafka Connect can be installed and run anywhere, if that is what you wanted to use... You can also use other Apache tools like Camel, Spark, NiFi, Flink, etc. to write to RabbitMQ after consuming from Kafka, or write any application in a language of your choice. For example, the Spring Integration or Spring Cloud Stream frameworks allow a single contract between many communication channels.
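To make the foreach point concrete, here is a minimal sketch of a Streams topology terminated by foreach; the topic name and the publishToRabbit() helper are hypothetical, and note that foreach gives you no delivery guarantees toward the external system:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;

    import java.util.Properties;

    // A topology whose terminal operation is foreach: the side effect
    // (publishing to RabbitMQ) takes the place of a to() sink.
    public class ForeachBridge {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "rabbit-bridge");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            builder.<String, String>stream("input-topic")
                   .mapValues(ForeachBridge::convert)                      // your conversion step
                   .foreach((key, value) -> publishToRabbit(key, value));  // terminal side effect

            new KafkaStreams(builder.build(), props).start();
        }

        private static String convert(String value) { return value; }      // placeholder conversion

        private static void publishToRabbit(String key, String value) { }  // hypothetical RabbitMQ publish
    }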

Build a data transformation service using Kafka Connect

Kafka Streams is good, but I have to do all of the configuration very manually. Kafka Connect, by contrast, provides an API that is very useful for handling configuration, as well as tasks, workers, etc.
So I'm thinking of using Kafka Connect for my simple data transformation service. Basically, the service will read the data from a topic and send the transformed data to another topic. To do that, I would have to make a custom sink connector that sends the transformed data to a Kafka topic; however, it seems those interface functions aren't available in SinkConnector. If I could do it, that would be great, since I could manage tasks and workers via the REST API and run the tasks in distributed mode (multiple instances).
There are 2 options in my mind:
1. Figuring out how to send the message from a SinkConnector to a Kafka topic
2. Figuring out how to build a REST interface API like Kafka Connect's that wraps up a Kafka Streams app
Any ideas?
Figuring out how to send the message from a SinkConnector to a Kafka topic
A sink connector consumes data/messages from a Kafka topic. If you want to send data to a Kafka topic, you are likely talking about a source connector.
Figuring out how to build a REST interface API like Kafka Connect's that wraps up a Kafka Streams app
Using the kafka-connect-archetype you can get a template for creating your own Kafka connector (source or sink). In your case, where you want some stream-processing pipeline after the connector, you are mostly talking about a connector for another stream-processing engine that is not Kafka Streams. There are connectors for Kafka <-> Spark, Kafka <-> Flink, ...
But you can build yours using the kafka-connect-archetype template if you want. Use the MySourceTask List<SourceRecord> poll() method or the MySinkTask put(Collection<SinkRecord> records) method to process the records as a stream. These extend org.apache.kafka.connect.source.SourceTask and org.apache.kafka.connect.sink.SinkTask from Kafka Connect.
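As a rough illustration of the sink side, here is a minimal SinkTask skeleton along the lines of what the archetype generates; the class name and the transform() helper are hypothetical, and a real sink would deliver the result to an external system inside put():

    import org.apache.kafka.connect.sink.SinkRecord;
    import org.apache.kafka.connect.sink.SinkTask;

    import java.util.Collection;
    import java.util.Map;

    // Skeleton of a sink task: Connect hands each batch of consumed
    // records to put(), which is where per-record processing happens.
    public class MySinkTask extends SinkTask {

        @Override
        public void start(Map<String, String> config) {
            // read connector configuration here
        }

        @Override
        public void put(Collection<SinkRecord> records) {
            for (SinkRecord record : records) {
                Object transformed = transform(record.value());
                // deliver `transformed` to the external target system here
            }
        }

        private Object transform(Object value) { return value; } // placeholder transformation

        @Override
        public void stop() { }

        @Override
        public String version() { return "0.1.0"; }
    }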
a REST interface API like Kafka Connect's that wraps up a Kafka Streams app
This is exactly what ksqlDB allows you to do.
Beyond creating streams and tables with SQL queries, it offers a REST API and can interact with Connect endpoints (or embed a Connect worker itself).
https://docs.ksqldb.io/en/latest/concepts/connectors/

Nest.JS CQRS: Is there a way to make the Command Bus and Queue external by using Kafka or RabbitMQ under the hood?

I am trying to use the https://github.com/nestjs/cqrs Nest.JS module. It seems impossible to make the CommandBus and QueryBus implementations external, e.g. using Kafka or RabbitMQ under the hood, and to share command handling between many microservices. Am I wrong?
Thanks.
Kafka support was added in 6.7.0
https://github.com/nestjs/nest/issues/2361
As documented, the command bus is observable, so you could subscribe to the events and forward them to a Kafka producer or consumer as needed.

Kafka on Nest.JS

From the official Nest.JS docs we can see that:
We don't support streaming platforms with log-based persistence, such as Kafka or NATS Streaming, because they have been created to solve a different range of issues.
However, you can see here how to create a microservice with NATS, https://docs.nestjs.com/microservices/nats, which theoretically is not supported, as stated above.
I would like to use Kafka with Nest. Is it / Will it be supported?
Thank you in advance!
Kafka is a message-broker ecosystem that touches your whole codebase and the other actors of your project.
All of the microservice transports mentioned in the Basics docs (TCP, Redis in-memory cache, MQTT, NATS, gRPC) are abstracted in the Nest.js core scaffold.
If you want to use Kafka or RabbitMQ, you may need to restructure your project and split it into separate applications that work together via messages and message queues.

Implement Kafka Streams Processor in .Net?

Is that possible?
The official .Net client confluent-kafka-dotnet only seems to provide consumer and producer functionality.
And (from what I remember of looking into Kafka Streams quite a while back) I believe Kafka Streams processors always run on the JVMs that run Kafka itself. In that case, it would be impossible in principle.
Yes, it is possible to re-implement Apache Kafka's Streams client library (a Java library) in .NET. But at the moment no such ready-to-use Kafka Streams implementation exists for .NET.
And (from what I remember of looking into Kafka Streams quite a while back) I believe Kafka Streams processors always run on the JVMs that run Kafka itself. In that case, it would be impossible in principle.
No, Kafka Streams "processors", as you call them, do not run in (the JVMs of) the Kafka brokers; that would be server-side.
Instead, the Kafka Streams client library is used to implement client-side Java/Scala/Clojure/... applications for stream processing. These applications talk to the Kafka brokers (which form the Kafka cluster) over the network.
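A trivial sketch makes this client-side nature visible: the following is an ordinary standalone Java application, and its only connection to the Kafka cluster is the bootstrap.servers setting (the topic names and broker address are placeholders):

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;

    import java.util.Properties;

    // An ordinary standalone Java application; it runs wherever you start
    // it and talks to the Kafka brokers only over the network.
    public class ClientSideApp {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "client-side-demo");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // the only link to the cluster
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            builder.<String, String>stream("in").to("out"); // trivial copy topology

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }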
As of May 2020, there seems to be a project in the making to support Kafka Streams in .NET:
https://github.com/LGouellec/kafka-stream-net
As per their roadmap, they are now in early beta and intend to get to v1 by the end of the year or the beginning of next.