From the official Nest.js docs we can see that:
"We don't support streaming platforms with log-based persistence, such as Kafka or NATS streaming, because they have been created to solve a different range of issues."
However, you can see here how to create a microservice with NATS (https://docs.nestjs.com/microservices/nats), which theoretically is not supported, as seen above.
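For reference, the linked page boils down to something like this minimal sketch (the server address is a placeholder, and older Nest versions took a `url` option instead of `servers`):

```typescript
// main.ts - bootstrapping a NestJS microservice over NATS (sketch based on the linked docs)
import { NestFactory } from '@nestjs/core';
import { Transport, MicroserviceOptions } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule, {
    transport: Transport.NATS,
    options: {
      servers: ['nats://localhost:4222'], // placeholder; older versions used `url` instead
    },
  });
  await app.listen();
}
bootstrap();
```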
I would like to use Kafka with Nest. Is it / Will it be supported?
Thank you in advance!
Kafka is not just a message broker but an ecosystem that ends up containing your whole codebase and the other actors in your project. All the microservice transports mentioned in the Basics section (TCP, Redis, MQTT, NATS, gRPC) can be abstracted inside the Nest.js core scaffold. If you want to use Kafka or RabbitMQ, you may have to restructure your project and split it into separate applications that work together through messages and message queues.
Related
I'm currently planning the architecture for an application that reads from a Kafka topic and, after some conversion, puts data into RabbitMQ.
I'm kind of new to Kafka Streams and it looks like a good choice for my task. But the problem is that the Kafka server is hosted at another vendor's site, so I can't even install the Kafka Connect RabbitMQ sink plugin.
Is it possible to write a Kafka Streams application that doesn't have any sink points, but just processes the input stream? I could just push to RabbitMQ in a foreach operation, but I'm not sure whether the stream will even work without a sink point.
foreach is a sink action, so to answer your question directly: no.
However, Kafka Streams is intended to communicate only with Kafka, not with external systems.
Kafka Connect can be installed and run anywhere, if that is what you wanted to use... You can also use other Apache tools like Camel, Spark, NiFi, Flink, etc. to write to RabbitMQ after consuming from Kafka, or write any application in a language of your choice. For example, the Spring Integration or Spring Cloud Stream frameworks allow a single contract between many communication channels.
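To illustrate the "any application in a language of your choice" option, here is a minimal TypeScript sketch that consumes from Kafka and republishes to RabbitMQ; the kafkajs and amqplib libraries, addresses, and names are my assumptions, not part of the original setup:

```typescript
// kafka-to-rabbitmq.ts - plain consumer bridge, no Kafka Streams or Connect required
import { Kafka } from 'kafkajs';
import * as amqp from 'amqplib';

async function run() {
  const kafka = new Kafka({ clientId: 'bridge', brokers: ['vendor-kafka:9092'] }); // placeholder broker
  const consumer = kafka.consumer({ groupId: 'rabbitmq-bridge' });

  const connection = await amqp.connect('amqp://localhost'); // placeholder RabbitMQ address
  const channel = await connection.createChannel();
  await channel.assertQueue('processed-events');

  await consumer.connect();
  await consumer.subscribe({ topics: ['input-topic'] });

  // Forward every Kafka record to the RabbitMQ queue
  await consumer.run({
    eachMessage: async ({ message }) => {
      channel.sendToQueue('processed-events', message.value ?? Buffer.alloc(0));
    },
  });
}

run().catch(console.error);
```

Any conversion logic would go inside eachMessage before the sendToQueue call.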
I am trying to use the https://github.com/nestjs/cqrs Nest.js module. It seems impossible to make the CommandBus and QueryBus implementations external, e.g. using Kafka or RabbitMQ under the hood, and to share command handling between many microservices. Am I wrong?
Thanks.
Kafka support was added in 6.7.0
https://github.com/nestjs/nest/issues/2361
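With that release, Kafka became a regular transport option; a minimal sketch of the server side (the broker address and group id are placeholders):

```typescript
// main.ts - a NestJS microservice using the Kafka transport added in 6.7.0 (sketch)
import { NestFactory } from '@nestjs/core';
import { Transport, MicroserviceOptions } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule, {
    transport: Transport.KAFKA,
    options: {
      client: { brokers: ['localhost:9092'] },   // placeholder broker list
      consumer: { groupId: 'my-service-group' }, // placeholder consumer group
    },
  });
  await app.listen();
}
bootstrap();
```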
As documented, the command bus is observable, so you could consume the events and forward them to a Kafka producer or consumer as needed.
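A rough sketch of that idea, forwarding every event published on the bus to a Kafka topic; the provider wiring, injection token, and topic name are illustrative assumptions, not a documented pattern:

```typescript
// events-to-kafka.service.ts - bridge CQRS events onto Kafka (illustrative sketch)
import { Inject, Injectable, OnModuleInit } from '@nestjs/common';
import { EventBus } from '@nestjs/cqrs';
import { ClientKafka } from '@nestjs/microservices';

@Injectable()
export class EventsToKafkaService implements OnModuleInit {
  constructor(
    private readonly eventBus: EventBus,
    // assumed to be registered via ClientsModule.register with Transport.KAFKA
    @Inject('KAFKA_CLIENT') private readonly kafka: ClientKafka,
  ) {}

  onModuleInit() {
    // The bus is observable, so we can subscribe and forward each event
    this.eventBus.subscribe((event) => {
      this.kafka.emit('domain-events', JSON.stringify(event)); // hypothetical topic
    });
  }
}
```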
As far as I know, Apache Kafka is an asynchronous messaging platform, whereas Apache Camel is a platform implementing the enterprise integration patterns.
So, what are the practical differences between Apache Camel and Apache Kafka? We planned to implement the system with Apache Camel, which is relatively easy, but our customer wanted Apache Kafka instead, without a rationale.
What would be the advantages of choosing Apache Kafka to implement message queue functionality that could be implemented with Apache Camel as well? I'm concerned Kafka would just introduce unnecessary overhead to the project. Are we comparing apples and oranges?
What we need are straightforward APIs to set up and use clustered message queues. Our initial plan was to use Camel to consume/produce on clustered JMS or ActiveMQ queues. How would Kafka make this task easier? The application itself would run on a WebLogic server in either case.
The messaging would be point-to-point, where there are multiple instances of the same service running, but only one instance should process each message and emit the result according to the load-balancing policy. The message queues are also clustered, so neither the failure of a service instance nor of a queue instance is a single point of failure.
Camel and Kafka are totally different things. In many use cases, Camel is just used as a client of Kafka/ActiveMQ/etc.
Kafka and ActiveMQ are similar, but still different things; refer to What is the difference between Apache Kafka vs ActiveMQ. Kafka has higher throughput, and its data is always on disk, so it is a little more reliable than ActiveMQ.
Kafka is usually used for real-time data streaming, while ActiveMQ is in general mainly used for integration between applications, as the books say. But in most real-world cases, Kafka and ActiveMQ can replace each other easily.
It is very hard to compare the two. They do not cover the same areas of work, but there are systems where you can replace one with the other.
So, very briefly:
Kafka is a messaging platform with streaming abilities for processing messages (Apache Kafka).
Camel is an ETL framework; it can transform messages/events/data from "any" input endpoint (see the endpoint list in the Camel docs) and send it to "any" output (Apache Camel - Enterprise Integration Patterns).
You may use Camel without Kafka at all, and vice versa. But there are of course possibilities for successfully using both together.
Case 1: You process mail and store it in a PostgreSQL DB. Kafka is useless here.
Case 2: You process messages from ActiveMQ and send them to Kafka. You may use both.
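As for the point-to-point, load-balanced consumption described in the question: Kafka expresses that with consumer groups, where each message is handled by exactly one instance in the group. A minimal sketch using the kafkajs client (the library choice, broker address, and names are assumptions):

```typescript
// worker.ts - run several copies of this process; each message goes to only one of them
import { Kafka } from 'kafkajs';

const kafka = new Kafka({ clientId: 'task-worker', brokers: ['localhost:9092'] }); // placeholder

async function run() {
  // All instances share the same groupId, so Kafka load-balances
  // the topic's partitions (and thus the messages) across them.
  const consumer = kafka.consumer({ groupId: 'task-service' });
  await consumer.connect();
  await consumer.subscribe({ topics: ['tasks'] });
  await consumer.run({
    eachMessage: async ({ partition, message }) => {
      console.log(`handled partition ${partition}:`, message.value?.toString());
    },
  });
}

run().catch(console.error);
```

If one instance fails, Kafka rebalances its partitions to the surviving members of the group, which also covers the no-SPOF requirement.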
I have just started learning Kafka, and I am trying to build a social media web application. I am fairly clear on how to use Kafka for my backend (communicating from the backend to databases and other services).
However, I am not sure how the frontend should communicate with the backend. I was considering an architecture like: Frontend -> Kafka -> Backend.
The frontend acts as the producer and the backend as the consumer. In this case, the frontend would supposedly have all the resources required to publish to the Kafka broker (even if I implement security on Kafka). Now, is this scenario possible:
Let's say I impersonate the frontend and send absurd/invalid messages to my Kafka broker. I can handle and filter these messages when they reach my backend. But I know that Kafka stores these messages temporarily. Wouldn't my Kafka server face DDoS problems if such "fake" messages were published to it in high volume, since it is going to store them anyway, as they don't get filtered out until they are actually consumed by the backend?
If so, how can I prevent this?
Or is this not a good option? I could also use REST for frontend/backend communication, with Kafka used from the backend to communicate with the database(s) and other services.
Or I could have a middleware (again, REST) that detects and filters out such messages.
The easiest way is to have the frontend produce to the Kafka REST Proxy.
See details here https://docs.confluent.io/1.0/kafka-rest/docs/intro.html
That way no Kafka client code is required in your frontend, and you can use HTTP(S) with standard off-the-shelf load balancers and API management tools.
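For illustration, producing through the REST Proxy from frontend code is a plain HTTP call. A sketch (the URL and topic are placeholders; the content type follows the REST Proxy v2 API, while the linked 1.0 docs use the v1 equivalent):

```typescript
// Publish an event to Kafka via the Confluent REST Proxy (sketch)
async function publishEvent(event: { userId: string; action: string }): Promise<void> {
  const response = await fetch('https://rest-proxy.example.com/topics/frontend-events', {
    method: 'POST',
    headers: { 'Content-Type': 'application/vnd.kafka.json.v2+json' },
    body: JSON.stringify({ records: [{ value: event }] }),
  });
  if (!response.ok) {
    throw new Error(`REST Proxy returned ${response.status}`);
  }
}
```

The API management layer in front of the proxy is then also the natural place to authenticate clients and throttle the kind of high-volume fake traffic the question worries about.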
Could you not consider the other direction: using Kafka as a transport system for updating assets available to the frontend? This has been proposed for hybrid React/NodeJS/Express solutions.
I'm trying to connect to a Universal Messaging queue (by Software AG) via Akka Streams. I have looked at the Akka Streams docs regarding the Camel integration, but I'm struggling to understand how the components fit together. For instance, do I have to use ActiveMQ as a broker?
I have previously set up a connection via MQTT (and Spark's MQTTUtils), but since I want to try out Akka, I don't think MQTT over TCP is necessary. [It is recommended](http://tech.forums.softwareag.com/techjforum/posts/list/55887.page) that I use JMS instead of another protocol, especially with third-party tools. Hence my question regarding the proper setup of Akka Streams to UM via JMS.
The recommendation is to move away from Camel and leverage a fully reactive, Akka Streams-based solution. The Alpakka project was born to collect Akka Streams-compatible connectors in one bundle.
It currently contains a JMS connector (as well as AMQP and MQTT connectors). Further info:
Official launch article
Alpakka home
JMS connector docs