One JMS message copied to two queues

How do I configure ActiveMQ so that a JMS message published to a topic is passed on to two JMS queues?
Is this possible in ActiveMQ?
Or is it better to use a simple topic with two subscribers, each picking up its own copy of the message?

Instead of trying to configure the broker to do this, you are better off using Apache Camel to create the routing behaviour you are looking for. Camel routes can be embedded in your ActiveMQ instance.
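For illustration, a minimal Camel route in the Java DSL that fans a topic out to two queues could look like the sketch below; the topic and queue names are assumptions, and the activemq component has to be configured against your broker:

```java
import org.apache.camel.builder.RouteBuilder;

public class TopicToQueuesRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Consume from the topic and deliver a copy of each message to both queues.
        from("activemq:topic:orders.incoming")            // hypothetical topic name
            .multicast()
            .to("activemq:queue:orders.audit",            // hypothetical queue names
                "activemq:queue:orders.processing");
    }
}
```

Such a route can run in the broker's own Camel context (camel.xml) or in a standalone Camel application connected to the broker.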

Related

Configure Kafka consumer with EJB? Is it possible to configure an MDB for Kafka with EJB?

I want to replace an existing JMS message consumer in EJB with an Apache Kafka message consumer. I am not able to figure out how to configure an Apache Kafka consumer with an EJB configuration.
Kafka is not a straight messaging solution (comparable to, say, RabbitMQ), but it can be used as one.
You will need to translate your JMS concepts (topics, queues) into Kafka topics (which are closer to JMS topics).
Also, given that consumers have a configurable/storable start offset, you will need to define these policies for your consumers.
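As a rough sketch of that last point, a plain Kafka Java consumer with an explicit start-offset policy might look like this; the broker address, group id, and topic name are placeholders:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class OrdersConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed broker address
        props.put("group.id", "orders-consumer");          // hypothetical consumer group
        // Start-offset policy: "earliest" replays the topic from the beginning,
        // "latest" only reads records produced after the consumer joins.
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));          // hypothetical topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}
```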

Kafka JMS Source Connector: write messages to multiple topics

I have an ActiveMQ Artemis JMS queue and a Kafka Source Connector. I want to write messages from this queue to multiple topics in parallel. I found that a Single Message Transform could be the solution. I tried to configure RegexRouter, but it can only change the name of the topic.
I tried to create 3 connector instances, but only one receives messages. I guess this is because a message is deleted from the queue at the first read.
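For reference, a RegexRouter configuration of the kind described above only rewrites the destination topic name of each record; it does not copy records to several topics (the transform alias below is illustrative):

```properties
transforms=route
transforms.route.type=org.apache.kafka.connect.transforms.RegexRouter
# Matches the record's original destination topic and renames it;
# each record still ends up in exactly one topic.
transforms.route.regex=(.*)
transforms.route.replacement=renamed-$1
```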

Apache Camel vs Apache Kafka [duplicate]

This question already has answers here: Difference Between Apache Kafka and Camel (Broker vs Integration).
As far as I know, Apache Kafka is an asynchronous messaging platform, whereas Apache Camel is a platform implementing the enterprise integration patterns.
So, what are the practical differences between Apache Camel and Apache Kafka? We planned to implement the system with Apache Camel, which is relatively easy, but our customer wanted Apache Kafka instead, without giving a rationale.
What would be the advantages of choosing Apache Kafka to implement message queue functionality that could be implemented with Apache Camel as well? I'm concerned Kafka would just introduce unnecessary overhead to the project. Are we comparing apples and oranges?
What we need is a straightforward API to set up and use clustered message queues. Our initial plan was to use Camel to consume/produce on clustered JMS or ActiveMQ queues. How would Kafka make this task easier? The application itself would run on a WebLogic server in either case.
The messaging would be point-to-point, where there are multiple instances of the same service running, but only one instance should process the message and emit the result according to the load-balancing policy. The message queues are also clustered, so neither the failure of a service instance nor of a queue instance is a SPOF.
Camel and Kafka are totally different things. In many use cases, Camel is just used as a client of Kafka/ActiveMQ/... .
Kafka and ActiveMQ are similar, but also different things; see "What is the difference between Apache Kafka vs ActiveMQ". Kafka has higher throughput and keeps data on disk, so it is a little more reliable than ActiveMQ.
Kafka is usually used for real-time data streaming, while ActiveMQ is mainly used for integration between applications, at least according to the books. But in most real-world cases, Kafka and ActiveMQ can replace each other easily.
It is very hard to compare the two. They do not cover the same areas of work, but there exist systems where you can replace one with the other.
So, very briefly:
Kafka is a messaging platform with streaming capabilities for processing messages (Apache Kafka).
Camel is an ETL framework: it can transform messages/events/data from "any" input endpoint (see the endpoint list provided by Camel) and send them to "any" output (Apache Camel - Enterprise Integration Patterns).
You may use Camel without Kafka at all, and vice versa. But there are of course possibilities to successfully use both together.
Case 1. You process mail and store it in a PostgreSQL DB. Kafka is useless here.
Case 2. You process messages from ActiveMQ and send them to Kafka. You may use both (see the sketch below).
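As an illustration of the second case, a small Camel route in the Java DSL bridging ActiveMQ to Kafka could look roughly like this; the queue name, topic name, and broker address are assumptions, and the camel-kafka component must be on the classpath:

```java
import org.apache.camel.builder.RouteBuilder;

public class ActiveMqToKafkaRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Consume from an ActiveMQ queue and publish each message to a Kafka topic.
        from("activemq:queue:orders")                              // hypothetical queue name
            .to("kafka:orders-events?brokers=localhost:9092");     // hypothetical topic and broker
    }
}
```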

Connect Akka Streams to JMS

I'm trying to connect to a Universal Messaging queue (by Software AG) via Akka Streams. I have looked at the Akka Streams documentation regarding the Camel integration, but I'm struggling to understand how the components fit together. For instance, do I have to use ActiveMQ as a broker?
I have previously set up a connection via MQTT (and Spark's MQTTUtils), but since I want to try out Akka, I don't think MQTT via TCP is necessary. It is recommended (http://tech.forums.softwareag.com/techjforum/posts/list/55887.page) that I use JMS instead of another protocol, especially with third-party tools. Hence my question regarding the proper setup of Akka Streams to UM via JMS.
The recommendation is to move away from Camel and leverage a fully reactive, akka-streams based solution. The Alpakka project was born to collect akka-streams compatible connectors in one bundle.
It currently contains a JMS connector (as well as AMQP and MQTT connectors). Further info:
Official launch article
Alpakka home
JMS connector docs
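As a rough sketch only (Alpakka API details vary between versions, and the ActiveMQ connection factory below is just a stand-in for the JMS ConnectionFactory shipped with the Universal Messaging client libraries), consuming text messages from a JMS queue with the Alpakka JMS connector looks roughly like this:

```java
import akka.actor.ActorSystem;
import akka.stream.alpakka.jms.JmsConsumerSettings;
import akka.stream.alpakka.jms.javadsl.JmsConsumer;
import akka.stream.javadsl.Sink;
import akka.stream.javadsl.Source;
import javax.jms.ConnectionFactory;
import org.apache.activemq.ActiveMQConnectionFactory;

public class JmsToAkkaStreams {
    public static void main(String[] args) {
        ActorSystem system = ActorSystem.create("jms-sample");

        // Stand-in factory: with Universal Messaging you would instead use the
        // JMS ConnectionFactory from the Software AG client jars (or JNDI).
        ConnectionFactory connectionFactory =
            new ActiveMQConnectionFactory("tcp://localhost:61616");

        Source<String, ?> jmsSource = JmsConsumer.textSource(
            JmsConsumerSettings.create(system, connectionFactory)
                .withQueue("my.queue"));                 // hypothetical queue name

        // Print each message; any Akka Streams processing stages could go here.
        jmsSource.runWith(Sink.foreach(System.out::println), system);
    }
}
```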

How to send messages to a Kafka producer using the Vert.x event bus

I am learning Kafka and Vert.x and I came across the following statements:
1. The Kafka module allows receiving events published by other Vert.x verticles and sending those events to a Kafka broker.
2. The application sends messages to the Kafka module using the Vert.x event bus.
3. The Kafka module acts as a producer.
Could anyone let me know how these are programmed? That would be very helpful. Thanks.
I found the source code here, but I am looking for a simpler example: https://github.com/zanox/mod-kafka
That Kafka module only acts as a producer, not as a consumer, so its intent is to publish messages outbound from the system. The example at the referenced GitHub link is very easy to follow if you work through the Kafka quick start at the same time: http://kafka.apache.org/081/documentation.html#quickstart
You don't have to use modules; you can use the Kafka Java client within your verticles. Modules are intended as a mechanism for reuse and providing common functionality.
In Vert.x 3.0 (the next release), the module system is removed anyway.
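To make that concrete, a rough Vert.x 3 sketch (not the mod-kafka module itself) of a verticle that forwards event-bus messages to Kafka with the plain Kafka Java client could look like the following; the event-bus address, topic name, and broker address are placeholders:

```java
import io.vertx.core.AbstractVerticle;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaBridgeVerticle extends AbstractVerticle {

    private KafkaProducer<String, String> producer;

    @Override
    public void start() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);

        // Receive events published by other verticles on the event bus
        // and forward each message body to a Kafka topic.
        vertx.eventBus().<String>consumer("outbound.kafka", message ->      // hypothetical address
            producer.send(new ProducerRecord<>("events", message.body()))); // hypothetical topic
    }

    @Override
    public void stop() {
        producer.close();
    }
}
```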