Integration between IBM MQ Series and ActiveMQ Artemis at enterprise level

We are currently using IBM MQ for messaging and moving to ActiveMQ, but the challenge is that a few producers will continue to use IBM MQ for some time, whereas the consumers are ready to migrate.
Is there a way we can bridge IBM MQ and ActiveMQ Artemis, so that any message that arrives on an IBM MQ queue is automatically replicated to ActiveMQ and the consumer picks it up from there (and the same in the reverse direction)?
Implementing a new service which consumes from IBM MQ and puts onto ActiveMQ does not seem feasible, as there is a huge list of services and this redundant work is not practical at enterprise level.
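One pattern that avoids touching every existing service is a standalone bridge process that consumes from one broker and republishes to the other, for example an Apache Camel route with a JMS endpoint on each side. The sketch below is only an illustration of that idea, not something stated in the question: the queue names (ORDERS.IN, ORDERS.OUT), hostnames, ports, queue manager, and channel are all placeholders.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.jms.JmsComponent;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.activemq.artemis.jms.client.ActiveMQJMSConnectionFactory;
import com.ibm.mq.jms.MQConnectionFactory;
import com.ibm.msg.client.wmq.WMQConstants;

public class MqToArtemisBridge {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();

        // IBM MQ side -- host, port, queue manager and channel are placeholders
        MQConnectionFactory mqCf = new MQConnectionFactory();
        mqCf.setHostName("mq.example.com");
        mqCf.setPort(1414);
        mqCf.setQueueManager("QM1");
        mqCf.setChannel("DEV.APP.SVRCONN");
        mqCf.setTransportType(WMQConstants.WMQ_CM_CLIENT);

        // ActiveMQ Artemis side -- broker URL is a placeholder
        ActiveMQJMSConnectionFactory artemisCf =
                new ActiveMQJMSConnectionFactory("tcp://artemis.example.com:61616");

        context.addComponent("ibmmq", JmsComponent.jmsComponentAutoAcknowledge(mqCf));
        context.addComponent("artemis", JmsComponent.jmsComponentAutoAcknowledge(artemisCf));

        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // forward everything arriving on the IBM MQ queue to Artemis
                from("ibmmq:queue:ORDERS.IN").to("artemis:queue:ORDERS.IN");
                // and the reverse direction
                from("artemis:queue:ORDERS.OUT").to("ibmmq:queue:ORDERS.OUT");
            }
        });

        context.start();
        Thread.sleep(Long.MAX_VALUE); // keep the bridge running
    }
}
```

One such bridge process can carry many queues, so only the bridge needs to know about both brokers and the per-service rework the question worries about is avoided.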

Related

Triggering a Kubernetes job for a Kafka message

I have a kubernetes service that only does something when it consumes a message from a Kafka queue. The queue does not have messages very often, and running the service as a job triggered whenever a message is found would save resources.
I see that Kubernetes has this functionality for AMQP-type message services: https://kubernetes.io/docs/tasks/job/coarse-parallel-processing-work-queue/
Is there a way to adapt this for Kafka, given that Kafka does not support AMQP? I'd switch to a different messaging system, but I have other services that also read from this queue that require Kafka.
That Kafka consumer Service is all you really need. If you want to save resources, it can be paired with the KEDA autoscaler so that it scales up and down depending on load or consumer-group lag.
Or you can use a serverless platform such as Knative to trigger based on Kafka (or other messaging system) events.
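For reference, the consumer Service itself does not need anything special for this; a plain consumer-group subscriber is enough, and an autoscaler such as KEDA can then scale the Deployment based on the lag of that group. A minimal sketch, where the topic name jobs, the group id job-workers, and the broker address are assumptions:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class JobConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");   // placeholder broker address
        props.put("group.id", "job-workers");           // lag on this group is what an autoscaler would watch
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("jobs"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // do the actual work for each message here
                    System.out.println("processing " + record.value());
                }
            }
        }
    }
}
```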
Regarding "Kafka does not support AMQP": Kafka Connect should be able to bridge AMQP into Kafka. For example, Apache Camel has connectors for both.
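A rough sketch of what such a Camel-based bridge could look like, assuming the camel-amqp and camel-kafka components are on the classpath and configured for the actual brokers; the queue name, topic name, and broker address below are placeholders:

```java
import org.apache.camel.builder.RouteBuilder;

// Illustrative route: consume from an AMQP queue and republish on a Kafka topic.
public class AmqpToKafkaRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("amqp:queue:jobs")
            .to("kafka:jobs?brokers=localhost:9092");
    }
}
```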

Configure ActiveMQ Artemis message redelivery on the client side

I wonder if it is possible to configure message redelivery on the client side. I have read the ActiveMQ Artemis docs and have not found any information about this feature, so I have concluded that there is no way to configure message redelivery on the client side and that the only place to configure it is the broker.xml file. Am I right about that?
By the way, I can configure the connection to ActiveMQ Artemis by using broker URL params or application.yml, since I am using Spring Boot 2.x.
ActiveMQ Artemis supports AMQP, STOMP, MQTT, OpenWire, etc. Many clients exist for these protocols written in lots of different languages across all kinds of platforms. Whether or not a given client supports client-side redelivery is really up to the client itself. You don't specify which client you're using so it's impossible to give you a specific yes/no answer.
However, I can say that ActiveMQ Artemis ships a JMS client implementation which uses the core protocol, and that client does not support client-side redelivery. The OpenWire JMS client shipped with ActiveMQ "Classic", on the other hand, does support client-side redelivery, and it can be used with ActiveMQ Artemis as well.
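For example, with the ActiveMQ "Classic" OpenWire JMS client pointed at an Artemis broker that has an OpenWire acceptor enabled, client-side redelivery is tuned through the factory's RedeliveryPolicy. A sketch, with the broker URL and policy values chosen arbitrarily:

```java
import javax.jms.Connection;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.RedeliveryPolicy;

public class ClientSideRedelivery {
    public static void main(String[] args) throws Exception {
        // OpenWire client from ActiveMQ "Classic", connecting to an Artemis OpenWire acceptor
        ActiveMQConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");

        RedeliveryPolicy policy = factory.getRedeliveryPolicy();
        policy.setInitialRedeliveryDelay(1000);   // wait 1s before the first redelivery
        policy.setUseExponentialBackOff(true);
        policy.setBackOffMultiplier(2.0);
        policy.setMaximumRedeliveries(5);         // stop after 5 redelivery attempts

        Connection connection = factory.createConnection();
        connection.start();

        // Client-side redelivery applies when a transacted session rolls back instead of committing.
        Session session = connection.createSession(true, Session.SESSION_TRANSACTED);
        // ... create a consumer, process messages, call session.commit() or session.rollback()
    }
}
```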

ActiveMQ Browser setup for ActiveMQ Artemis

I am trying to install ActiveMQ Browser, and I want to connect it to my ActiveMQ Artemis server. How do we configure that?
I assume you're talking about this ActiveMQ Browser GUI tool.
If that assumption is correct then there's no way to integrate it with ActiveMQ Artemis as it's hard-coded to use the specific JMX management beans from ActiveMQ 5.x.
I recommend you use the ActiveMQ Artemis web console. It has a rich set of functionality that should cover most of the use-cases you're interested in. Among other things, it will allow you to:
Send new messages to addresses.
Delete messages.
Move messages to another address.
Create or delete addresses & queues.
Shut down the broker.
etc.

Apache Camel vs Apache Kafka [duplicate]

As far as I know, Apache Kafka is an asynchronous messaging platform, whereas Apache Camel is a platform implementing the enterprise integration patterns.
So, what are the practical differences between Apache Camel and Apache Kafka? We planned to implement the system with Apache Camel, which is relatively easy, but our customer wanted Apache Kafka instead, without giving a rationale.
What would be the advantages of choosing Apache Kafka to implement message queue functionality that could be implemented with Apache Camel as well? I'm concerned Kafka would just introduce unnecessary overhead to the project. Are we comparing apples and oranges?
What we need are straightforward APIs to set up and use clustered message queues. Our initial plan was to use Camel to consume/produce on clustered JMS or ActiveMQ queues. How would Kafka make this task easier? The application itself would run on a WebLogic server in either case.
The messaging would be point-to-point, where there are multiple instances of the same service running but only one instance should process each message and emit the result according to the load-balancing policy. The message queues are also clustered, so neither the failure of a service instance nor of a queue instance is a SPOF.
Camel and Kafka are totally different things. In many use cases, Camel is just used as a client of Kafka/ActiveMQ/etc.
Kafka and ActiveMQ are similar, yet still different things; see "What is the difference between Apache Kafka vs ActiveMQ". Kafka has higher throughput, and data is always on disk, so it is a little more reliable than ActiveMQ.
Kafka is usually used for real-time data streaming, while ActiveMQ is mainly used for integration between applications, or so the books say. But in most real-world cases Kafka and ActiveMQ can replace each other easily.
It is very hard to compare the two. They do not cover the same areas of work, but there are some systems where you can replace one with the other.
So, very briefly:
Kafka is a messaging platform with streaming abilities to process messages (Apache Kafka).
Camel is an ETL framework: it can transform messages/events/data from "any" input point (see the endpoint list supported by Camel) and send it to "any" output (Apache Camel - Enterprise Integration Patterns).
You may use Camel without Kafka at all, and vice versa, but there are of course ways to use both together successfully.
Case 1. You process mail and store it in a PostgreSQL DB. Kafka is useless here.
Case 2. You process messages from ActiveMQ and send them to Kafka. You may use both.
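To make Case 2 concrete, a minimal Camel route could look like the sketch below; the queue name, topic name, and broker address are made up, and the activemq and kafka components would need to be configured for the real brokers:

```java
import org.apache.camel.builder.RouteBuilder;

// Case 2 sketch: read messages from an ActiveMQ queue and forward them to a Kafka topic.
public class ActiveMqToKafkaRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("activemq:queue:orders")
            .to("kafka:orders?brokers=localhost:9092");
    }
}
```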

Message from Apache Kafka to IBM MQ using IBM Integration bus

Since IIB v10.0.0.7 I can use the KafkaConsumer node to receive messages that were published on a Kafka topic.
I need some client which will be able to receive messages from Kafka and put them on IBM MQ, and to get messages from IBM MQ and publish them to a Kafka topic. I already have IIB and IBM MQ. Kafka is the messaging system of one of the systems being integrated.
Can I somehow put a message received from Kafka onto an IBM MQ queue using a KafkaConsumer node and an MQOutput node? Or get a message from a queue with an MQOutput node and publish it to a Kafka topic with a KafkaProducer node?
Or is it not a good idea to mix these technologies in such a way, and should I look for some other workaround?
Hi, you could use Kafka Connect connectors.
https://www.confluent.io/product/connectors/
There are community connectors for MQ.
Alternatively, if you're using IBM MessageHub, i.e. Kafka-as-a-service in the IBM Cloud, you can have an MQ-to-Kafka bridge run as a service itself.
https://console.bluemix.net/docs/services/MessageHub/messagehub088.html#bridges
I hear this question every week...
The article “Apache Kafka vs. Enterprise Service Bus (ESB)—Friends, Enemies, or Frenemies? (https://www.confluent.io/blog/apache-kafka-vs-enterprise-service-bus-esb-friends-enemies-or-frenemies/)” discusses why Kafka is not competitive but complementary to integration and messaging solutions (including IBM MQ) and how to integrate both.
IIB can write to IBM MQ, and one could use the IBM MQ source connector to write to Kafka.
https://docs.confluent.io/kafka-connect-ibmmq-source/current/
Whether to use Kafka or IIB will be use-case dependent. Kafka is your messaging platform with persistence and the ability to connect to different sources and sinks and, if needed, to enrich the messages on the fly in real/near real time.