SAP Enterprise Messaging - add new queues with listeners to existing queues at runtime - sap-cloud-platform

I have a use case around SAP Enterprise Messaging (consuming business events from S/4HANA Cloud) that needs to be multi-tenant. The approach is one queue per tenant, with each queue subscribed to multiple business events of that tenant.
Currently I have this working for a single queue with the following code. Note that all events are consumed asynchronously (non-blocking) through a listener class.
import javax.jms.*;

@Bean
public Connection getSession(MessagingServiceJmsConnectionFactory connectionFactory)
        throws JMSException, InterruptedException {
    Connection connection = connectionFactory.createConnection();
    Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
    Queue queue = session.createQueue(QUEUE_PREFIX + QUEUE);
    MessageConsumer messageConsumer = session.createConsumer(queue);
    messageConsumer.setMessageListener(new DefaultMessageListener());
    connection.start();
    Thread.sleep(5000); // crude wait for the listener to attach
    return connection;
}
The approach is to create queues in subscription callbacks through the Service Manager and have the application listen to each new queue (in addition to the existing ones) without stopping or restarting the app.
How can I get hold of the connection factory session and attach listeners to new queues dynamically with Spring Boot? Any help would be appreciated.
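For context, what I'm effectively trying to do is something like the following sketch. It uses plain javax.jms against the shared, already-started connection; DynamicQueueRegistry is my own hypothetical naming, not an SAP or Spring class, and it assumes one session per queue because a JMS session must only be used by one thread:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import javax.jms.Connection;
import javax.jms.JMSException;
import javax.jms.MessageConsumer;
import javax.jms.MessageListener;
import javax.jms.Queue;
import javax.jms.Session;

// Hypothetical helper: attaches a consumer per tenant queue at runtime,
// without stopping or restarting the application.
public class DynamicQueueRegistry {

    private final Connection connection; // started once at application startup
    private final Map<String, MessageConsumer> consumers = new ConcurrentHashMap<>();

    public DynamicQueueRegistry(Connection connection) {
        this.connection = connection;
    }

    // Called from the subscription callback once the new tenant queue exists.
    public synchronized void addQueue(String queueName, MessageListener listener) throws JMSException {
        if (consumers.containsKey(queueName)) {
            return; // already listening on this queue
        }
        // Each queue gets its own session on the shared connection; consumers
        // created on a started connection begin receiving immediately.
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Queue queue = session.createQueue(queueName);
        MessageConsumer consumer = session.createConsumer(queue);
        consumer.setMessageListener(listener);
        consumers.put(queueName, consumer);
    }

    // Called when a tenant unsubscribes.
    public synchronized void removeQueue(String queueName) throws JMSException {
        MessageConsumer consumer = consumers.remove(queueName);
        if (consumer != null) {
            consumer.close();
        }
    }
}
```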

This code doesn't look like the SAP Cloud SDK. We checked twice, and the mentioned classes are not found in our code base. We had PoC support for Enterprise Messaging but deprecated it in favor of the planned release of a library for it by the Cloud Application Programming model (CAP).
Check CAP's section on Event Handlers in Java for more details. As far as I can see from the feature overview table, CAP doesn't yet support multitenancy in the Java library; the Node.js implementation is complete in that sense.
In the SDK we plan to provide some convenience on top of CAP's implementation once it is finalized.
I think you can approach CAP via their support channels.

Related

Vert.x WebClient: shared vs. one instance per verticle?

I am using Vert.x as an API gateway to route calls to downstream services.
As of now, I am using a single WebClient instance shared across multiple verticles (injected through Guice).
Does it make sense for each verticle to have its own WebClient? Will it help boost performance? (Each of my gateway instances runs 64 verticles and handles approximately 1000 requests per second.)
What are the pros and cons of each approach, and what is the ideal strategy here? Thanks.
Vert.x is optimized for using a single WebClient per verticle. Sharing a single WebClient instance between threads might work, but it can negatively affect performance and could lead to some code running on the "wrong" event-loop thread, as described by Julien Viet, the lead developer of Vert.x:
So if you share a web client between verticles, then your verticle
might reuse a connection previously open (because of pooling) and you
will get callbacks on the event loop you won't expect. In addition
there is synchronization in the web client that might become contended
when used intensively from different threads.
Additionally, the Vert.x documentation for HttpClient, which is the underlying object used by WebClient, explicitly states not to share it between Vert.x Contexts (each Verticle gets its own Context):
The HttpClient can be used in a Verticle or embedded.
When used in a Verticle, the Verticle should use its own client
instance.
More generally a client should not be shared between different Vert.x
contexts as it can lead to unexpected behavior.
For example a keep-alive connection will call the client handlers on
the context of the request that opened the connection, subsequent
requests will use the same context.
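Following that guidance, each verticle can create its own client in start() so that callbacks always run on that verticle's event loop. A minimal sketch, assuming the Vert.x 3.x WebClient API (the pool size is an arbitrary example value):

```java
import io.vertx.core.AbstractVerticle;
import io.vertx.ext.web.client.WebClient;
import io.vertx.ext.web.client.WebClientOptions;

// Each deployed instance of this verticle owns its own WebClient, created on
// the verticle's own context, so pooled connections never call back onto a
// different verticle's event loop.
public class GatewayVerticle extends AbstractVerticle {

    private WebClient client;

    @Override
    public void start() {
        WebClientOptions options = new WebClientOptions();
        options.setMaxPoolSize(20); // example value; tune per downstream service
        client = WebClient.create(vertx, options);
    }

    @Override
    public void stop() {
        // Release this verticle's pooled connections on undeploy.
        client.close();
    }
}
```

With 64 verticle instances deployed, this yields 64 independent clients; the cost is more total connections, which is usually the right trade for avoiding cross-context callbacks and client-internal contention.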

Can a Windows service contain business logic?

There is a requirement in the project to use a Windows service as a subscriber to RabbitMQ (the message broker).
Once an event is created, the listening Windows service picks it up and processes it; during processing, some important business logic needs to run and the resulting data needs to be stored in a SQL Server DB.
From my perspective, the Windows service should be just a trigger for the business logic: once it has subscribed, whenever an event arrives it reads the event details and performs the business logic through a REST (HTTP-based) service.
Please share your suggestions; they would be very helpful. Thanks in advance.
You can create a Windows application in C# and use the RabbitMQ client NuGet package to consume messages and save them to the DB.
https://www.nuget.org/packages/RabbitMQ.Client

WSO2 CEP bidirectional REST API

I'm using WSO2 CEP 4.1.
I created a receiver to catch some JSON data from my source. I then process this data internally and need to return a response with additional data. The response should go out through the same endpoint the data came in on, i.e. a classic REST API. Is this possible, and how can I do it?
Or do I need a WebSocket (websocket-local) for this purpose?
Hope you are still trying to understand the functionality of WSO2 CEP. Let me explain the basic overview of CEP before addressing your question. The component diagram (not reproduced here) shows what happens under the hood at a high level. I will explain what these components are supposed to do in the context of event processing.
Event receivers :-
Event receivers receive events that are coming to the CEP. WSO2 CEP supports the most common adapter implementations by default. For specific use cases, you can also plug custom adapters. For more information, see Configuring Event Receivers.
Event streams :-
Event streams contain unique sets of attributes of specific types that provide a structure based on which the events processed by the relevant event flow are selected. Event streams are stored as stream definitions in the file system via the data bridge stream definition store.
Event processors :-
Event processor handles actual event processing. It is the core event processing unit of the CEP. It manages different execution plans and processes events based on logic, with the help of different Siddhi queries. Event Processor gets a set of event streams from the Event Stream Manager, processes them using Siddhi engine, and triggers new events on different event streams back to the Event Stream Manager. For more information, see Creating a Standalone Execution Plan.
Event publishers :-
Event publishers publish events to external systems and store data in databases for future analysis. Like the event receivers, this component also has different adapter implementations; the most common ones are available by default in the CEP, and you can implement custom adapters for specific use cases. For more information, see Configuring CEP to Create Alerts.
According to your requirement, you should have an HTTP receiver as well as an HTTP publisher, where the receiver receives a request from a third-party API and hands the message over to the event processors to perform some pre-defined tasks. This may involve several event streams and execution plans. Once processing is done, event publishers can be used to publish the result to the required third-party API, as you pointed out.
Out of the box, CEP provides HTTP receiver and HTTP publisher adapters [1][2] which you can try out. They have some limitations that might not suit your scenario, in which case you would have to implement your own custom HTTP receiver and publisher [3][4].
Since you need to publish responses to different endpoints, you can achieve this by defining the REST API endpoint, user credentials (if required), HTTP verbs, and any other information required to send a message as meta information in the event stream [5]. You can then read that information from the stream itself and push to the desired third-party API.
Do I need websocket (websocket-local) for similar purposes?
It isn't clear what exactly is to be done here. Please raise another question and ask it there.
[1] https://docs.wso2.com/display/CEP410/HTTP+Event+Receiver
[2] https://docs.wso2.com/display/CEP410/HTTP+Event+Publisher
[3] https://docs.wso2.com/display/CEP410/Building+Custom+Event+Receivers
[4] https://docs.wso2.com/display/CEP410/Building+Custom+Event+Publishers
[5] https://docs.wso2.com/display/CEP410/Understanding+Event+Streams
The feature you are looking for doesn't come out of the box with CEP. However, you can try something similar to the below:
1. Implement a REST API, probably using Apache CXF, since CXF dependencies are present in WSO2 servers by default. You can follow this guide if you are using a swagger-based approach to develop the REST API.
2. Within that custom REST implementation, read the HTTP request, send it to CEP (step 3), wait for an output from CEP (step 4), and then send those details back as the HTTP response inside the method that represents your operation.
3. To send an event to CEP you can use a WSO2 Event receiver. Create a receiver on the CEP side and then send events to it using the DataPublisher client. Make sure the stream definition you pass to DataPublisher.publish() matches the one set in the CEP receiver, and that the object array you send adheres to that definition. You might also need to set truststore and keystore parameters here.
4. After publishing your events successfully, you need to block the request thread until you receive a response from CEP. You can use a Java object like CountDownLatch for this purpose.
5. To receive a response, consume events through the EventStreamService. For this, implement a WSO2EventConsumer and subscribe it to the EventStreamService. After successfully subscribing, events arriving on the stream id mentioned in your event consumer will be forwarded to its receive method. From there you can extract the results, unblock the initial request thread, and return those results. To access the EventStreamService from within your web app you can use the code snippet below.
EventStreamService eventStreamService = (EventStreamService) PrivilegedCarbonContext.getThreadLocalCarbonContext().getOSGiService(EventStreamService.class, null);
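The block-and-release hand-off in steps 4-5 can be sketched with plain java.util.concurrent types. ResponseBridge and the correlation-id scheme below are hypothetical; the real code would publish via the DataPublisher client and call complete() from WSO2EventConsumer's receive method:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Hypothetical bridge between the REST request thread (which blocks) and the
// CEP consumer thread (which delivers the processed result).
public class ResponseBridge {

    private static final Map<String, CountDownLatch> latches = new ConcurrentHashMap<>();
    private static final Map<String, Object[]> results = new ConcurrentHashMap<>();

    // Called by the REST resource right after publishing the event to CEP.
    public static Object[] awaitResult(String correlationId, long timeoutMs)
            throws InterruptedException {
        CountDownLatch latch = new CountDownLatch(1);
        latches.put(correlationId, latch);
        try {
            // The result may already have arrived before the latch was registered.
            Object[] early = results.remove(correlationId);
            if (early != null) {
                return early;
            }
            if (!latch.await(timeoutMs, TimeUnit.MILLISECONDS)) {
                return null; // timed out waiting for CEP
            }
            return results.remove(correlationId);
        } finally {
            latches.remove(correlationId);
        }
    }

    // Called from the event consumer's receive() method when CEP emits a result.
    public static void complete(String correlationId, Object[] payload) {
        results.put(correlationId, payload);
        CountDownLatch latch = latches.get(correlationId);
        if (latch != null) {
            latch.countDown();
        }
    }
}
```

The correlation id would travel with the event through the stream's meta attributes so the consumer knows which waiting request to release.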
Hope this helped.

How to make a Pub/Sub service with CometD and Jetty

I need to create a Java 8-based service that provides a CometD channel that multiple clients can subscribe to. The idea is that the server can send notifications to the clients when certain events occur.
I am using Jetty 9 as my servlet container (necessary to meet the requirements for my group). I have been reading CometD documentation and looking for some kind of example that I can use. The documentation is extensive but isn't helping (lack of context), and I haven't been able to find a decent example of what I am trying to do.
Can someone provide a simple example of creating a publication mechanism, in Java, that can be used with Jetty? Failing that, can someone point me to an example of how to do it?
Please advise.
The CometD project has an outstanding task to bring back the tutorials.
This particular question was answered by the server-side stock price tutorial; you can find its source here while we work on bringing it back online as part of the documentation.
Glossing over a few details, the service you need to write is similar to the tutorial's stock price service: upon receiving an external event, the service should broadcast the event to subscribers.
@Service
public class StockPriceService implements StockPriceEmitter.Listener
{
    @Inject
    private BayeuxServer bayeuxServer;

    @Session
    private LocalSession sender;

    public void onUpdates(List<StockPriceEmitter.Update> updates)
    {
        for (StockPriceEmitter.Update update : updates)
        {
            // Create the channel name using the stock symbol.
            String channelName = "/stock/" + update.getSymbol().toLowerCase(Locale.ENGLISH);

            // Initialize the channel, making it persistent and lazy.
            bayeuxServer.createChannelIfAbsent(channelName, new ConfigurableServerChannel.Initializer()
            {
                public void configureChannel(ConfigurableServerChannel channel)
                {
                    channel.setPersistent(true);
                    channel.setLazy(true);
                }
            });

            // Convert the Update business object to a CometD-friendly format.
            Map<String, Object> data = new HashMap<>(4);
            data.put("symbol", update.getSymbol());
            data.put("oldValue", update.getOldValue());
            data.put("newValue", update.getNewValue());

            // Publish to all subscribers.
            ServerChannel channel = bayeuxServer.getChannel(channelName);
            channel.publish(sender, data);
        }
    }
}
The class StockPriceEmitter is the source of your external events; it publishes them to StockPriceEmitter.Listener in the form of StockPriceEmitter.Update events.
How the external events are relayed to the CometD server is a detail that StockPriceEmitter hides: it could be done via JMS messages, by polling an external REST service, via a custom network protocol, by polling a database, etc.
The important thing is that when the external events arrive, StockPriceService.onUpdates(...) is called; there you can convert the events into a CometD-friendly JSON format and publish them to the CometD channel.
Publishing to the CometD channel, in turn, will send the message to all subscribers for that channel, typically remote clients such as browsers.
The CometD channel has been made lazy because laziness is a way to avoid bombarding clients with a very frequent update rate (say, higher than 2-4 updates per second).
You will need to decide on the laziness of the channel based on your particular use case.
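On the client side, subscribing to one of these channels might look roughly like the sketch below. It assumes the CometD Java client and Jetty's HttpClient are on the classpath; the URL and the /stock/goog channel name are made-up examples:

```java
import org.cometd.client.BayeuxClient;
import org.cometd.client.transport.LongPollingTransport;
import org.eclipse.jetty.client.HttpClient;

public class StockSubscriber {
    public static void main(String[] args) throws Exception {
        HttpClient httpClient = new HttpClient();
        httpClient.start();

        // Hypothetical URL; point it at your CometD servlet mapping.
        BayeuxClient client = new BayeuxClient("http://localhost:8080/cometd",
                new LongPollingTransport(null, httpClient));

        client.handshake();
        client.waitFor(5000, BayeuxClient.State.CONNECTED);

        // Receive the (lazy, hence possibly batched) updates published
        // by StockPriceService.
        client.getChannel("/stock/goog").subscribe((channel, message) ->
                System.out.println("update: " + message.getDataAsMap()));
    }
}
```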

Stop Spring Cloud Stream #StreamListener from listening when target system is down

I have an application that gets messages from Kafka and calls a target system to update a legacy Oracle DB.
I want to enable a scenario where, if the target system is down, the messages are left on the Kafka bus and not processed for a given period of time. I was thinking of some circuit-breaker, Hystrix-based solution, but I can't find any mechanism to tell Spring Cloud Stream to "stop" listening for events. The only other alternative I can think of is, when the circuit breaker is open, to transfer these messages to an error/reprocess topic, but that sounds like an anti-pattern to me. I should be able to simply pause the system from handling events; that's the whole advantage of pub/sub in a microservices app.
Any help would be appreciated.
One solution is to autowire the application context:
@Autowired
private ConfigurableApplicationContext context;
You can then stop() and start() the context.
You should not call stop() on the thread that invokes the @StreamListener, though, or the stop will be delayed (because the container waits for that thread to exit, for 5 seconds by default, with the Rabbit binder at least).
Of course, you will need some kind of out-of-band mechanism to restart - perhaps JMX or a separate application context listening on some kind of control topic.