How to connect Esper CEP engine with DDS - complex-event-processing

I believe I am missing something related to the DDS concept. My idea is to use an EsperIO adapter, data flow, or plug-in to insert incoming events from DDS into an Esper engine, but I can't see it clearly.
Can somebody help? (Thanks in advance.)

The step-by-step would be:
1) receive event data from DDS, i.e. via a Java DataReader
2) build an event object that Esper can understand; a JavaBean-style class, for example
3) send the event object into Esper
There is no need to build an adapter or use EsperIO. The API to feed events into Esper is really simple. For API code, see http://www.espertech.com/esper/longer-case-study/ or the Esper docs.
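Steps 2 and 3 above can be sketched as follows. The class and field names here are illustrative, not from the original post, and the send call in the note below assumes the pre-Esper-8 API; check the docs linked above for your version:

```java
// Illustrative JavaBean-style event class (step 2); the names are made up for this sketch.
class DdsTickEvent {
    private final String symbol;
    private final double price;

    DdsTickEvent(String symbol, double price) {
        this.symbol = symbol;
        this.price = price;
    }

    // JavaBean getters let Esper resolve properties by name in EPL,
    // e.g. "select symbol, price from DdsTickEvent"
    public String getSymbol() { return symbol; }
    public double getPrice() { return price; }
}
```

Step 3 is then a single call from the DDS DataReader callback, e.g. `epService.getEPRuntime().sendEvent(new DdsTickEvent("ACME", 12.5));` in Esper 7 and earlier (Esper 8 moved this to a `sendEventBean` method); the exact API is in the docs linked above.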

Related

Axon Framework - Initiate saga from a Non-Aggregate (eventGateway)

I am very new to Axon and have the following query. I am running two microservices, Payment and Order, using Cassandra as an event store and Kafka. From the Payment microservice I am dispatching an event via the eventGateway:
@Component
@ProcessingGroup("OrderProcessor")
public class OrderEventHandler {
    @Inject
    private EventGateway eventGateway;

    public void createOrder() {
        OrderCreatedEvent orderCreatedEvent = new OrderCreatedEvent("orderId",
                100,
                "CREATED");
        eventGateway.publish(orderCreatedEvent);
    }
}
Also, I have configured the sagaStore and repository components
SagaViewRepository
@Repository
public interface SagaViewRepository extends CassandraRepository<SagaView, String> {
}
SagaStore
public CassandraSagaStore sagaStoreStore() {
    return new CassandraSagaStore(...);
}
How do I listen for the above event (OrderCreatedEvent) in the saga event listener present in the Order microservice? Below is the implementation:
@Saga(sagaStore = "sagaStoreStore")
public class OrderManagementSaga {
    @Inject
    private transient CommandGateway commandGateway;

    @StartSaga
    @SagaEventHandler(associationProperty = "orderId")
    public void handle(OrderCreatedEvent orderCreatedEvent) {
        // Saga invoked
    }
}
Any hints are much appreciated.
Thank you.
In virtually any project, I would not immediately go for the microservices route.
If you are taking that route, it means you are stuck in infrastructure work, like how to send a message from one service to another, instead of providing business functionality.
That doesn't mean I would not use messaging in your application already. It is the use of commands, events and queries that allows you to change the distance of your message buses to whatever length.
Anyhow, this is more so a recommendation than an answer to your question.
To be honest, I am unsure what you are looking for. You already stated you are using Cassandra (which, by the way, is not supported by Axon in any way) and Kafka. That makes Kafka your means to publish events between services, right? That is what Axon provides the Kafka Extension for, for example.
Do note that taking this route will require you to define separate pieces of infrastructure for command, event and query dispatching, as well as event storage. Furthermore, as already briefly pointed out, Axon doesn't view Cassandra as an optimal event store. If you want to know why, I'd recommend you take a look at this presentation.
Instead of going for "segregated infrastructure customization", I would recommend giving Axon Server a try. It is a one-stop shop to distribute commands, events and queries, as well as a fully optimized event store. One thing is for certain: you wouldn't have to really think about "how to dispatch an event from your Payment service to your Order service". It would simply do it, as long as your Axon application is connected to Axon Server (which is also a breeze). If you want to learn more about Axon Server, Axon's Reference Guide has a dedicated section on it you can read here.
If you feel Kafka is the way to go, that's also fine of course. It will mean more work for you or your team, so you will have to account for those man-hours. For more info on how to set up Axon's Kafka Extension for event distribution, you can check this Reference Guide page. Note that Kafka will only bring you event distribution. You are thus still left with solving the problems of command distribution, query distribution and event storage.
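If you do go the Kafka route, the extension's Spring Boot support is driven by properties along these lines. The property names and values below are a sketch from the Kafka Extension's auto-configuration, not verified against your version; double-check them against the Reference Guide page mentioned above:

```properties
# Illustrative values only; verify property names in the Axon Kafka Extension docs
axon.kafka.bootstrap-servers=localhost:9092
axon.kafka.default-topic=axon-events
```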

Example of mongo-scala-driver transaction

Mongodb 4 added multi document transaction support.
Mongo-scala-driver (http://mongodb.github.io/mongo-scala-driver/2.4/) supports MongoDB 4, but I cannot find any example of how to use transactions with Scala.
Can anybody provide the link or code snippet?
P.S.: There is a synchronous transaction example on the official MongoDB site, but I need an example of an async, non-blocking transaction in Scala.
There is an example in the transaction and drivers documentation under the Scala tab.
There are a few extra caveats/gotchas for Scala that are covered in the example code:
Each observable within the transaction must be passed the ClientSession
Each observable must be subscribed to in order for anything to happen (they are cold observables).
Transactions can be retried, if they meet the criteria. An example is provided in the code.
There is no Observable abstraction as of version 2.4.0 but there are plans to simplify the API in the future.

Conditional routing in Apache NiFi

I'm using NiFi to get data from an Oracle database and put some of this data in Kafka (using the processor PutKafka).
For example: route the data only if the attribute "id" contains "aaabb".
Is that possible in Apache NiFi? How can I do it?
This should definitely be possible, the flow might be something like this...
1) ExecuteSQL or QueryDatabaseTable to get the data from the database, these produce Avro
2) ConvertAvroToJSON processor to convert the Avro to JSON
3) EvaluateJsonPath to extract the id field into an attribute
4) RouteOnAttribute to route flow files where the id attribute contains "aaabbb"
5) PutKafka to deliver any of the matching results from RouteOnAttribute
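For step 4, RouteOnAttribute matches using the NiFi Expression Language. A user-defined routing property on the processor could look like this (the property name "matched" is arbitrary, and the expression assumes the id was extracted into an attribute named `id` in step 3):

```
matched = ${id:contains('aaabbb')}
```

Flow files satisfying the expression are routed to the "matched" relationship, which you would then connect to PutKafka.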
To add on to Bryan's example flow, I wanted to point you to some great documentation that should help introduce you to Apache NiFi.
Firstly, I would suggest checking out the NiFi documentation. It is very good and should help a lot. In addition to providing details on each of the processors Bryan mentioned it also has general documentation for every type of user.
For a basic introduction to build a NiFi flow check out this video.
For example templates, check out this repo. It has an Excel file at its root level with a description and list of processors for each template.

CDA HL7V3 acknowledgement

I created a channel in Mirth that receives CDA messages in HL7v3 format.
I'm able to parse the message and extract all the data I need.
My question is: how do I create an acknowledgement to send back?
I found out that there is a message called MCCI_MT000200UV01 that I need to implement, but I can't find a good explanation and/or examples.
I have been working a long time with HL7v2, where the acknowledgement is very simple.
I can't find a way to implement this in HL7v3 format.
Thanks in advance for your help
I guess you are talking about a generic Accept Acknowledgement message, which is MCCI_IN000002UV02 (according to the HL7v3 NE2014). If I were you, the first thing I'd do is download the HL7v3 Normative Edition that matches the year of your inbound message used to transport the CDA document (unless it's HL7v2). Then I'd go to HL7v3NE > Specification Infrastructure > Messaging > Transmission Infrastructure > Generic Message Transmission and find the Accept Ack interaction. There is a related XML schema that allows you to build an XML template for the v3 ACK (a tool like XMLSpy does that by default).
Since the ACKGenerator does not support HL7v3, the next step is to create a code-template function that builds the v3 ACK from the template you acquired in the previous step.
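For orientation, a skeleton of such a v3 ACK might look roughly like this. This is a hand-written sketch of the generic transmission wrapper, not schema-generated; derive the real template from the Normative Edition's XSD as described above rather than trusting these element names and OIDs:

```xml
<!-- Illustrative skeleton only; generate the actual template from the MCCI_IN000002UV02 XSD -->
<MCCI_IN000002UV02 xmlns="urn:hl7-org:v3">
  <id root="..."/>                      <!-- unique id of this ACK message -->
  <creationTime value="20240101120000"/>
  <interactionId extension="MCCI_IN000002UV02"/>
  <processingCode code="P"/>
  <acceptAckCode code="NE"/>
  <receiver typeCode="RCV">...</receiver>
  <sender typeCode="SND">...</sender>
  <acknowledgement typeCode="CA">       <!-- CA = accept acknowledgement -->
    <targetMessage>
      <id root="..."/>                  <!-- id of the inbound message being acknowledged -->
    </targetMessage>
  </acknowledgement>
</MCCI_IN000002UV02>
```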
(P.S.: The whole procedure, with samples, is explained in the "Unofficial Mirth Connect Developer's Guide" available at mirthconnect.shamilpublishing.com.)

Apache Camel and Drools Fusion Integration

Has anyone tried integrating Apache Camel with Drools Fusion, or just Drools?
Following is my use case.
Get data from an external service using REST.
Filter the data (using rules defined in Drools.)
The data from the external service could also be a stream of information (e.g., a Twitter feed, or the real-time location of a user).
Any help or pointers would be appreciated.
Thanks.
Drools has a Camel component. Using it is not much different from using any other Camel component.
source: https://github.com/droolsjbpm/droolsjbpm-integration/tree/master/drools-camel
binary (in the droolsjbpm-integration bundle): http://www.jboss.org/drools/downloads.html
The only thing to be "aware" of is that Drools can treat Camel messages as:
commands
regular facts
as-is objects, and re-route them
Some articles:
http://blog.athico.com/search?q=camel
Documentation unfortunately only describes the "command" (1) use case:
http://docs.jboss.org/drools/release/5.4.0.Beta2/droolsjbpm-integration-docs/html/ch01.html
Some test cases you can use as examples for the use cases (2) and (3) above:
https://github.com/droolsjbpm/droolsjbpm-integration/tree/master/drools-camel/src/test/java/org/drools/camel/component
Hope this helps.