Salesforce URL for a Kafka provider

I'm trying to determine the URL for Salesforce as it relates to setting up a Kafka provider. I'm using a Bayeux client that needs a URL to Salesforce for the connection:
new KafkaOptions(new Uri(""));
Thanks.

It seems you are using this Kafka library, which has nothing to do with Salesforce; you give it a string of your Kafka broker addresses, as shown in the examples there.
Note: that library doesn't look actively maintained, and Kafka is not an HTTP protocol, so putting http://server:9092 doesn't make sense.
You might want to check out confluent-kafka-dotnet instead.

Related

Publish to Apache Kafka topic from Angular front end

I need to create a solution that receives events from a web/desktop application that runs on kiosks. There are hundreds of kiosks spread across the country, and each one generates automatic events from time to time as well as events when something happens.
Although it is a locked-down desktop application, it is built in Angular v8; it runs in a webview.
I was researching scalable and reliable solutions, and Apache Kafka seems to be a great fit. I know there are clients for Node.js, but I couldn't find any option for Angular. Angular runs in the browser, so it must communicate with the backend over HTTP/S.
In the end, I realized the best way to send events from Angular is to create an API that receives messages on an HTTP/S endpoint and publishes them to a Kafka topic. Or is there any adapter for Kafka that exposes topics as REST?
I suppose this approach is much faster than storing the messages in a database. Is that correct?
Thanks in advance.
this approach is much faster than storing the messages in a database. Is that correct?
It can be slower. Kafka is asynchronous, so don't expect to get a response in the same time frame as a database read/write. (Again, this would require some API, and it also depends largely on the database used.)
is there any adapter for Kafka that exposes topics as REST?
Yes, the Confluent REST Proxy is an Apache 2.0-licensed product.
There is also a project, divolte/divolte-collector, for collecting click data and other browser-driven events.
Otherwise, as you've discovered, create your own API in any language you are comfortable with, and have it use a Kafka producer client.
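As a rough sketch of that last option (one possible approach, not the only one), such an API could look like this in TypeScript on Node.js, assuming the express and kafkajs packages; the broker address, topic name, and endpoint path are placeholders:

import express from "express";
import { Kafka } from "kafkajs";

const kafka = new Kafka({ clientId: "kiosk-event-api", brokers: ["broker1:9092"] });
const producer = kafka.producer();

const app = express();
app.use(express.json());

// The kiosk (Angular webview) POSTs its event as JSON to this endpoint.
app.post("/events", async (req, res) => {
  await producer.send({
    topic: "kiosk-events",                           // hypothetical topic name
    messages: [{ value: JSON.stringify(req.body) }],
  });
  res.sendStatus(202); // accepted; Kafka takes over durability from here
});

producer.connect().then(() => app.listen(3000));

The 202 response reflects the asynchronous nature mentioned above: the broker acknowledges the write on its own schedule, so the HTTP call is not a good place to wait for downstream processing.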

Integrating an MQTT broker inside a server

I am learning about MQTT brokers, and I have a question I cannot answer. Is it possible to integrate an MQTT broker inside a server that acts as a client in a client/server architecture? The reason I would need this is that the client retrieves data from an API.
I have tried to depict what I mean. If it is not possible, how would one approach it when the data from the API is needed?
There is no reason for the broker to be part of the client.
The client receives the data and then publishes it as a message to a separate broker, where subscribers receive the message. There is no benefit to combining the two.
Building adapters like this is common practice (it's one of the reasons tools like Node-RED were created).
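As a rough sketch (assuming Node.js 18+ with the mqtt package; the API URL, broker address, topic, and polling interval are all placeholders), such an adapter could look like this:

import mqtt from "mqtt";

// Connect to the separate MQTT broker (e.g. Mosquitto) as an ordinary client.
const client = mqtt.connect("mqtt://broker.example.com:1883");

async function poll() {
  // Fetch the data this "server" needs from the upstream API.
  const response = await fetch("https://api.example.com/data");
  const data = await response.json();

  // Publish it so that any interested subscribers receive it.
  client.publish("example/api-data", JSON.stringify(data));
}

client.on("connect", () => {
  setInterval(poll, 60_000); // poll the API once a minute
});

The adapter is just another MQTT client; the broker itself stays a standalone piece of infrastructure.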

Best way to write to Kafka from a web site?

I know how to get data into Kafka either via some file agent or programmatically using any of the clients, but I'm speaking from an architectural point of view...
It can't just be collecting HTTP logs.
I'm assuming that when someone clicks a link or does something of interest, we can use some kind of Ajax/JavaScript call to some microservice to capture the extra info that we want? That's not always "reliable" per se, but do we care?
Or, while the given "action" posts back to the server, do we simultaneously write to Kafka and perform the other action?
It's not clear from your question whether you are trying to collect all the clickstream logs from a set of web servers, or trying to selectively publish some data to Kafka from your web app, so I will answer both.
The easiest way to collect every web click is to configure your web servers to use Syslog (see http://archive.oreilly.com/pub/a/sysadmin/2006/10/12/httpd-syslog.html) and configure your Syslog server to send data to Kafka (see https://www.balabit.com/documents/syslog-ng-ose-latest-guides/en/syslog-ng-ose-guide-admin/html/configuring-destinations-kafka.html). Alternatively, there are some more advanced features available in this Kafka connector for Syslog-NG (see https://github.com/jcustenborder/kafka-connect-syslog). You can also write httpd logs to a file and use a Kafka file connector to publish to Kafka (see https://docs.confluent.io/current/connect/connect-filestream/filestream_connector.html).
If you just want to enable your apps to send certain log data to Kafka directly, you can use the Kafka REST Proxy and publish using a simple HTTP POST from either your client-side JavaScript or your server-side logic (see https://docs.confluent.io/current/kafka-rest/docs/index.html).
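As an illustration of the client-side option, a minimal sketch of posting a click event to the REST Proxy from browser TypeScript could look like the following; the proxy host, port, topic name, and event shape are placeholders, and the media type shown is the REST Proxy v2 JSON format:

// Send one click event to Kafka via the Confluent REST Proxy (v2 API).
async function publishClick(event: { page: string; element: string }) {
  await fetch("http://rest-proxy.example.com:8082/topics/web-clicks", {
    method: "POST",
    headers: { "Content-Type": "application/vnd.kafka.json.v2+json" },
    body: JSON.stringify({
      records: [{ value: { ...event, ts: Date.now() } }],
    }),
  });
}

// e.g. wire it up to clicks anywhere on the page
document.addEventListener("click", () => {
  publishClick({ page: location.pathname, element: "link" }).catch(() => {
    // fire-and-forget: losing the odd click may be acceptable, per the question
  });
});

This is the "not always reliable, but do we care?" trade-off from the question: the browser call is best-effort, while the server-side path gives you more control over delivery.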

How to process a web service XML message in Mirth

How do I process a web service XML message in Mirth Connect 3.x?
If I understand your question correctly, you are asking how to configure Mirth to act as a web server. It's actually easy and hard at the same time.
The easy way: create a new channel and configure the source connector as a Web Service Listener. Deploy the channel, and you have a web server waiting for SOAP messages to be sent to the configured IP and port. But the structure of these SOAP messages is governed by the Mirth WSDL at localhost:8081/services/Mirth?wsdl.
If you want the SOAP message structure to be different, then you will have to deep-dive into creating your own Java class and overriding the default web service methods. There is no single answer for that; it is a completely separate topic.
I assume you are asking how to consume an XML web service message in Mirth?
If you are specifically receiving SOAP, you need to set a Web Service Listener as your source connector (as the previous answer said, you will have the URL).
Go to your transformer and type the following code:
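// Writes the raw inbound message to the Mirth Connect server log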
logger.info(connectorMessage.getRawData());
Once you do this, you can see the data you received inside Mirth in the log area.

Can Financial Information eXchange (FIX) connect to any broker?

Can FIX (Financial Information eXchange) be used to connect to any broker or can it only be used to connect to brokers that offer a FIX API?
FIX can only be used to connect to a broker that offers a FIX API. The majority of brokers should offer a FIX API, but some will have only a proprietary API.
can it only be used to connect to brokers that offer a FIX API
No, you don't need the broker's own API to connect to the broker. You can use your own API if you want, or any of the commercial APIs available. But before using an API, you have to agree on a FIX spec, which determines the message structures you will be exchanging, the version, and of course the technical details of the server, ports, etc. And every broker should be able to handle FIX messages, as FIX is the industry-standard messaging that financial bodies use.