I've set up a Divolte clickstream script on a page, and the resulting events are consumed by Kafka.
The setup works when I host the Divolte scripts over HTTP, but when I source the scripts via HTTPS and set up a reverse proxy in .htaccess to forward requests to an HTTP URL, Kafka doesn't consume any content.
Any ideas on how to fix this?
I am using Pion Ion (https://pionion.github.io/) as my streaming server, and I run it via its Docker service. Currently I can only access it over HTTP, but I want to make requests over HTTPS (e.g. https://ip-address:5551).
Is there a configuration file for doing this? I need help here.
The manual says this about the HTTP URL value of an HTTP Listener:
"Displays the generated HTTP URL for the HTTP Listener. This is not an actual
configurable setting, but is instead displayed for copy/paste convenience. Note
that the host in the URL will be the same as the host you used to connect to
the Administrator. The actual host that connecting clients use may be different
due to differing networking environments."
When I have used this feature in the past, its value has always begun with "http://localhost:", which would be great, except this time it is auto-generating "http://'domainName':${Incoming_Pathology_Source_Port}/${Incoming_Pathology_Source_BaseContextPath}/".
For the first time, we are deploying Mirth inside a Kubernetes cluster, i.e. a 'differing networking environment' (nginx accepts HTTPS and we want it to pass the messages on to Mirth as HTTP).
Is there any way I can take control of the URL, or must I change the configuration of the cluster in some way?
All help/suggestions welcome.
I'm using Kafka Connect to pull data from different places (MySQL, MongoDB, etc.) and send it to Elasticsearch.
I would now like to ingest data whose origin is a webhook. Can I configure some Kafka URL and send HTTP POST requests to it? What is a good practice for sending HTTP POST requests to Kafka?
Confluent maintains a source-available Kafka REST Proxy (be sure to read the project's Confluent Community License). This would allow you to "send POST requests" to something that forwards the data to Kafka. Outside of this, you can write a simple HTTP endpoint of your own that does the same.
Personally, I have used NiFi's ListenHTTP processor to accept webhooks, then parse, route, filter, etc., before handing the result to a PublishKafka processor.
Otherwise, "pulling data" isn't really a webhook pattern, and there are a handful of "Kafka Connect HTTP" source projects on GitHub.
I use Confluent Kafka REST Proxy to send messages to Apache Kafka.
I set up basic authentication on the REST Proxy, and whenever I submit an HTTP request to the proxy, I get an HTTP 403 error with the message "!role".
The proxy requires Zookeeper, Kafka and Schema Registry to be running. I didn't configure any security on these services.
Without authentication, the proxy works and delivers messages to Kafka successfully.
How do I troubleshoot this problem? I have spent multiple hours on it and still can't fix it.
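For reference, the kind of request I am submitting looks roughly like this, sketched with Python's requests library (host, port, topic and credentials are placeholders):

    # Produce request against the REST Proxy v2 API with basic auth.
    import json
    import requests

    resp = requests.post(
        "http://rest-proxy-host:8082/topics/test-topic",
        auth=("myuser", "mypassword"),  # basic auth credentials
        headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
        data=json.dumps({"records": [{"value": {"hello": "world"}}]}),
    )
    print(resp.status_code, resp.text)  # 403 and "!role" in my case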
Check the following:
Does the firewall allow the service or port?
Is any antivirus software blocking the service or port?
Have the Kafka and Confluent folders, and their respective log directories, been given the right permissions for the kafka user?
I'm working on a Windows 8.1 Store app using JavaScript.
As per the requirement, we are serving the HTML files with the help of a local HTTP server that uses Windows.Networking.Sockets.StreamSocketListener.
Here is the example of the HTTP server I'm using, and it works fine for HTTP requests.
If I change the source to HTTPS, the local server receives the request, but the InputStream is encrypted.
Is there a way to decrypt the InputStream and read it, perhaps by using an SSL certificate?
The problem is that x-ms-webview in a Windows 8.1 app requires an HTTPS source to perform script notification, i.e. the MSWebViewScriptNotify event.
As we are not using any external website to provide HTTPS, how do we make this local server accept HTTPS requests and serve the files?