Can Spark 3.1 push metrics to Prometheus? Is there a push handler?

I am investigating whether Spark 3.1 and Prometheus support a push mechanism between them.
I know it is possible to pull, but I would like to send the metrics from Spark to Prometheus.
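For context, Spark 3.x's built-in Prometheus support (the PrometheusServlet sink) is pull-only. A common way to get push behaviour is a third-party sink that writes to a Prometheus Pushgateway, e.g. the Banzai Cloud spark-metrics project. A sketch of `conf/metrics.properties` under that assumption (the sink class name and the Pushgateway host below come from that third-party project, not from Spark itself):

```properties
# conf/metrics.properties — hypothetical setup with a third-party push sink
*.sink.prometheus.class=com.banzaicloud.spark.metrics.sink.PrometheusSink
# Address of a Prometheus Pushgateway, which Prometheus then scrapes
*.sink.prometheus.pushgateway-address=pushgateway.example.com:9091
*.sink.prometheus.pushgateway-address-protocol=http
```

The sink jar has to be on the driver/executor classpath (e.g. via `--jars`); Prometheus itself then scrapes the Pushgateway rather than Spark.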

Related

Official Grafana MQTT integration

I have a data source with an MQTT endpoint.
There is a repo on GitHub: https://github.com/grafana/mqtt-datasource but this project is not visible in the list at https://grafana.com/grafana/plugins/?utm_source=grafana_add_ds
The mqtt-datasource GitHub project had its last major update two years ago. So I have questions:
Is there any stable MQTT integration for Grafana?
Do you think the project https://github.com/grafana/mqtt-datasource is ready for production use? Is it possible to use it with the cloud version of Grafana?

Apache Kafka patch release process

How does Kafka release patch updates?
How do users find out about Kafka patch updates?
Kafka is typically distributed as a zip/tar archive containing the binaries used to start/stop/manage Kafka. You may want to:
Subscribe to https://kafka.apache.org/downloads by generating a feed for it.
Subscribe to any feeds that give you updates
Write a script that periodically checks https://downloads.apache.org/kafka/ for new Kafka releases and notifies you or downloads them.
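A minimal sketch of such a checker: it parses the version directories (e.g. `3.1.0/`) out of the Apache download listing page and reports the newest one. The regex assumes the listing's current HTML shape, so treat it as illustrative rather than robust:

```python
import re
import urllib.request

KAFKA_DOWNLOADS = "https://downloads.apache.org/kafka/"


def parse_versions(index_html: str) -> list:
    """Extract release directories like '3.1.0/' from the listing page HTML."""
    found = {tuple(int(g) for g in m.groups())
             for m in re.finditer(r'href="(\d+)\.(\d+)\.(\d+)/"', index_html)}
    return sorted(found)


def latest_version(index_html: str) -> str:
    """Return the highest major.minor.patch version found in the listing."""
    major, minor, patch = parse_versions(index_html)[-1]
    return f"{major}.{minor}.{patch}"


def fetch_latest() -> str:
    """Download the listing page and report the newest release."""
    html = urllib.request.urlopen(KAFKA_DOWNLOADS).read().decode("utf-8")
    return latest_version(html)
```

Running `fetch_latest()` from a cron job and comparing the result against the last version you saw gives a simple release notifier.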
Kafka versioning typically follows the major.minor.patch format.
Every time there is a new Kafka release, we need to download the latest archive, reuse the old configuration files (making changes if required), and start Kafka using the new binaries. The upgrade process is fully documented in the Upgrading section at https://kafka.apache.org/documentation
For production environments, we have several options:
1. Using a managed Kafka service (e.g. AWS, Azure, Confluent)
In this case, we need not worry about patching and security updates to Kafka because they are taken care of by the service provider. For AWS, you will typically get notifications in the Console about when your Kafka update is scheduled.
It is easy to get started with a managed Kafka service for production environments.
2. Using self-hosted Kafka in Kubernetes (e.g. using Strimzi)
If you are running Kafka in a Kubernetes environment, you can use the Strimzi operator and helm upgrade to move to the version you require. First refresh the chart information from the repository using helm repo update.
Managed services and Kubernetes operators make management easy; manually administering Kafka clusters is comparatively difficult.
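The helm steps above can be sketched as follows (the release name `my-strimzi` and the chart version are placeholders, not values from the question):

```shell
# Refresh the local copy of the Strimzi chart repository
helm repo add strimzi https://strimzi.io/charts/   # no-op if already added
helm repo update

# Upgrade the operator release to the desired chart version
helm upgrade my-strimzi strimzi/strimzi-kafka-operator --version 0.38.0
```

After the operator is upgraded, the Kafka version of the cluster itself is changed by editing the `Kafka` custom resource, which the operator then rolls out.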

Apache Ambari local repository (Cloudera)

I have a production cluster using Ambari from Hortonworks. Cloudera has now blocked all access to the HDP repository, because a paid support license is needed.
This hit us really hard because we have a big infrastructure using Ambari, Kafka, and Storm.
I am trying to build Ambari from source, but I think a local HDP repository is needed.
Does anyone know how to build a repository starting from the Kafka and Storm sources?

Kafka Mirror Maker With Azure Event Hub

I am working on a hybrid cloud solution for moving data from on premises to the cloud, and I want to run MirrorMaker between on-premises Apache Kafka and Azure Event Hubs in the cloud.
Is a specific Kafka version required, if this is possible at all?
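For reference, Event Hubs exposes a Kafka-compatible endpoint on port 9093 speaking Kafka protocol 1.0+, so a reasonably recent Kafka (2.x or later) is the usual choice for MirrorMaker. A sketch of the target-side client config MirrorMaker would use; the namespace name is a placeholder and the connection string is deliberately truncated:

```properties
# mirror-eventhub.config — hypothetical target config for MirrorMaker
bootstrap.servers=mynamespace.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# Event Hubs authenticates with the namespace connection string as the password
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;...";
```

The on-premises side keeps its normal consumer config; only the Event Hubs side needs the SASL_SSL settings above.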

Any contrib package for Apache Beam where I can contribute a Dataflow pipeline?

I made a Dataflow pipeline that connects Pub/Sub to BigQuery. Any ideas on where the right place would be to contribute this upstream in Apache Beam?