How to set an "HLLSketchMerge" metric in Apache Superset with Druid

It seems like Superset supports only a closed list of aggregation functions. What can I do if I want to use HLL metrics, such as HLLSketchMerge?

This can be done at the data source configuration level: you can define a custom metric on the Druid data source rather than relying only on the built-in aggregation list.
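For illustration, here is a minimal sketch of what such a custom metric can look like, assuming the druid-datasketches extension is loaded; the column, datasource, and broker address (user_id_sketch, events, localhost:8082) are placeholders. The JSON aggregator is the kind of definition you would attach to a custom metric on the Druid data source, and the HTTP call shows the roughly equivalent Druid SQL function:

```python
import json
import requests  # assumed available

# Druid native aggregator for merging HLL sketches (druid-datasketches extension).
# This JSON is the kind of definition a custom "unique_users" metric would use.
hll_metric = {
    "type": "HLLSketchMerge",
    "name": "unique_users",
    "fieldName": "user_id_sketch",  # column holding pre-built HLL sketches
    "lgK": 12,                      # optional sketch-size parameter
}
print(json.dumps(hll_metric, indent=2))

# Roughly equivalent query through the Druid SQL endpoint on the broker.
resp = requests.post(
    "http://localhost:8082/druid/v2/sql/",
    json={"query": "SELECT APPROX_COUNT_DISTINCT_DS_HLL(user_id_sketch) AS unique_users FROM events"},
)
print(resp.json())
```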

Related

Is it possible to use Grafana to write query results from SQL databases (Postgres/MySQL) into InfluxDB?

I would like to query several different databases using Grafana, and in order to keep a metrics history I would like to store the results in InfluxDB.
I know that I can write my own little process that runs the queries and sends the results to InfluxDB, but I wonder whether it's possible with Grafana alone?
You won't be able to use Grafana to do that. Grafana isn't really an appropriate tool for transforming or writing data. In any case, its query engine generally works with a single data source/database at a time, rather than several, which is what you'd need here.
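That said, the "own little process" mentioned in the question is not much work. Here is a minimal sketch, assuming the psycopg2 and influxdb Python packages; the connection details, query, and measurement name are placeholders, and you would run it on a schedule (cron or similar):

```python
from datetime import datetime, timezone

import psycopg2
from influxdb import InfluxDBClient

# Placeholder connection details for the source and destination databases.
pg = psycopg2.connect(host="localhost", dbname="appdb", user="app", password="secret")
influx = InfluxDBClient(host="localhost", port=8086, database="metrics")

# Run one SQL query against Postgres.
with pg.cursor() as cur:
    cur.execute("SELECT count(*) FROM orders WHERE created_at > now() - interval '5 minutes'")
    (order_count,) = cur.fetchone()

# Write the result as a single point into InfluxDB.
points = [{
    "measurement": "orders_last_5m",
    "time": datetime.now(timezone.utc).isoformat(),
    "fields": {"count": order_count},
}]
influx.write_points(points)
```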

Is Grafana used for analyzing system metrics alone?

I am new to Grafana. I want to know whether Grafana is used only for monitoring system metrics.
1) If not, I have a PostgreSQL database with some live data in it. Can I use Grafana to access those Postgres tables directly, without any conversion such as JSON?
2) If it is possible to access a Postgres database directly from Grafana, which data source can I use?
Please correct me if I am wrong.
Grafana can be used to visualize any time series or metrics, not just system metrics.
PostgreSQL can be used through a data source plugin - https://github.com/sraoss/grafana-sqldb-datasource (I haven't tried it out myself).
There's also a generic SQL data source being developed. Here's the PR for your reference: https://github.com/grafana/grafana/pull/5364
I want to know whether Grafana is used only for monitoring system metrics?
You can use Grafana to display a lot of different metrics. I, for example, use Grafana + InfluxDB to display various sensor values from my apartment.
Can I use Grafana to access those Postgres tables directly?
I am not sure about that. But if you take a look at the available data sources (LINK) you will see that PostgreSQL is not among them. So I think this is a no.

Feed data to Graylog2 from MySQL tables

I am looking for a way to get data from a few specific MySQL tables into Graylog2. I have done something similar in ELK using the Logstash JDBC input plugin, as described here:
https://www.elastic.co/blog/logstash-jdbc-input-plugin
Is there a similar or better way to do it with Graylog2?
There is a Logstash plugin that generates messages in GELF format, if you want to use Logstash to output events to Graylog2.
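If you would rather skip Logstash entirely, a small script can do the same job. Below is a minimal Python sketch, assuming the PyMySQL and graypy (2.x) packages; the table, columns, and Graylog host are placeholders. It reads rows from MySQL and ships them to Graylog2 as GELF messages over UDP:

```python
import logging

import graypy
import pymysql

# Standard Python logging with a GELF UDP handler pointed at Graylog2.
logger = logging.getLogger("mysql-to-graylog")
logger.setLevel(logging.INFO)
logger.addHandler(graypy.GELFUDPHandler("graylog.example.com", 12201))

conn = pymysql.connect(host="localhost", user="app", password="secret",
                       database="appdb", cursorclass=pymysql.cursors.DictCursor)
with conn.cursor() as cur:
    cur.execute("SELECT id, status, message FROM audit_log WHERE shipped = 0")
    for row in cur.fetchall():
        # Extra keyword fields become additional GELF fields in Graylog.
        logger.info(row["message"], extra={"row_id": row["id"], "status": row["status"]})
```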

From PostgreSQL to Cassandra - Aggregation functions not supported

I need your advice, please. I have an application that runs on PostgreSQL but takes too long to bring back data.
I would like to use Cassandra, but noticed that CQL does not support aggregation.
Would that be possible with Hadoop, or am I going completely the wrong way?
Also, all the dates are stored as epoch timestamps, and CQL can't convert them.
What would be the best approach for converting an application that runs on PostgreSQL to Cassandra?
Thank you for any suggestions.
Cassandra introduced aggregate functions in 2.2 with CASSANDRA-4914. The documentation for using the standard (built-in) functions is here, and for creating custom aggregate functions is here.
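As a quick illustration of the built-in aggregates, here is a minimal sketch using the Python cassandra-driver; the keyspace, table, and column names are placeholders, and note that aggregates are computed per query and are in practice usually restricted to a single partition:

```python
from cassandra.cluster import Cluster

# Placeholder contact point and keyspace.
cluster = Cluster(["127.0.0.1"])
session = cluster.connect("shop")

# Built-in aggregates available since Cassandra 2.2: count, min, max, sum, avg.
row = session.execute(
    "SELECT count(*) AS n, min(price) AS lo, max(price) AS hi, avg(price) AS mean "
    "FROM orders WHERE customer_id = %s",
    ("c-42",),
).one()
print(row.n, row.lo, row.hi, row.mean)
```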

HBase and elasticsearch integration like MongoDB river

I am fairly new to both Elasticsearch and HBase, but for a research project I would like to combine the two. My research project mainly involves searching through a large collection of documents (doc, pdf, msg, etc.) and extracting named entities from them through MapReduce jobs running on the documents stored in HBase.
Does anyone know if there is something similar to the MongoDB river plugin for HBase? Or can anyone point me to some documentation about integrating Elasticsearch and HBase? I have looked on the internet for documentation, but unfortunately without any luck.
Kind regards,
Martijn
I don't know of any Elasticsearch-HBase integrations, but there are a few Solr and HBase integrations that you can use, like Lily and SolBase.
Tell me what you think about https://github.com/posix4e/Elasticsearch-HBase-River. It uses HBase log shipping to reliably handle updates and deletes from HBase into an Elasticsearch cluster. It could easily be extended to replicate from n region servers to m Elasticsearch servers.
You can use the Phoenix JDBC driver + the Elasticsearch JDBC river, as shown here: http://lessc0de.github.io/connecting_hbase_to_elasticsearch.html
I don't know of any packaged solutions, but as long as your MapReduce job preps the data in the right way, it should be fairly easy to write a simple batch job in the programming language of your choice that reads from HBase and submits to Elasticsearch.
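For example, a minimal Python sketch of such a batch job, assuming the happybase and elasticsearch packages, a running HBase Thrift server, and placeholder table/column-family/index names, could look like this:

```python
import happybase
from elasticsearch import Elasticsearch, helpers

# HBase access goes through the Thrift gateway; the host is a placeholder.
hbase = happybase.Connection("hbase-thrift.example.com")
table = hbase.table("documents")
es = Elasticsearch(["http://localhost:9200"])

def actions():
    # Scan the "meta" column family and turn each HBase row into a bulk action.
    for row_key, data in table.scan(columns=[b"meta"]):
        yield {
            "_index": "documents",
            "_id": row_key.decode(),
            "_source": {
                # happybase returns bytes keys like b"meta:title"
                col.decode().split(":", 1)[1]: val.decode()
                for col, val in data.items()
            },
        }

# Bulk-index everything into Elasticsearch.
helpers.bulk(es, actions())
```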
Take a look at this page (3 years later): http://lessc0de.github.io/connecting_hbase_to_elasticsearch.html