If I know Kibana, can I use Grafana? - visualization

I have experience using Kibana. However, this time I'd like to try Grafana. Does my experience guarantee that I can learn Grafana easily, or is it a whole lot different from Kibana?
Please correct me if I'm wrong, but so far, according to my research, both are for logs: Grafana is more for visualization only, while Kibana is for searching the logs. Is this right?

Grafana began as a fork of Kibana, but the two have developed in totally different directions since 2013.
1. Logs vs Metrics
Kibana focuses more on logs and ad-hoc search, while Grafana focuses more on building dashboards for visualizing time-series data. This means Grafana is usually used together with time-series databases like Graphite, InfluxDB, or Elasticsearch with aggregations, whereas Kibana is usually used for searching logs.
Metric queries tend to be very fast, while searching logs is slower, so the two are usually used for different purposes. For example, I could look at a metric for memory usage on a server over the last three months and get an answer almost instantly.
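To see why the metrics path stays fast, here is a toy sketch (the numbers and sample values are invented; this is not a real TSDB): a metrics store only scans a few pre-aggregated samples per hour, while a log store would have to scan every raw event in the range.

```python
# Illustrative sketch: metric queries over long ranges stay fast because
# a metrics store keeps one aggregated sample per interval, while a log
# store must scan every raw event.

DAYS = 90
EVENTS_PER_DAY = 1_000_000   # assumed raw log lines per day

# Metrics path: one pre-aggregated memory-usage sample (in %) per hour.
hourly_samples = [50.0 + (h % 24) for h in range(DAYS * 24)]

def memory_avg_last_90_days(samples):
    # Touches only DAYS * 24 = 2160 points, regardless of log volume.
    return sum(samples) / len(samples)

avg = memory_avg_last_90_days(hourly_samples)
print(f"scanned {len(hourly_samples)} metric points instead of "
      f"{DAYS * EVENTS_PER_DAY:,} raw log lines; avg={avg:.1f}%")
```

The same question answered from raw logs would mean parsing and scanning all 90 million events for the window, which is why log search is the slower path.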
Brian Brazil (Prometheus) has written about logs vs metrics.
2. Data Sources
Kibana is built for Elasticsearch and the ELK stack. Grafana supports many data sources. Even if you use Grafana with Elasticsearch, you would not usually look at the same data (raw logs) as in Kibana; it would be data from Logstash or Metricbeat that can be aggregated (grouped by) rather than raw log lines.
With Grafana you can also mix and match data sources in the same dashboard, e.g. Elasticsearch and Splunk.
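As an illustration of what "aggregated rather than raw" means, here is a sketch of the kind of grouped query Grafana issues against Elasticsearch (the Metricbeat field name and the hourly interval are assumptions for the example, not something from your setup):

```json
{
  "size": 0,
  "aggs": {
    "per_hour": {
      "date_histogram": { "field": "@timestamp", "interval": "1h" },
      "aggs": {
        "avg_mem": { "avg": { "field": "system.memory.actual.used.bytes" } }
      }
    }
  }
}
```

Kibana's log search, by contrast, returns the matching raw documents themselves rather than bucketed averages like this.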
Conclusion
Kibana and Grafana have some overlap: Kibana has Timelion for metrics and Grafana has the Table panel for logs. Lots of companies use both - Kibana for logs and Grafana for visualizing metrics.
They are different from each other so there will be a learning curve.

Related

FIWARE: Integrating Grafana with Orion using MongoDB

For some time now I have been working with my research team using different FIWARE components. After deploying Orion, which uses MongoDB to store context data, I thought of using Grafana to show, in real time, the data sent to Orion and stored in Mongo (for example, data from a sensor). I was wondering what the best method to connect Grafana to Orion would be, considering that the free version does not offer a MongoDB connector. Currently, through a plugin in Grafana, I can call the API exposed by the context broker to retrieve the data and then build the tables and the different charts. I wanted to know if this is good practice or if there are better ways. Thanks!
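Calling the context broker's API and reshaping the result yourself is a reasonable approach. As a toy sketch of that reshaping step (the entity payload below is hand-written in NGSI v2 shape, not real Orion output), flattening entities into rows a Grafana table panel could consume might look like:

```python
# Sketch: flatten NGSI v2 entities (the shape returned by Orion's
# GET /v2/entities) into (entity_id, attribute, value) rows suitable
# for a table/graph panel. The payload below is a hand-written example.

sample_entities = [
    {
        "id": "urn:ngsi-ld:Sensor:001",
        "type": "Sensor",
        "temperature": {"type": "Number", "value": 21.7},
        "humidity": {"type": "Number", "value": 48.0},
    },
]

def to_rows(entities):
    rows = []
    for ent in entities:
        for attr, meta in ent.items():
            if attr in ("id", "type"):
                continue  # skip entity metadata, keep only attributes
            rows.append((ent["id"], attr, meta["value"]))
    return rows

print(to_rows(sample_entities))
```

In a real setup you would fetch the entities over HTTP from the broker instead of hardcoding them; the flattening logic stays the same.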

Periodically Querying Postgresql and Visualize it On Dashboard

Here is the case: I want to visualize the results of a query run periodically on a dashboard (like Grafana or Kibana). The problem is I don't know which technology stack to use - the ELK stack, Prometheus + Grafana, or Tableau. The requirements are:
First, it must support multiple (hundreds of) database servers as data sources; currently I use PostgreSQL.
Second, it must support running one query against all database instances and collecting the results into one centralized server, to then be displayed on a dashboard.
Third, it must support period/schedule setup (a cron-like scheduler) for managing how often data should be queried from all database servers.
Fourth, it must support alerting/notifications, where I can use an existing platform library without much code needed.
Fifth, it has to be an open-source project with a good reputation and a fairly large community.
Thanks
You can achieve your objective with the ELK stack. In Kibana you will get a basic dashboard; if you want a more detailed dashboard view, you can also integrate Elasticsearch with Grafana using a Lucene query. The links below will help you:
https://www.elastic.co/blog/logstash-jdbc-input-plugin
https://discuss.elastic.co/t/how-can-i-schedule-logstash-every-second-for-jdbc-input-plugin/27393/11
https://grafana.com/blog/2016/03/09/how-to-effectively-use-the-elasticsearch-data-source-in-grafana-and-solutions-to-common-pitfalls/#lucene
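As a sketch of the scheduled-query part from the first two links (the connection details, statement, and index name below are placeholders, not a tested pipeline), the Logstash JDBC input supports a cron-like `schedule` option that covers your third requirement:

```conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://db-host:5432/mydb"
    jdbc_user              => "readonly"
    jdbc_driver_library    => "/opt/drivers/postgresql.jar"
    jdbc_driver_class      => "org.postgresql.Driver"
    statement              => "SELECT now() AS ts, count(*) AS orders FROM orders"
    schedule               => "*/5 * * * *"   # cron syntax: every 5 minutes
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "db-metrics"
  }
}
```

For hundreds of servers you would template one such pipeline per server (or per group of servers), all writing into the same centralized Elasticsearch index.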

What are the disadvantages of using prometheus as business rule engine?

Is it a good idea to use Prometheus as a business rule engine over Drools? What are the pros and cons of Prometheus compared to Drools?
The cons of using Prometheus as a business rules engine (as I see them):
Data storage is currently ephemeral and non-durable - Long term storage of metric data is still a work in progress. Can you afford to lose the data you would be ingesting into Prometheus?
Prometheus doesn't provide a dashboard solution - Prometheus' interface is intended for ad-hoc debugging. One would typically have to use another piece of software like Grafana for data visualisation.
Logging - Prometheus is designed to collect and process metrics; it is not an event-logging system. This may be an issue if you or the people using Prometheus want to track events rather than metrics.
Not designed to handle sensitive data - Prometheus was created to handle operational metrics (CPU time, number of failed HTTP requests, latency, etc.). While dropping labels to hide sensitive data is supported via relabelling, I'd imagine most data for business rules would fall on the sensitive side, which would require lots of relabelling that would be time-consuming to implement.
Prometheus is a tool made and used by people with an IT background - will your users, who I'd assume don't have an IT background, be equipped to work with such a tool?
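As a sketch of the relabelling mentioned above (the job name, target, and label name are hypothetical), dropping a sensitive label at scrape time looks like this in `prometheus.yml`:

```yaml
scrape_configs:
  - job_name: "billing-app"
    static_configs:
      - targets: ["billing-app:9090"]
    metric_relabel_configs:
      # Drop any label named customer_email before the sample is stored.
      - regex: "customer_email"
        action: labeldrop
```

Multiply this by every sensitive label on every job and you can see why it becomes tedious for business data.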
So in theory you could use Prometheus for part of your business rule based system, but in practice you'd likely run into some if not all of the issues I have outlined above.
I'm not familiar with business rule based systems but I'd imagine they're better suited to the problem you're trying to solve than Prometheus.

What are the Metrics that Grafana can collect?

I am trying to learn about Grafana's advantages over other tools. Can anybody list all the metrics that Grafana displays/collects? This would help starters a lot.
Thanks in advance!
You seem to be a little confused about what Grafana does. It is a visualization tool, and is agnostic about the types of data you're visualizing. That is what makes it much more powerful than tools that are more tightly integrated with a particular collector/database.
As long as you can get the data you want to visualize (and in 4.x alert on) into one of the many time-series databases that Grafana supports, you'll be able to visualize that data on Grafana dashboards.
If you want to get an idea of which time-series databases are supported, you can look at the docs for Graphite and the other built-in data sources at http://docs.grafana.org/datasources/graphite/ (see the others in the side menu), and find more data sources supported via plugins at https://grafana.net/plugins
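If you end up scripting your setup, note that newer Grafana versions (5.0 and later) can also provision data sources from a YAML file instead of the UI. A minimal sketch (the name and URL are placeholders) placed under `provisioning/datasources/` looks like:

```yaml
apiVersion: 1
datasources:
  - name: Graphite
    type: graphite
    access: proxy
    url: http://graphite-host:8080
```

This is handy when you manage Grafana with configuration management or containers, but it changes nothing about the core point: Grafana visualizes whatever the configured data source can answer.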

What are some options for charting time series stored in a postgres database

I have a PostgreSQL database with a large number of time-series metrics.
Various operators are interested in different information, and I want to provide an interface where they can chart the data, make comparisons, and optionally export the data as a CSV.
The two solutions I have come across so far are Graphite and Grafana, but both tie you to particular storage engines and neither supports PostgreSQL.
What I am looking for is an interface similar to Grafana, but which allows me to hook up any backend I want. Are there any tools out there, similar to Grafana, that allow you to hook up any backend you want (or even just PostgreSQL)?
Note that the data I am collecting is highly sensitive and is required by other areas of the application, so it is not suitable for storing in Graphite.
The other alternative I see would be to set up a trigger on the PostgreSQL database to feed data into Graphite as it arrives, but again, not ideal.
You may want to replace Graphite's storage backend with PostgreSQL. Here is a good primer.
2018 update: Grafana now supports PostgreSQL (link).
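With the PostgreSQL data source, a panel query is plain SQL plus Grafana's time macros. A sketch (the table and column names are placeholders; `$__timeGroup` and `$__timeFilter` are Grafana macros that expand to the dashboard's selected time range):

```sql
-- Hourly average of a metric column for the dashboard's time range.
SELECT
  $__timeGroup(ts, '1h') AS time,
  avg(value) AS "cpu_avg"
FROM metrics
WHERE $__timeFilter(ts)
GROUP BY 1
ORDER BY 1;
```

Operators can then use the panel's built-in CSV export on the query result, which covers the export requirement without extra tooling.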
What I am looking for is an interface similar to grafana, but which allows me to hook up any backend I want
That's possible with Grafana. Check this guide, which shows how to create a datasource plugin for a data source that isn't currently supported.