FIWARE: Integrating Grafana with Orion using MongoDB

For some time now I have been working with my research team using different FIWARE components. After deploying Orion, with MongoDB storing the context data, I thought of using Grafana to show in real time the data sent to Orion and stored in Mongo (for example, data from a sensor). I was wondering what the best method would be to connect Grafana to Orion, considering that the free version does not offer a MongoDB connector. Currently, through a plugin in Grafana, I can call the API exposed by the context broker to retrieve the data and then build tables and various charts. I wanted to know if this is good practice or whether there are better ways. Thanks!
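For context, here is roughly what that plugin call amounts to. A minimal sketch of querying Orion's NGSIv2 API from plain Java (the host, entity type, and attribute name are assumptions; port 1026 is Orion's default):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OrionQuery {
    public static void main(String[] args) throws Exception {
        // Hypothetical Orion host; /v2/entities is the standard NGSIv2 query endpoint
        URI uri = URI.create(
                "http://orion.example.com:1026/v2/entities?type=Sensor&attrs=temperature");
        HttpRequest request = HttpRequest.newBuilder(uri)
                .header("Accept", "application/json")
                .GET()
                .build();
        // The JSON body is what a Grafana plugin would turn into tables and charts
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```

Note that /v2/entities returns only the current context values, so a historical series for charting would have to be accumulated on the Grafana side or in a separate store.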

Related

Two-way replication between MongoDB and DynamoDB

We certainly want to use separate databases, since the front-end team finds MongoDB Atlas robust to work with and the AWS cloud architects find DynamoDB easy to work with.
Our architecture:
Web application uses MongoDB to insert, update and retrieve data.
The MongoDB is synced in real-time with DynamoDB.
Background AWS services use DynamoDB for inserting, updating and retrieving data.
The changes in either DynamoDB or MongoDB are replicated to each other.
Tried so far:
We currently have a sync in place using DynamoDB Streams and a MongoDB Atlas Trigger to listen for changes on each database and forward them to the other. We use Lambdas for this, but our replication logic is not robust yet (a sketch of one direction follows this question).
AWS Database Migration Service with ongoing replication has been suggested, but we haven't been able to get it to work in our use case. Perhaps this is one option.
Third-party services like https://www.cdata.com/sync/
Ideal Fit
The ideal solution would be AWS-based, or failing that, a reliable third-party service.
Greatly appreciate any resources or thoughts on this! :)
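To make the Streams-to-Mongo direction concrete, here is a minimal sketch of such a Lambda in Java (the table's key attribute, the `replicated` loop-guard flag, and the `MONGO_URI` environment variable are all assumptions, and only string attributes are copied for brevity; assumes aws-lambda-java-events 3.x and the synchronous MongoDB driver):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.DynamodbEvent;
import com.amazonaws.services.lambda.runtime.events.models.dynamodb.AttributeValue;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.ReplaceOptions;
import org.bson.Document;

import java.util.Map;

public class DynamoToMongoHandler implements RequestHandler<DynamodbEvent, Void> {

    // Reused across warm invocations; MONGO_URI is a hypothetical env variable
    private final MongoCollection<Document> target = MongoClients
            .create(System.getenv("MONGO_URI"))
            .getDatabase("app")
            .getCollection("items");

    @Override
    public Void handleRequest(DynamodbEvent event, Context context) {
        for (DynamodbEvent.DynamodbStreamRecord record : event.getRecords()) {
            Map<String, AttributeValue> image = record.getDynamodb().getNewImage();
            // Skip deletes (no new image) and writes already tagged by the
            // Mongo->Dynamo Lambda, to avoid an infinite replication loop
            if (image == null || image.containsKey("replicated")) {
                continue;
            }
            Document doc = new Document("_id", image.get("pk").getS());
            image.forEach((name, value) -> {
                if (value.getS() != null) {      // string attributes only, for brevity
                    doc.append(name, value.getS());
                }
            });
            doc.append("replicated", true);      // loop-guard tag
            target.replaceOne(new Document("_id", doc.get("_id")), doc,
                    new ReplaceOptions().upsert(true));
        }
        return null;
    }
}
```

The tag-and-skip guard is the crude part; a robust version would also need idempotency and conflict handling (e.g. last-writer-wins timestamps), which is where two-way replication usually gets hard.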

Data synchronization between primary and redundant servers

I want to synchronize data among a set of REST API servers (a Spring Boot-based API cluster) periodically. Any instance in the cluster should be able to broadcast new information to all others.
I don't want to use a DB here. I am trying to find a lightweight library that can be used inside the API for this purpose. Is it possible to use Atomix/Hazelcast/ZooKeeper for this? If so, it would be really helpful if someone could post sample code.
My thanks in advance.
In Hazelcast you can do it through WAN Replication, but that is an enterprise feature for which you have to buy a license.
Hazelcast can be used for this use case. Each of the REST instances creates an embedded Hazelcast member within its JVM. The Hazelcast members then discover each other and form the cluster. Your REST apps use the IMap or ReplicatedMap service, a distributed key-value store (IMap can store more data; ReplicatedMap is faster). Once you write data to the IMap, all other instances see it right away.
See the code sample here: https://docs.hazelcast.com/hazelcast/latest/getting-started/get-started-java.html#complete-code-samples
This feature and the Spring integration are open-source.
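As a minimal sketch (the map name and values are placeholders; assumes Hazelcast 5.x with its default multicast discovery on the local network):

```java
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

public class EmbeddedClusterNode {
    public static void main(String[] args) {
        // Each REST instance starts one embedded member; members on the same
        // network discover each other and form a single cluster
        HazelcastInstance hz = Hazelcast.newHazelcastInstance();

        // A distributed key-value map shared by every member
        IMap<String, String> shared = hz.getMap("shared-data");
        shared.put("config-key", "config-value"); // visible to all instances
        System.out.println(shared.get("config-key"));
    }
}
```

Run two copies of this on the same machine and, with the default discovery settings, the second instance reads the value written by the first.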

Is connectivity possible between Google Firebase and MongoDB on AWS

I have data being stored in Google Firebase (i.e. the output of the Google Vision API). I need the same data stored in MongoDB, which is running on AWS. Is connectivity between Google Cloud and AWS possible for data migration?
There is no out-of-the-box solution for what you're trying to accomplish. Currently Firestore supports exporting its data, as documented here, though the format of the export is probably not something MongoDB could import right away. Even if the format were compatible, you would need some kind of data-processing pipeline to handle the flow from one side to the other.
Depending on how you're handling the ingestion of Vision API results, you might be able to include code that also sends that data to MongoDB (see the sketch below). If that's not the case, you might need to design a custom solution for this particular use case.
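A minimal sketch of that dual-write idea (the collection names, the sample document, and the `MONGO_URI` environment variable are assumptions; assumes a recent Firebase Admin SDK with application-default credentials and the synchronous MongoDB Java driver):

```java
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.firestore.Firestore;
import com.google.firebase.FirebaseApp;
import com.google.firebase.FirebaseOptions;
import com.google.firebase.cloud.FirestoreClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.util.Map;

public class VisionResultWriter {
    public static void main(String[] args) throws Exception {
        // Firebase Admin SDK, authenticating with application-default credentials
        FirebaseApp.initializeApp(FirebaseOptions.builder()
                .setCredentials(GoogleCredentials.getApplicationDefault())
                .build());
        Firestore firestore = FirestoreClient.getFirestore();

        // MongoDB running on AWS; the connection string is an assumption
        MongoCollection<Document> mongo = MongoClients
                .create(System.getenv("MONGO_URI"))
                .getDatabase("vision")
                .getCollection("results");

        // A Vision API result, reduced to a plain map for the example
        Map<String, Object> result =
                Map.of("imageId", "img-123", "label", "cat", "score", 0.97);

        // Write to both stores at ingestion time
        firestore.collection("visionResults").document("img-123").set(result).get();
        mongo.insertOne(new Document(result));
    }
}
```

Writing to both stores at ingestion time avoids the export/import problem entirely, at the cost of having to handle partial failures (one write succeeding and the other failing) yourself.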

Periodically querying PostgreSQL and visualizing it on a dashboard

Here is the case: I want to visualize the results of a query run periodically against my data on a dashboard (like Grafana or Kibana). The problem is that I don't know which technology stack to use: the ELK stack, Prometheus + Grafana, or Tableau. The requirements are:
First, it has to support multiple (hundreds of) database servers as data sources; currently I use PostgreSQL.
Second, it has to support running one query against all database instances and collecting the results on one centralized server, to then be displayed on a dashboard.
Third, it has to support period/schedule setup (a cron-like scheduler) for managing how often data is queried from all the database servers.
Fourth, it has to have an alerting/notification system where I can use an existing platform library without much code.
Fifth, it has to be an open-source project with a good reputation and fairly large community support.
Thanks
You can achieve your objective with the ELK stack. In Kibana you get a basic dashboard; if you want a more detailed dashboard view, you can also integrate Elasticsearch with Grafana using Lucene queries. The links below will help (a sketch of the Logstash side follows them):
https://www.elastic.co/blog/logstash-jdbc-input-plugin
https://discuss.elastic.co/t/how-can-i-schedule-logstash-every-second-for-jdbc-input-plugin/27393/11
https://grafana.com/blog/2016/03/09/how-to-effectively-use-the-elasticsearch-data-source-in-grafana-and-solutions-to-common-pitfalls/#lucene
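To tie the pieces together, a minimal Logstash pipeline sketch for the JDBC input (the connection details, query, table, and index name are placeholders; `schedule` uses the plugin's cron-like syntax, and `:sql_last_value` tracks the last run so only new rows are pulled):

```
input {
  jdbc {
    jdbc_driver_library => "/path/to/postgresql-jdbc.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://db-host:5432/metrics"
    jdbc_user => "readonly"
    jdbc_password => "${PG_PASSWORD}"
    schedule => "*/5 * * * *"   # every five minutes, cron-like
    statement => "SELECT * FROM sensor_readings WHERE ts > :sql_last_value"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "sensor-metrics"
  }
}
```

Covering hundreds of database servers would mean one jdbc block (or one generated pipeline) per server, which is worth weighing against the centralized-collection requirement.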

What are some options for charting time series stored in a Postgres database

I have a Postgres database with a large number of time-series metrics.
Various operators are interested in different information, and I want to provide an interface where they can chart the data, make comparisons, and optionally export data as a CSV.
The two solutions I have come across so far are Graphite and Grafana, but both tie you down to particular storage engines, and neither supports Postgres.
What I am looking for is an interface similar to Grafana, but one that allows me to hook up any backend I want. Are there any tools out there, similar to Grafana, that allow you to hook up any backend you want (or even just Postgres)?
Note that the data I am collecting is highly sensitive and is required by other areas of the application, so it is not suitable for storing in Graphite.
The other alternative I see would be to set up a trigger on the Postgres DB to feed data into Graphite as it arrives, but again, that is not ideal.
You may want to replace Graphite's storage backend with PostgreSQL. Here is a good primer.
2018 update: Grafana now supports PostgreSQL (link).
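With built-in support, pointing Grafana at Postgres is just datasource configuration. A minimal provisioning sketch (the file path, names, host, and credentials are all placeholders; assumes a Grafana version recent enough to ship the PostgreSQL datasource):

```yaml
# e.g. /etc/grafana/provisioning/datasources/postgres.yaml
apiVersion: 1
datasources:
  - name: Metrics-Postgres       # placeholder datasource name
    type: postgres               # the built-in PostgreSQL datasource
    url: db-host:5432            # placeholder host:port
    user: grafana_reader         # a read-only user is a sensible choice
    jsonData:
      database: metrics          # placeholder database name
      sslmode: disable
    secureJsonData:
      password: ${PG_PASSWORD}   # read from the environment at startup
```

A read-only database user also helps with the sensitivity concern, since Grafana then cannot modify the application's data.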
What I am looking for is an interface similar to grafana, but which allows me to hook up any backend I want
That's possible with Grafana. Check this guide, which shows how to create a datasource plugin for a datasource that's currently not supported.