How to create a backend Grafana plugin for annotations?

I understand how to use Grafana's HashiCorp go-plugin system to create a generic time-series datasource plugin.
How can I make a backend plugin that can be used for annotations? The only official example uses fake-json-datasource for annotations, which is a separate service, not a plugin.
I've found an example of a built-in datasource that provides annotations - https://github.com/grafana/grafana/blob/master/pkg/tsdb/stackdriver/stackdriver.go#L78-L79. However, I'm not sure how to make Grafana send annotation queries to a backend plugin that is not built-in.

Related

Custom MLFlow scoring_server for model serving

I would like to know if MLflow currently supports any customization of its scoring_server that would allow registering new endpoints on the published REST API.
By default the scoring server provides the /ping and /invocations endpoints, but I would like to include more endpoints in addition to those.
I've seen some resources that achieve this kind of behaviour using custom WSGI implementations, but I would like to know whether extending the provided mlflow scoring_server is possible in any way, so that the default support provided by mlflow-generated Docker images and the deployment management is not lost.
I explored the existing official and unofficial documentation, as well as existing GitHub issues and the mlflow codebase in its GitHub repository.
I've also explored some alternatives, such as using a custom WSGI server configuration for starting the REST API.
Any kind of resource/documentation is greatly appreciated.
Thanks in advance.
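One option that keeps the standard scoring server untouched is to wrap its WSGI app in a small middleware that serves the extra endpoints itself and delegates everything else. The sketch below is generic and stdlib-only: `scoring_app` is a stand-in for the Flask app mlflow's scoring server builds, and the `/health-details` endpoint is hypothetical; in a real deployment you would point your WSGI server at the wrapper instead of the original app.

```python
import json

def make_wrapper(scoring_app, extra_routes):
    """Wrap an existing WSGI app: serve the paths in extra_routes ourselves,
    and delegate every other request to the wrapped scoring app."""
    def wrapper(environ, start_response):
        handler = extra_routes.get(environ.get("PATH_INFO", ""))
        if handler is not None:
            body = json.dumps(handler(environ)).encode("utf-8")
            start_response("200 OK", [("Content-Type", "application/json"),
                                      ("Content-Length", str(len(body)))])
            return [body]
        # Anything else (e.g. /ping, /invocations) goes to the scoring server.
        return scoring_app(environ, start_response)
    return wrapper

# Stand-in for the scoring server's WSGI app (hypothetical).
def scoring_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"pong"]

app = make_wrapper(scoring_app, {
    "/health-details": lambda environ: {"status": "ok"},  # hypothetical endpoint
})
```

Because the wrapper is plain WSGI, it works with whatever server the mlflow image already uses; the default endpoints keep their original behaviour.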

Springdoc extension for AWS API Gateway

I want to migrate from springfox 2 to springdoc. Currently, there are multiple plugins (springfox.documentation.spi.service.OperationBuilderPlugin) implemented using the support available in the springfox library.
The AWS API Gateway plugin is one of them (similar to this one: http://springfox.github.io/springfox/docs/current/#example-operationbuilderplugin).
I haven't found relevant examples or detailed documentation on how to build something like this using the springdoc support.
Would appreciate any suggestion!
There is no such extension available in springdoc-openapi.
You have an example with spring-cloud-gateway that you can adapt.

Automatically pulling REST API data to visualize it in Apache Superset

I work in a large enterprise and have a project to build custom automated dashboards for our IT department; the small amount of data needs to be fetched only from REST API endpoints. The process needs to be fully automated, and there is not enough time to build a custom API wrapper. For this I was going to use Apache Airflow + Apache Superset. I have been googling for a couple of days for an open-source solution simpler than Apache Airflow to move data from the REST API endpoints so it can be visualized in Superset. Please share your experience: what would you choose instead of Apache Airflow?
I chose to go with the following solution:
Apache Airflow + PostgreSQL + Grafana (instead of Superset, because in Grafana you can actually create a drill-down option using a workaround)
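Whatever orchestrator you pick, the per-run task boils down to "fetch JSON, upsert rows into the database the dashboard reads". A minimal stdlib-only sketch of that step, with sqlite3 standing in for PostgreSQL and a hypothetical endpoint and table schema:

```python
import json
import sqlite3
import urllib.request

def fetch_metrics(url):
    """Fetch a JSON list of records from a REST endpoint (hypothetical URL)."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)

def load_metrics(conn, records):
    """Idempotently upsert records into a table Grafana (or Superset) can query."""
    conn.execute("""CREATE TABLE IF NOT EXISTS it_metrics (
                        metric TEXT, ts TEXT, value REAL,
                        PRIMARY KEY (metric, ts))""")
    conn.executemany(
        "INSERT OR REPLACE INTO it_metrics (metric, ts, value) VALUES (?, ?, ?)",
        [(r["metric"], r["ts"], r["value"]) for r in records])
    conn.commit()

def run_once(url, conn):
    """One pipeline run: fetch, then load."""
    load_metrics(conn, fetch_metrics(url))
```

In Airflow, `run_once` would be the callable of a `PythonOperator` on a schedule; because the upsert is keyed on `(metric, ts)`, re-running a failed task is safe.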

Where are the ElasticSearch APIs exposed when running Crate?

I've successfully installed the Elasticsearch head plugin on Crate and can access its web UI, but it fails to connect. I'd like to be able to use it to visualize the data in the underlying Elasticsearch store. Is there a way to access the Elasticsearch API directly so that head can work?
You will need to enable the API, which is done in the crate.yml file; the setting to change is:
es.api.enabled: true
However, Elasticsearch plugins may not work out of the box because Crate and Elasticsearch aren't binary compatible (you will probably have to modify the namespaces and imports). Elasticsearch has a shading step in its Maven configuration, so the Elasticsearch jar contains different namespaces than Crate does (because Crate doesn't use shading).

ArangoDB and Gephi: Import data from ArangoDB into Gephi

Is there a way to use ArangoDB as a datasource for Gephi? I tried https://github.com/datablend/gephi-blueprints-plugin/wiki , but it only works indirectly via Rexster with the included blueprints-arangodb-graph plugin.
I think this is a very inelegant option with a lot of overhead.
I would like a way to add a Blueprints ArangoDB plugin to Gephi so that I can choose ArangoDB as a datasource, perhaps in combination with gephi-blueprints-plugin.
I think a combination of the Blueprints plugin for Gephi and the ArangoDB Blueprints API would be the nicest solution, avoiding the additional step of going through a CSV (or other) file to work with ArangoDB data in Gephi.
The description of the Blueprints plugin for Gephi says: "The Gephi Blueprints plugin allows a user to import graph-data from any graph database that implements the Tinkerpop Blueprints generic graph API". But I don't know how - out of the box it supports only TinkerGraph, Neo4j, OrientDB, Dex, RexsterGraph and FluxGraph.
I tried to create an arangodb.xml in /etc/graph and add the Blueprints implementation of ArangoDB as a jar in the ".gephi/dev/modules" folder. But Gephi doesn't load the jar, so the menu entry "Import/Graph database ..." and the selection of "arangodb" lead to a NullPointerException because the class files of the ArangoDB Blueprints API are missing.
Has someone worked with gephi-blueprints-plugin and/or blueprints-arangodb-graph and has some ideas?
This was discussed in this ticket of the Blueprints adapter of ArangoDB. A plugin for Gephi has to be built specifically for ArangoDB. Axel Hoffmann is thinking about developing this plugin.