How to run Grafana with QuestDB

I've imported a demo dataset in QuestDB and I can query it successfully from the console. I'm using Grafana to build a dashboard to test visualization.
My QuestDB installation is running on port 9000 and I can import the dataset without any issues:
curl -F data=@weather.csv http://localhost:9000/imp
I'm running the following query which is failing:
SELECT timestamp as time,
avg(visMiles) AS average_visibility
FROM 'weather.csv'
WHERE $__timeFilter(timestamp)
SAMPLE BY $__interval
LIMIT 1000
The error I get is
pq: unknown function name: between(TIMESTAMP,STRING,STRING)
I'm using a dataset provided in their examples.

QuestDB relies on a designated timestamp, which is usually specified during table creation. The query would not cause this error if a designated timestamp had been provided as a URL parameter on the curl request, given a column named 'timestamp':
curl -F data=@weather.csv 'http://localhost:9000/imp?timestamp=timestamp'
Alternatively, the timestamp() function can designate one dynamically during a SELECT operation. If you've already imported using curl and have not set a designated timestamp, there are two options:
Modify your query to use timestamp() on the column you want to designate:
SELECT timestamp as time,
avg(visMiles) AS average_visibility
FROM ('weather.csv' timestamp(timestamp))
WHERE $__timeFilter(timestamp)
SAMPLE BY $__interval
LIMIT 1000
Create a new table which is a copy of your original dataset but designate a timestamp during creation. ORDER BY is used because the demo dataset has unordered timestamp entries:
create table temp_table as (select * from 'weather.csv' order by timestamp) timestamp(timestamp);
And instead of querying your original dataset, use the temp_table:
SELECT timestamp as time,
avg(visMiles) AS average_visibility
FROM temp_table
WHERE $__timeFilter(timestamp)
SAMPLE BY $__interval
LIMIT 1000
If you need more info on the use of designated timestamps, the QuestDB concepts / timestamp docs page has further details.
Edit: There are more resources related to this topic, such as a guide for Grafana with QuestDB and a GitHub repo with a docker-compose setup.

Related

Delete in Postgres Citus columnar storage: alternatives

I am planning to use Citus to store system logs for up to n days, after which they should be deleted. The Citus columnar store looked like the perfect database for this until I read
this, where it's mentioned that no deletes can be performed on columnar tables.
So my question is: is there an alternative way of achieving deletes in the columnar store?
You can temporarily switch the table's access method to row mode to delete or update it, then switch back to the columnar access method afterwards. An example is shown below:
-- create the logs table
CREATE TABLE logs (
id int not null,
log_date timestamp
);
-- set the access method to columnar
SELECT alter_table_set_access_method('logs', 'columnar');
-- fill the table with generated data reaching back 20 days
INSERT INTO logs select i, now() - interval '1 hour' * i from generate_series(1,480) i;
-- to delete data older than 10 days, temporarily switch to the row (heap) access method
SELECT alter_table_set_access_method('logs', 'heap');
DELETE FROM logs WHERE log_date < (now() - interval '10 days');
-- switch back to the columnar access method
SELECT alter_table_set_access_method('logs', 'columnar');
A better alternative for log archiving: switching the access method creates a whole copy of the source table, so the bigger the table, the more resources the operation consumes. A better option is to divide your log table into partitions by day or month; then you only need to change the access method for a single partition. Note that you must set the access method for each partition separately: columnar currently does not support setting the access method of a partitioned table directly. A sketch of this approach follows below.
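For illustration, a minimal sketch of the partitioned approach; the table and partition names here are hypothetical:
-- parent table partitioned by range on the log date
CREATE TABLE logs_part (
id int not null,
log_date timestamp not null
) PARTITION BY RANGE (log_date);
-- one partition per month; daily partitions work the same way
CREATE TABLE logs_2021_05 PARTITION OF logs_part
FOR VALUES FROM ('2021-05-01') TO ('2021-06-01');
CREATE TABLE logs_2021_06 PARTITION OF logs_part
FOR VALUES FROM ('2021-06-01') TO ('2021-07-01');
-- set the access method on each partition separately;
-- setting it on logs_part itself is not supported
SELECT alter_table_set_access_method('logs_2021_05', 'columnar');
SELECT alter_table_set_access_method('logs_2021_06', 'columnar');
-- expiring old logs is then just dropping the oldest partition, no row-level DELETE needed
DROP TABLE logs_2021_05;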
Learn more:
Citus docs
Columnar demo
Archiving logs with columnar

Convert row to column in InfluxDB

In InfluxDB I want to convert the output of my query from rows to columns.
The query is:
SELECT max(*) FROM table_X WHERE time > now() - 6m GROUP BY time(5m) fill(previous) ORDER BY time DESC LIMIT 1
The result returns one row per field, but I want to get the fields as columns instead. How can I do this?
InfluxDB provides the pivot() function, which converts rows to columns:
pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
https://docs.influxdata.com/influxdb/v2.0/reference/flux/stdlib/built-in/transformations/pivot/
Note: the pivot function is available in Flux queries in InfluxDB 2.x; InfluxDB 1.x does not support pivot.

Grafana PostgreSQL distinct on() with time series

I'm quite new to Grafana and Postgres and could use some help with this. I have a dataset in PostgreSQL with temperature forecasts. Multiple forecasts are published at various points throughout the day (indicated by dump_date) for the same reference date. Say: at 06:00 today and at 12:00 today a forecast is published for tomorrow (where the time is indicated by start_time). Now I want to visualize the temperature forecast as a time series using Grafana. However, I only want to visualize the latest published forecast (12:00) and not both forecasts. I thought I would use DISTINCT ON() to select only the latest published forecast from this dataset, but somehow this does not work in Grafana. My code in Grafana is as follows:
SELECT
$__time(distinct on(t_ID.start_time)),
concat('Forecast')::text as metric,
t_ID.value
FROM
forecast_table t_ID
WHERE
$__timeFilter(t_ID.start_time)
and t_ID.start_time >= (current_timestamp - interval '30 minute')
and t_ID.dump_date >= (current_timestamp - interval '30 minute')
ORDER BY
t_ID.start_time asc,
t_ID.dump_date desc
This is not working, however, since I get the message 'syntax error at or near AS'. What should I do?
You are using Grafana macro $__time, so your query in the editor:
SELECT
$__time(distinct on(t_ID.start_time)),
generates SQL:
SELECT
distinct on(t_ID.start_time AS "time"),
which is incorrect SQL syntax.
I wouldn't use the macro here. I would write correct SQL directly, e.g.
SELECT
DISTINCT ON (t_ID.start_time) t_ID.start_time AS "time",
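A complete version of the query might look like the sketch below, keeping the question's table and column names and the $__timeFilter macro, which was not the problem. Note that DISTINCT ON requires the ORDER BY list to start with the same expression:
SELECT
DISTINCT ON (t_ID.start_time) t_ID.start_time AS "time",
concat('Forecast')::text AS metric,
t_ID.value
FROM
forecast_table t_ID
WHERE
$__timeFilter(t_ID.start_time)
and t_ID.start_time >= (current_timestamp - interval '30 minute')
and t_ID.dump_date >= (current_timestamp - interval '30 minute')
ORDER BY
t_ID.start_time asc,
t_ID.dump_date desc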
Also use Grafana's Generated SQL and Query inspector features for debugging and query development. Make sure that Grafana generates correct SQL for Postgres.

Converting a TEXT to dateColumn in Grafana

New to Grafana.
I have set up Postgres as a data source and am trying to create a sample time series dashboard like so...
SELECT
$__timeGroupAlias(UNIX_TIMESTAMP(start_time),$__interval),
count(events) AS "events"
FROM source_table
WHERE
$__timeFilter(UNIX_TIMESTAMP(start_time))
GROUP BY 1
ORDER BY 1
The problem is that in my Postgres table the start_time column is of type TEXT, and this throws a
macro __timeGroup needs time column and interval and optional fill value
error on the Grafana side.
Can someone explain how my start_time can be properly converted to a date column so that the macros work?
Thank you
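One possible fix, sketched under the assumption that start_time holds ISO-formatted text such as '2021-05-01 12:00:00' (note that UNIX_TIMESTAMP is a MySQL function and does not exist in Postgres): cast the text to a timestamp inside the macros, since the Grafana macros substitute text and accept a simple expression, or convert the column type once with ALTER TABLE:
-- cast the TEXT column inline
SELECT
$__timeGroupAlias(start_time::timestamptz, $__interval),
count(events) AS "events"
FROM source_table
WHERE
$__timeFilter(start_time::timestamptz)
GROUP BY 1
ORDER BY 1
-- or convert the column once, if every value parses cleanly
ALTER TABLE source_table
ALTER COLUMN start_time TYPE timestamptz USING start_time::timestamptz;
For non-ISO formats, to_timestamp(start_time, 'YYYY-MM-DD HH24:MI:SS') with the appropriate format string can replace the cast.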

SQL Timestamp offset (Postgres)

I want to copy records from one database to another using Pentaho. But I ran into a problem. Let's say there is a transaction_timestamp column in Table1 with data type timestamp with time zone. But once I select records from the source DB and insert them into the other database, the values of the column are offset by one hour or so. The weirdest thing: this doesn't even affect all records. I also tried something like this:
select
transaction_timestamp::timestamp without time zone as transaction_timestamp,
t1.* from
table1 t1
And it didn't work. Could the problem be that when I copy the records to the second DB, it sets all values to the local time zone? But why then doesn't the select statement I mentioned work? And why is only a part of the records affected?
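For reference, a minimal sketch of the usual cause; the table, values, and time zone here are illustrative, not taken from the question. timestamptz values are stored in UTC and rendered in the session's TimeZone setting, so connections with different settings display different wall-clock values, and daylight saving time makes the apparent offset differ between records:
-- illustrative demo table
CREATE TABLE tz_demo (ts timestamptz);
INSERT INTO tz_demo VALUES
('2021-01-15 12:00:00+00'),  -- winter timestamp
('2021-07-15 12:00:00+00');  -- summer timestamp, DST in effect
SET TIME ZONE 'UTC';
SELECT ts FROM tz_demo;  -- both rows shown at 12:00:00+00
SET TIME ZONE 'Europe/Berlin';
SELECT ts FROM tz_demo;  -- 13:00:00+01 and 14:00:00+02, so the shift differs per row
If the two connections in the copy job use different TimeZone settings (for a Java-based tool like Pentaho, possibly the JVM's default zone), that would explain both the roughly one-hour shift and why only part of the records appear affected.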