Redshift with Grafana - ERROR: This type of correlated subquery pattern is not supported due to internal error - amazon-redshift

I am facing an issue with a query run on Redshift from Grafana: when I add a specific WHERE clause, the above-mentioned error appears. Without this WHERE clause, the query runs fine. Also, if I put in a literal value, say where first_week > 2, no error occurs.
where clause:
WHERE first_week >= EXTRACT(YEAR FROM $__timeFrom AT TIME ZONE 'UTC')
from
(select a.distinct_id, a.login_week, b.first_week as first_week,
        a.login_week - b.first_week as week_number
 from (select distinct_id, EXTRACT(WEEK FROM timestamp AT TIME ZONE 'UTC') AS login_week
       from posthog_event
       where distinct_id IN (select distinct_id from activated_user)
       group by 1, 2) a,
      (select distinct_id, MIN(EXTRACT(WEEK FROM timestamp AT TIME ZONE 'UTC')) AS first_week
       from posthog_event
       where distinct_id IN (select distinct_id from activated_user)
       group by 1) b
 where a.distinct_id = b.distinct_id
) as with_week_number
where first_week >= EXTRACT(YEAR FROM '2022-03-01T17:01:18Z' AT TIME ZONE 'UTC')
group by first_week order by first_week
Any idea where I am going wrong? Or what could be done to get this WHERE clause working?

https://grafana.com/grafana/plugins/grafana-redshift-datasource/
$__timeFrom() outputs the current starting time of the panel's range, with quotes.
=> The Grafana macro is $__timeFrom(), not just $__timeFrom, so the correct condition is:
WHERE first_week >= EXTRACT(YEAR FROM $__timeFrom() AT TIME ZONE 'UTC')
As usual, the docs are your friend.
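For reference, a sketch of the filter from the question with the parentheses added; at query time Grafana replaces the macro with a quoted timestamp such as '2022-03-01T17:01:18Z', which is why the hard-coded literal version worked:
-- Corrected filter; $__timeFrom() expands to a quoted timestamp at query time
where first_week >= EXTRACT(YEAR FROM $__timeFrom() AT TIME ZONE 'UTC')
group by first_week order by first_week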

Related

Postgresql Newbie - Looking for insight

I am in an introduction to SQL class (using PostgreSQL) and struggling to take simple queries to the next step. I have a single table with two datetime columns (start_time and end_time) that I want to extract as two date-only columns. I figured out how to extract just the date from a datetime using the following:
Select start_time,
CAST(start_time as date) as Start_Date
from [table];
or
Select end_time,
CAST(end_time as date) as End_Date
from [table];
Problem: I can't figure out the next step to combine both of these queries into a single step. I tried using WHERE but I am still doing something wrong.
1st wrong example
SELECT start_time, end_time
From baywheels_2017
WHERE
CAST(start_time AS DATE) AS Start_Date
AND (CAST(end_time AS DATE) AS End_Date);
Any help is greatly appreciated. Thanks for taking the time to look.
You don't need to select the underlying field in order to later cast it; each field in the "select" clause is relatively independent. With the table created by:
CREATE TABLE test (
id SERIAL PRIMARY KEY,
start_time TIMESTAMP WITH TIME ZONE NOT NULL,
end_time TIMESTAMP WITH TIME ZONE NOT NULL
);
INSERT INTO test(start_time, end_time)
VALUES ('2022-10-31T12:30:00Z', '2022-12-31T23:59:59Z');
You could run the select:
SELECT
cast(start_time as date) as start_date,
cast(end_time as date) as end_date
FROM test;
(You can try this out on a website like DB-Fiddle.)
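And if you also want the raw timestamps alongside the date-only columns, it is just a variation of the same idea; list all four expressions in one SELECT:
-- Original timestamps plus their date-only versions in a single query
select start_time,
       cast(start_time as date) as start_date,
       end_time,
       cast(end_time as date) as end_date
from test;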

How to group by month off a timestamp field from Redshift in Superset

I am trying to show a monthly trend in Superset from a table that has a timestamp field called created_at, but I have no idea how to get it right.
The SQL query generated from this is the following:
SELECT
DATE_TRUNC('month', created_at) AT TIME ZONE 'UTC' AS __timestamp,
SUM(cost) AS "SUM(cost)"
FROM xxxx_from_redshift
WHERE created_at >= '2017-01-01 00:00:00'
AND created_at <= '2018-07-25 20:42:13'
GROUP BY DATE_TRUNC('month', created_at) AT TIME ZONE 'UTC'
ORDER BY "SUM(cost)" DESC
LIMIT 50000;
Like I mentioned above, I don't know how to make this work, and a second question is why ORDER BY uses SUM(cost). If this is a time series, shouldn't it use ORDER BY 1 instead? I tried to change Sort By, but to no avail.
It is quite silly, but I found that SUM(cost) doesn't work while sum(cost) does. It is a bug in Superset and will be addressed in https://github.com/apache/incubator-superset/pull/5487
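A sketch of the generated query with the lowercase aggregate and, per the ORDER BY question above, sorted by the truncated month instead of the sum (table and column names are the ones from the question):
SELECT
  DATE_TRUNC('month', created_at) AT TIME ZONE 'UTC' AS __timestamp,
  sum(cost) AS "sum(cost)"
FROM xxxx_from_redshift
WHERE created_at >= '2017-01-01 00:00:00'
  AND created_at <= '2018-07-25 20:42:13'
GROUP BY DATE_TRUNC('month', created_at) AT TIME ZONE 'UTC'
ORDER BY 1
LIMIT 50000;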

Column of type "timestamp(6) with timezone" and "current time" difference in minutes

I have an Oracle 12c DB table and one of its columns, utc_timestamp, is of type
UTC_TIMESTAMP TIMESTAMP(6) WITH TIME ZONE
It stores timestamps in UTC, while current_timestamp and systimestamp both give timestamps in different time zones.
How can I get the time difference between MAX(utc_timestamp) and current_timestamp in minutes, ignoring the difference due to the different time zones?
For example:
select current_timestamp from dual;
Gives=> 23-AUG-17 04.43.16.253931000 PM AMERICA/CHICAGO
select systimestamp from dual;
Gives=> 23-AUG-17 05.43.16.253925000 PM -04:00
select max(UTC_TIMESTAMP) from table_name;
Gives=> 23-AUG-17 09.40.02.000000000 PM +00:00
For the above values, when I run SQL to check the time difference between MAX(utc_timestamp) and current_timestamp, I should get the number 3.
I think I need something like:
select (extract(minute from current_timestamp) - extract(minute from max(UTC_TIMESTAMP)) * 1440) AS minutesBetween from table_name;
But the different time zones are messing it up and I get a negative number like -4317. This might be correct, as current_timestamp will be higher than max(utc_timestamp), being in CST. So I tried:
select (extract(minute from CAST(current_timestamp as TIMESTAMP(6) WITH TIME ZONE)) - extract(minute from max(UTC_TIMESTAMP)) * 1440) AS minutesBetween from table_name;
This SQL runs without error but produces a big negative number like -83461. Please help me find what I am doing wrong.
You really have two problems here.
One is to convert CURRENT_TIMESTAMP to UTC. That is trivial:
select CURRENT_TIMESTAMP AT TIME ZONE 'UTC' from dual [.....]
(use the AT TIME ZONE clause https://docs.oracle.com/cd/B19306_01/server.102/b14225/ch4datetime.htm#i1007699)
The other is that the difference between two timestamps is an interval, not a number.
select current_timestamp at time zone 'UTC'
- to_timestamp_tz('24-AUG-17 04.00.00.000 AM UTC', 'dd-MON-yy hh.mi.ss.ff AM TZR')
from dual;
produces something like
+00 00:02:39.366000
which means + (positive difference) 00 days, 00 hours, 02 minutes, 39.366 seconds.
If you just want the minutes (always rounded down), you may wrap this whole expression within extract( minute from < ...... > ). Be aware though that the answer will still be 2 (minutes) even if the difference is five hours and two minutes. It is probably best to leave the result in interval data type, unless you are 100% sure (or more) that the result is always less than 1 hour.
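If a plain number of minutes is really what is needed, one option (a sketch using the table and column names from the question) is to take the difference in UTC and sum up the interval's components:
-- Total minutes between MAX(utc_timestamp) and the current time, both in UTC;
-- the difference of two timestamps is an INTERVAL DAY TO SECOND.
select extract(day from diff) * 1440
     + extract(hour from diff) * 60
     + extract(minute from diff) as minutes_between
from (
  select current_timestamp at time zone 'UTC' - max(utc_timestamp) as diff
  from table_name
);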

How to Extract Year from DATE in POSTGRESQL

The date is in 'YYYY-MM-DD' text format, and I need to extract the year part as a number. I need this conversion to be done in a single step, since I will use it in another application where I cannot create a new variable.
With TO_DATE(t0.AESTDTC,'YYYY-MM-DD') I was able to convert it to a date, but now I need to extract the year from this date in a single step. Can anyone help me?
Try
select date_part('year', your_column) from your_table;
or
select extract(year from your_column) from your_table;
This line solved the same problem for me in PostgreSQL:
SELECT DATE_PART('year', column_name::date) from tableName;
If you want the month, simply replace year with month, and likewise for the other fields.
The answer is:
select date_part('year', timestamp '2001-02-16 20:38:40') as year,
date_part('month', timestamp '2001-02-16 20:38:40') as month,
date_part('day', timestamp '2001-02-16 20:38:40') as day,
date_part('hour', timestamp '2001-02-16 20:38:40') as hour,
date_part('minute', timestamp '2001-02-16 20:38:40') as minute
Choose one of the following, where :my_date is a string input parameter in yyyy-MM-dd format:
SELECT EXTRACT(YEAR FROM CAST(:my_date AS DATE));
or
SELECT DATE_PART('year', CAST(:my_date AS DATE));
It is better to use CAST than :: as there may be conflicts with input parameters.
You may try to_char(now()::date, 'yyyy').
If you have text, you have to cast it to a date first: to_char('2018-01-01'::date, 'yyyy')
See the PostgreSQL Documentation Data Type Formatting Functions
SELECT TO_CHAR(CURRENT_DATE, 'YYYY')
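As a small combined sketch (PostgreSQL, with an example literal standing in for the real column): EXTRACT and DATE_PART return a number, while TO_CHAR returns text, which matters for the "must be numeric" requirement in the question:
SELECT EXTRACT(YEAR FROM '2022-03-01'::date) AS year_numeric,  -- numeric result
       TO_CHAR('2022-03-01'::date, 'YYYY')   AS year_text;     -- text result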

Date comparison in postgresql not working

I have a problem with a query in my PostgreSQL database. I'm trying to get the last 100 logs after a certain date-time, but when I use this query:
select * from log_entry WHERE array['MESSAGE'] && tags AND CAST(last_updated AS DATE) >= '2013-02-28T16:47:26.394213' ORDER BY last_updated DESC LIMIT 100
Here the column last_updated is of type: timestamp without time zone
Sometimes I get logs from before '2013-02-28T16:47:26.394213', am I doing something wrong? Is there a better way to do this instead of using cast?
Thanks in advance!
(Reposted from a comment by request.)
Well, there's your problem: when you cast a timestamp (with or without a time zone) to a date, you truncate the time portion of the timestamp. Why not cast '2013-02-28T16:47:26.394213' to a timestamp and compare it directly with last_updated?
select * from log_entry
WHERE array['MESSAGE'] && tags
AND last_updated
>= '2013-02-28T16:47:26.394213'::timestamp without time zone
ORDER BY last_updated DESC LIMIT 100
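To see the truncation described above, here is a minimal check with the literal from the question; once the time of day is dropped, any row from 2013-02-28 passes a >= comparison against that date:
-- Casting to DATE drops the time portion
SELECT CAST(TIMESTAMP '2013-02-28 16:47:26.394213' AS DATE);  -- 2013-02-28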
Use this. This should work:
select * from log_entry
WHERE array['MESSAGE'] && tags
AND age(last_updated)
= age(now())
ORDER BY last_updated DESC LIMIT 100