Discarding milliseconds from the calculated average time - PostgreSQL

Here is my code:
SELECT cluster_id, AVG(min_per_imei_cluster::TIME) as avg_time
FROM arrival
GROUP BY cluster_id
The avg_time column gives values with milliseconds and beyond. How do I truncate or round to the nearest second?
I tried using
AVG(min_per_imei_cluster::TIME)::timestamp(0)
but got the following error:
SQL Error [42846]: ERROR: cannot cast type interval to timestamp without time zone

It looks like you can use the date_trunc function to specify the level of precision that you want. Since AVG over a time value returns an interval, apply it to the aggregate: date_trunc('second', AVG(min_per_imei_cluster::TIME)).
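A minimal sketch of the full query, assuming min_per_imei_cluster casts cleanly to TIME as in the original statement:
SELECT cluster_id,
       date_trunc('second', AVG(min_per_imei_cluster::TIME)) AS avg_time
FROM arrival
GROUP BY cluster_id;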

Related

How to declare interval infinity in postgres?

It seems that we can't currently cast infinity to an interval.
While trying:
SELECT 'infinity'::interval;
we get
SQL Error [22007]: ERROR: invalid input syntax for type interval: "infinity"
How could I specify a max value for an interval?
I tried comparing 2 infinity timestamps
SELECT ('-infinity'::timestamp + '1 day'::INTERVAL)::timestamp without time zone at time zone 'UTC'
- 'infinity'::timestamp without time zone at time zone 'UTC';
but now getting
SQL Error [22008]: ERROR: cannot subtract infinite timestamps
Any idea?
Yes, that is true. You cannot represent infinite intervals with the interval data type.
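A common workaround is to let NULL (or an arbitrarily large sentinel such as interval '1000000 years') stand in for "no limit" and handle it explicitly in comparisons. A small sketch, with tasks, max_duration, and elapsed as purely hypothetical names:
-- NULL in max_duration means "no upper bound" (hypothetical table and columns)
SELECT *
FROM tasks
WHERE max_duration IS NULL
   OR elapsed <= max_duration;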

Find days between dates

I am looking to subtract 2 dates to get the number of days in between. However, the columns are defined as "timestamp without time zone". I'm not sure how to get a whole integer.
I have a stored procedure with this code:
v_days_between_changes := DATE_PATH('day',CURRENT_DATE - v_prev_rec.date_updated)::integer;
But I get this error:
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
QUERY: SELECT DATE_PATH('day',CURRENT_DATE - v_prev_rec.date_updated)::integer
Any help would be great.
You can compute the difference between the dates, which returns an interval. Then, extract the number of days from this interval.
WITH src(dt1, dt2) AS (
  SELECT '1/1/2019'::timestamp without time zone, CURRENT_DATE
)
SELECT EXTRACT(day FROM dt2 - dt1) AS nbdays
FROM src;
nbdays
-----------
98
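Applied to the stored procedure from the question, the same idea looks like the sketch below; note the function is spelled DATE_PART, not DATE_PATH, and EXTRACT works equally well:
v_days_between_changes := EXTRACT(day FROM CURRENT_DATE - v_prev_rec.date_updated)::integer;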

Getting error when converting timestamp column to time

I am getting the following error message when trying to convert a timestamp column to time.
Error message:
SQL Error [500310] [0A000]: [Amazon](500310) Invalid operation: Specified types or functions (one per INFO message) not supported on Redshift tables.
Here are the SQL snippets I have been using to convert the column, all of which give me the above error:
SELECT pg_catalog.time(timestamp) AS myTime from tableA.time
SELECT "time"(timestamp) AS myTime from tableA.time
SELECT to_char(cast(timestamp as "time")) from tableA.time
select timestamp::time from tableA.time
What has worked for me, but is not what I need since I have to run mathematical operations on the time, is the SQL script below.
SELECT
EXTRACT(HOUR FROM timestamp) AS HOUR,
EXTRACT(MINUTE FROM timestamp) AS MINUTE,
EXTRACT(SECOND FROM timestamp) AS SECOND
FROM tableA.time;
Is there any way I could check the properties of the column to see why it is not converting to time? I have tried simple statements like select '2016-10-01 12:12:12'::time, which work perfectly fine, so I am not sure why the whole column is not converting over. If anyone could provide me some help, that would be great!
Thank you in advance.
-EDIT-
I looked at the timestamp column and its type is "timestamp without time zone". Maybe that is what is causing the error.

Convert bigint data type to timestamp (and subsequently to date) in redshift

I need to convert the value stored in a bigint column to a date field. The first step of the conversion involves converting it to timestamp, and subsequently use the TRUNC method to convert this column to a date value.
However, my query is failing while converting the bigint value to timestamp.
The error that I'm getting is:
Amazon Invalid operation: cannot cast type bigint to timestamp without time zone;
The query I'm trying for now is something like this:
select ts::timestamp from events limit 1;
I was able to avoid the time zone error by using the method described in this thread: https://stackoverflow.com/a/36399361
My dates are based on epochs, and I was able to do the following:
SELECT
(TIMESTAMP 'epoch' + contract_start_date * INTERVAL '1 second')
FROM
table_name
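Wrapping this in TRUNC, as described in the question, then gives the date value; a sketch based on the same column:
SELECT TRUNC(TIMESTAMP 'epoch' + contract_start_date * INTERVAL '1 second') AS contract_start_dt
FROM table_name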
If the bigint column stores milliseconds since the epoch rather than seconds, divide by 1000 first:
SELECT TIMESTAMP 'epoch' + {column of bigint}/1000 * INTERVAL '1 second' AS adate FROM tbl
If you are starting with a POSIX timestamp and trying to get a timezone-aware datetime value, you will need to supply a timezone - even if you later want to truncate the time part away. I'm not familiar with Redshift, but perhaps there is a way to specify you mean UTC.
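In plain PostgreSQL syntax that would look something like the sketch below, reusing the ts column and events table from the question (untested on Redshift):
SELECT (TIMESTAMP WITH TIME ZONE 'epoch' + ts * INTERVAL '1 second') AT TIME ZONE 'UTC' AS event_ts
FROM events;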

Postgresql - Avg just minutes and seconds?

I'm running an aggregate query that returns just two columns, a name and timez.
timez is currently in the MI:SS format, stored as varchar. In my query, I want to get the average minutes and seconds, but of course if I cast the column to a timestamp, the avg() function doesn't work on it. I tried dividing the timestamps by count(1), which doesn't work either, as timestamps can't be divided. Below is what I ideally wanted:
SELECT name, avg(to_timestamp(timez,'MI:SS'))
FROM logs_table
GROUP BY name
Cast the timez column to a time value before applying avg. Because timez is stored as MI:SS, a plain cast to time would be read as hours and minutes, so parse it with to_timestamp first and then cast:
SELECT name, avg(to_timestamp(timez, 'MI:SS')::time)
FROM logs_table
GROUP BY name;
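If the result should be displayed back in the original MI:SS format, a small sketch (avg over a time value returns an interval, which to_char can format; this assumes the averages stay under an hour):
SELECT name, to_char(avg(to_timestamp(timez, 'MI:SS')::time), 'MI:SS') AS avg_timez
FROM logs_table
GROUP BY name;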