How to convert BIGINT Timestamp to a Datetime in dbeaver - amazon-redshift

I connected my Redshift cluster to DBeaver and, while running SELECT * on my table, the time column returned 1541079087394, which is a bigint. How should I run the query in order to get a timestamp with date and time like in Kibana - May 20th 2019, 10:49:16.949?

Time columns are not bigint; however, you can probably convert that integer to a timestamp with something like this:
select timestamp 'epoch' + your_bigint_col/1000 * interval '1 second' AS your_column_alias
from your_table
This assumes that your bigint holds epoch milliseconds (hence the division by 1000) - you didn't say.
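Applied to the value from the question, a minimal sketch assuming 1541079087394 is epoch milliseconds (UTC):
select timestamp 'epoch' + 1541079087394/1000 * interval '1 second' AS converted;
-- 2018-11-01 13:31:27 (integer division drops the millisecond part)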

Related

Can't extract date from milliseconds epoch postgresql

I'm querying the database (RedShift) and I have a piece of information stored in epoch MS format. Envision a table along the lines of:
Purchase, date
1, 1620140227019
2, 1620140227045
3, 1620140226573
I need to convert the timestamp to a readable date but I can't make it work with to_timestamp() or extract(). The first problem is the size of the value (13 digits are not supported).
The closest solution I have is
select to_timestamp(1620140226573/1000, 'SS')
But the result is 0051-05-04 14:57:06. In other words, the month, day and seconds are correct but the year is wrong.
You can run this query
select to_timestamp(round(1620140227254/1000))
The solution was in the documentation: https://docs.aws.amazon.com/redshift/latest/dg/r_Dateparts_for_datetime_functions.html
SELECT timestamp with time zone 'epoch' + 1620140227019/1000 * interval '1 second' AS converted_timestamp
or
select '1970-01-01'::date + 1620140227019/1000 * interval '1 second'
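To convert the whole column rather than a single literal, a minimal sketch (the table name purchases is assumed; purchase and date are the column names from the example, with date quoted to be safe):
select purchase,
       timestamp 'epoch' + "date"/1000 * interval '1 second' as purchase_ts
from purchases;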

Column of type "timestamp(6) with timezone" and "current time" difference in minutes

I have an Oracle 12c DB table and one of its columns, utc_timestamp, is of type
UTC_TIMESTAMP TIMESTAMP(6) WITH TIME ZONE
It stores the timestamp in UTC, while current_timestamp and systimestamp both give timestamps in different time zones.
How can I get the time difference between MAX(utc_timestamp) and current_timestamp in minutes, ignoring the offset due to different time zones?
For example:
select current_timestamp from dual;
Gives=> 23-AUG-17 04.43.16.253931000 PM AMERICA/CHICAGO
select systimestamp from dual;
Gives=> 23-AUG-17 05.43.16.253925000 PM -04:00
select max(UTC_TIMESTAMP) from table_name;
Gives=> 23-AUG-17 09.40.02.000000000 PM +00:00
For the above values, when I run SQL to check the time difference between MAX(utc_timestamp) and current_timestamp, I should get the number 3.
I think I need something like:
select (extract(minute from current_timestamp) - extract(minute from max(UTC_TIMESTAMP)) * 1440) AS minutesBetween from table_name;
But the different time zones are messing it up and I get a negative number like -4317. This might be correct, as current_timestamp will be higher than max(utc_timestamp) since it is in CST. So I tried:
select (extract(minute from CAST(current_timestamp as TIMESTAMP(6) WITH TIME ZONE)) - extract(minute from max(UTC_TIMESTAMP)) * 1440) AS minutesBetween from table_name;
This SQL runs without error but produces a big negative number like -83461. Please help me find what I am doing wrong.
You really have two problems here.
One is to convert CURRENT_TIMESTAMP to UTC. That is trivial:
select CURRENT_TIMESTAMP AT TIME ZONE 'UTC' from dual [.....]
(use the AT TIME ZONE clause https://docs.oracle.com/cd/B19306_01/server.102/b14225/ch4datetime.htm#i1007699)
The other is that the difference between two timestamps is an interval, not a number.
select current_timestamp at time zone 'UTC'
- to_timestamp_tz('24-AUG-17 04.00.00.000 AM UTC', 'dd-MON-yy hh.mi.ss.ff AM TZR')
from dual;
produces something like
+00 00:02:39.366000
which means + (positive difference) 00 days, 00 hours, 02 minutes, 39.366 seconds.
If you just want the minutes (always rounded down), you may wrap this whole expression within extract( minute from < ...... > ). Be aware though that the answer will still be 2 (minutes) even if the difference is five hours and two minutes. It is probably best to leave the result in interval data type, unless you are 100% sure (or more) that the result is always less than 1 hour.
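If you actually want the total number of minutes rather than just the minute component, a minimal sketch using the question's names (table_name, utc_timestamp) that converts the whole interval to minutes:
select extract(day from diff) * 1440
     + extract(hour from diff) * 60
     + extract(minute from diff) as minutes_between
from (
  -- subtracting two TIMESTAMP WITH TIME ZONE values yields an INTERVAL DAY TO SECOND
  select current_timestamp at time zone 'UTC' - max(utc_timestamp) as diff
  from table_name
);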

Choosing between INTEGER and TIMESTAMP data type

For the purpose of persisting date & time values, I can naturally use the TIMESTAMP data type in Redshift. However, I am leaning towards just relying on INTEGER timestamps: I can use INTEGER instead of the native TIMESTAMP data type and easily convert back, for example with SELECT TIMESTAMP 'epoch' + intTime * INTERVAL '1 second' FROM table_name;. Since I only need resolution to the second, INTEGER should be alright.
One advantage: the INTEGER data type is 4 bytes while the TIMESTAMP data type is 8 bytes, so I suppose I can save some disk space and I/O time. Secondly, I want to keep Redshift in sync with my DynamoDB database, since I use the Number data type for date & time in DynamoDB.
In Redshift, the date & time field (intTime) is going to be part of the compound sort key, and I will rely heavily on filtering, sorting and grouping based on date and time. Will there be any performance penalties if I use INTEGER instead of TIMESTAMP for sorting, filtering and grouping, given that intTime will mostly need to be wrapped in an EXTRACT function even though it is part of the compound sort key? Please look at the query below for reference; a sketch of the same filter written directly against the integer column follows it.
SELECT EXTRACT(doy from TIMESTAMP 'epoch' + intTime * INTERVAL '1 second'), count(*) AS count
FROM table_name
WHERE TIMESTAMP 'epoch' + intTime * INTERVAL '1 second' BETWEEN '2016-12-15 00:00:00' AND '2016-12-15 23:59:59'
AND EXTRACT(dow FROM TIMESTAMP 'epoch' + intTime * INTERVAL '1 second') IN (5, 6)
GROUP BY EXTRACT(doy FROM TIMESTAMP 'epoch' + intTime * INTERVAL '1 second')
ORDER BY count DESC;
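For comparison, a minimal sketch of the same date-range filter expressed directly on the integer column, so the sort key column is not wrapped in an expression (whether the optimizer actually benefits from this is exactly what the question asks; EXTRACT(epoch ...) is used only to compute the epoch-second bounds):
SELECT EXTRACT(doy FROM TIMESTAMP 'epoch' + intTime * INTERVAL '1 second') AS doy, count(*) AS count
FROM table_name
-- compare the raw integer column against epoch-second bounds instead of converting every row
WHERE intTime BETWEEN EXTRACT(epoch FROM TIMESTAMP '2016-12-15 00:00:00')
                  AND EXTRACT(epoch FROM TIMESTAMP '2016-12-15 23:59:59')
  AND EXTRACT(dow FROM TIMESTAMP 'epoch' + intTime * INTERVAL '1 second') IN (5, 6)
GROUP BY 1
ORDER BY count DESC;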

Convert bigint data type to timestamp (and subsequently to date) in redshift

I need to convert the value stored in a bigint column to a date field. The first step of the conversion involves converting it to a timestamp, and subsequently using the TRUNC function to convert this column to a date value.
However, my query is failing while converting the bigint value to timestamp.
The error that I'm getting is:
Amazon Invalid operation: cannot cast type bigint to timestamp without time zone;
The query I'm trying for now is something like this:
select ts::timestamp from events limit 1;
I was able to avoid the time zone error by using the method described in this thread: https://stackoverflow.com/a/36399361
My dates are based on epochs, and I was able to do the following:
SELECT
(TIMESTAMP 'epoch' + contract_start_date * INTERVAL '1 Second ')
FROM
table_name
SELECT TIMESTAMP 'epoch' + {column of bigint}/1000 * INTERVAL '1 second' as adate FROM tbl
If you are starting with a POSIX timestamp and trying to get a timezone-aware datetime value, you will need to supply a time zone, even if you later want to truncate the time part away. I'm not familiar with Redshift, but perhaps there is a way to specify that you mean UTC.
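Tying the answers back to the original question, a minimal sketch that goes all the way to a date via TRUNC (events and ts are the names from the question; ts is assumed to hold epoch seconds - divide by 1000 first if it holds milliseconds):
select trunc(timestamp 'epoch' + ts * interval '1 second') as event_date
from events
limit 1;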

Postgres timestamp to unix time in milliseconds as a bigint

How can I get the following snippet to work in postgres:
ALTER TABLE mytable
ADD COLUMN create_time_utc bigint not null
DEFAULT (now() at time zone 'utc');
I want the new column create_time_utc to be the Unix time in milliseconds (i.e. the number of milliseconds since the Unix epoch, January 1 1970).
I know I need to convert the postgres timestamp to a bigint, but I'm not sure how to do that.
Use extract(epoch from ...) to get the Unix time in seconds, then multiply by 1000 for milliseconds:
alter table mytable
add column create_time_utc bigint not null
default (extract(epoch from now()) * 1000);
http://www.postgresql.org/docs/current/static/functions-datetime.html#FUNCTIONS-DATETIME-EXTRACT
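As a quick check of what the default expression produces, a minimal sketch (the ::bigint cast is only added here to show an integer result; the example output is illustrative, not real):
-- milliseconds since 1970-01-01 00:00:00 UTC
select (extract(epoch from now()) * 1000)::bigint AS create_time_utc;
-- e.g. 1620140227019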