I am using the PostgreSQL function to_timestamp(double precision) to convert from epoch time to a normal timestamp, but I am facing a problem where the resulting timestamp is incorrect.
SELECT to_timestamp(1428058548491);
produces "47223-05-17 12:08:11.000064+02"
while it should be 4/3/2015, 12:55:48 PM GMT+2:00 DST
SELECT to_timestamp(1428058557697);
produces "47223-05-17 14:41:36.999936+02"
while it should be 4/3/2015, 12:55:57
As can be seen, the dates have been converted completely incorrectly.
As explained in the comments, for anyone who runs into the same problem: to_timestamp() expects the value in seconds, not milliseconds, so dividing the value by 1000 is the solution.
Quote from the manual:
it accepts a double precision argument and converts from Unix epoch (seconds since 1970-01-01 00:00:00+00) to timestamp with time zone
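As a quick cross-check outside the database, here is the same conversion sketched in Python, dividing the millisecond value from the question by 1000 before interpreting it as a Unix timestamp:

```python
from datetime import datetime, timezone

ms = 1428058548491  # the millisecond value from the question

# to_timestamp() wants seconds, so divide by 1000 first
dt = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
print(dt)  # 2015-04-03 10:55:48 UTC, i.e. 12:55:48 at GMT+2
```

This matches the expected 4/3/2015, 12:55:48 PM GMT+2:00, confirming the seconds-vs-milliseconds diagnosis.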
Related
I have a BIGINT value which represents a UNIX timestamp (epoch). How can I convert it to the built-in TIMESTAMP type?
For example, I want to turn 1611140400 into the related date and time. TIMESTAMP_FORMAT does not work.
You can use datetime arithmetic in Db2 and Db2 on Cloud. For Db2 on Cloud (which runs in UTC):
VALUES (TIMESTAMP('1970-01-01') + 1611140400 seconds)
Epoch is seconds since January 1st, 1970 GMT / UTC. Thus, adding your number as seconds to that date will give:
2021-01-20 11:00:00.0
If you are running in a different timezone, you need to take care of it, e.g.:
VALUES (TIMESTAMP('1970-01-01-00.00.00.000000') + 1611140400 seconds + current timezone)
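The same arithmetic can be sketched in plain Python as a sanity check (no Db2 required; this mirrors adding the seconds to the epoch date):

```python
from datetime import datetime, timedelta

epoch = datetime(1970, 1, 1)  # Unix epoch, as in TIMESTAMP('1970-01-01')
ts = epoch + timedelta(seconds=1611140400)
print(ts)  # 2021-01-20 11:00:00
```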
I'm trying to parse the following UTC timestamps in kdb
t:1605083972601 1605083972853 1605083972854 1605083972860 1605083972865
such that it resolves at least to the millisecond. I have tried to do the following:
`datetime$("P"$10$string[`long$t])
`datetime$("P"$10$string[`long$(t*1000)])
which both return:
2020.11.11T08:39:32.000 2020.11.11T08:39:32.000 2020.11.11T08:39:32.000 2020.11.11T08:39:32.000
Obviously having it round to the second is inadequate.
How does one effectively achieve this in kdb?
Thanks
The following function converts a Unix millisecond timestamp to a kdb+ timestamp:
{`timestamp$(1000000*x)+`long$1970.01.01D0-2000.01.01D0}
kdb+ timestamps count from 1 Jan 2000, which is why the Unix timestamp has to be adjusted by the difference between 1970.01.01 and 2000.01.01.
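As a sanity check of that offset, here is the same epoch difference computed in Python (a sketch; the q lambda applies the equivalent offset in nanoseconds):

```python
from datetime import datetime

# the adjustment the q function applies: 2000.01.01 minus 1970.01.01
offset = datetime(2000, 1, 1) - datetime(1970, 1, 1)
print(offset.days)                  # 10957 days
print(int(offset.total_seconds()))  # 946684800 seconds
```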
Another approach, which makes it explicit that what you have is a number of milliseconds since the Unix epoch:
q)f:1970.01.01+0D00:00:00.001*
q)f 1605083972601 1605083972853 1605083972854 1605083972860 1605083972865
2020.11.11D08:39:32.601000000 2020.11.11D08:39:32.853000000 2020.11.11D08:39:..
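The same conversion can be sketched in Python to verify the expected millisecond values (a cross-check, mirroring 1970.01.01 + one millisecond times x):

```python
from datetime import datetime, timedelta, timezone

def from_unix_ms(ms):
    # 1970.01.01 + 0D00:00:00.001 * x, as in the q definition above
    return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(milliseconds=ms)

print(from_unix_ms(1605083972601))  # 2020-11-11 08:39:32.601000+00:00
```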
Converting epoch time to SQL datetime format. I am trying to convert the 35,000 records received from another table (in another database) with an epoch timestamp into a new table with the SQL datetime format. I will also need this updated on a daily basis, so a one-time conversion is good, but I am open to other suggestions.
I never worked with epoch and I am not sure how to go about it.
Later on I also want to use it in SSRS with a correct datetime, so I am not sure whether I should convert it before the transfer to the new table or do it in SSRS.
Any ideas?
Presuming that the epoch timestamp you have is in seconds:
DATEADD(SECOND, epoch_col, '19700101')
This will add the epoch seconds to the start of 'epoch time' (01-01-1970 00:00:00) and give you a DATETIME.
Example with output:
SELECT DATEADD(SECOND, 1571994774, '19700101')
2019-10-25 09:12:54.000
If you have an epoch timestamp in milliseconds, beware that the number argument of DATEADD is an INT, and a millisecond epoch value overflows that range. Split the value into whole seconds plus a millisecond remainder instead:
DATEADD(MILLISECOND, epoch_col % 1000, DATEADD(SECOND, epoch_col / 1000, '19700101'))
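To verify the arithmetic for both the seconds and the milliseconds case, here is the same computation sketched in Python (`base` stands in for the '19700101' anchor; the literals are the example value from above):

```python
from datetime import datetime, timedelta

base = datetime(1970, 1, 1)  # '19700101' in the DATEADD call

# seconds variant: DATEADD(SECOND, 1571994774, '19700101')
sec_result = base + timedelta(seconds=1571994774)
print(sec_result)  # 2019-10-25 09:12:54

# milliseconds variant with the same instant expressed in ms
ms_result = base + timedelta(milliseconds=1571994774000)
print(ms_result)   # same instant
```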
As for your other question about when to convert the value: my view is that it would be preferable to store the value in a DATETIME column at the point of insertion rather than storing the epoch value and converting it upon use.
This is just an opinion though and not a recommendation.
I need to change the following SQL query to the Postgres format. How can I do that?
eg:
round((TIME_TO_SEC(testruntest.endtime) - TIME_TO_SEC(testruntest.starttime))/60,2)
I tried this query and got an error saying "time_to_sec" is not a supported function.
Use the SQL standard EXTRACT function:
EXTRACT(epoch FROM testruntest.endtime)
The documentation describes:
For timestamp with time zone values, the number of seconds since 1970-01-01 00:00:00 UTC (can be negative); for date and timestamp values, the number of seconds since 1970-01-01 00:00:00 local time; for interval values, the total number of seconds in the interval
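To illustrate what the EXTRACT-based rewrite computes, here is a small Python sketch of the same rounded minute difference (the two timestamps are made-up example values, not from the question):

```python
from datetime import datetime

start = datetime(2021, 1, 20, 11, 0, 0)    # hypothetical starttime
end   = datetime(2021, 1, 20, 11, 34, 30)  # hypothetical endtime

# (EXTRACT(epoch FROM endtime) - EXTRACT(epoch FROM starttime)) / 60, rounded to 2 places
minutes = round((end - start).total_seconds() / 60, 2)
print(minutes)  # 34.5
```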
The data has an Hour field (String datatype). The timestamp is in milliseconds. This is working -
DATEADD('second',INT(INT([Hour])/1000),DATETIME('1970-01-01'))
However, this is NOT WORKING -
DATEADD('hour',-7,(Date("1/1/1970") + (INT(INT([Hour])/(1000*86400)))))
The above is returning NULL. -7 is to adjust for my Timezone.
I was able to get it done. It may help someone.
Changed the second one to this -
DATEADD('hour',0,(DATETIME("1970-01-01") + INT((INT([Hour])/(86400*1000)))))
As a reference, here is an extract from Tableau's own knowledge base:
To convert the field to UTC time, use the following calculation:
DATEADD('second', [Unix time field], #1970-01-01#)
To convert the field in Unix time to a different time zone, use the following calculation, filling in the time-zone offset in minutes:
DATEADD('minute', INT([Unix time field]/60 + <offset in minutes>), #1970-01-01#)
For example, to convert the field in Unix time to India Standard Time (IST), use the following calculation:
DATEADD('minute', INT([Unix time field]/60 + 330), #1970-01-01#)
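The knowledge-base formula can be sketched in Python for verification (the Unix value here is a made-up example; 330 is the IST offset in minutes, as above):

```python
from datetime import datetime, timedelta

unix = 1611140400  # hypothetical [Unix time field] value

# DATEADD('minute', INT(unix/60 + 330), #1970-01-01#) for IST (+05:30)
ist = datetime(1970, 1, 1) + timedelta(minutes=unix // 60 + 330)
print(ist)  # 2021-01-20 16:30:00, i.e. 11:00 UTC shifted by +5:30
```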