Converting epoch time to SQL datetime format. I am trying to convert about 35,000 records received from another table (in a different database) with an epoch timestamp into a new table using the SQL datetime format. I will also need this to run on a daily basis, so a one-time conversion works, but I am open to other suggestions.
I have never worked with epoch values and I am not sure how to go about it.
Later on I also want to use the field in SSRS with the correct datetime, so I am not sure whether I should convert it before the transfer to the new table or do it in SSRS.
Any ideas?
Presuming that the epoch timestamp you have is in seconds:
DATEADD(SECOND, epoch_col, '19700101')
This will add the epoch seconds to the start of 'epoch time' (01-01-1970 00:00:00) and give you a DATETIME.
Example with output:
SELECT DATEADD(SECOND, 1571994774, '19700101')
2019-10-25 09:12:54.000
If you have an epoch timestamp in milliseconds just use this variation:
DATEADD(MILLISECOND, epoch_col, '19700101')
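One caveat, based on SQL Server's documented DATEADD limits rather than anything in your question: the number argument must fit in an INT, so a millisecond epoch stored as BIGINT (around 1.5 trillion for current dates) will overflow. A hedged workaround, with epoch_ms as a hypothetical BIGINT column, is to split the value:
DATEADD(MILLISECOND, epoch_ms % 1000, DATEADD(SECOND, epoch_ms / 1000, '19700101'))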
In terms of your other question about when to convert the value: my view is that it would be preferable to store the value in a DATETIME column at the point of insertion rather than storing the epoch value and converting it on every use.
This is just an opinion though and not a recommendation.
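For the daily load itself, a minimal sketch of the insert into the new table could look like the following; the names (SourceDb, dbo.EpochSource, dbo.EventsConverted, epoch_col) are hypothetical placeholders, and it assumes a seconds-based epoch:
INSERT INTO dbo.EventsConverted (Id, EventTime)
SELECT s.Id,
       DATEADD(SECOND, s.epoch_col, '19700101') AS EventTime  -- convert at load time
FROM   SourceDb.dbo.EpochSource AS s;
Run it from whatever daily job you already use (for example a SQL Agent job or an SSIS package).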
Related
I exported data from an SQLite table to a CSV file. The data includes a timestamp with at least one-minute resolution: "2019-11-15 01:30:06". The data is actually stored as a Julian date, in this case 2458802.35424295. I imported the data into a double-precision field. I need to convert that number into a timestamp with time zone. I tried casting the double-precision number to text and then using to_timestamp(), but that appears to work only with integer days. I can get a timestamp, but it is always at midnight of the correct date. I tried using to_timestamp() passing in my number, but that returns an epoch (number of milliseconds since 1/1/1970).
I could try to take the fractional part of my Julian date value, calculate the number of milliseconds since midnight that represents, use the to_timestamp(text,text) method to get the date I need, and then add the epoch since midnight to that date. But that's awfully cumbersome. Isn't there a better way?
I'm using PostgreSQL 9.3.
NOTE: The simple answer to my problem, which occurred to me just before I clicked the Post button, is to export the data in the form I want, using SQLite's datetime() function to convert the number to a date string during export. But I remain curious. I would have thought there would be a standard way to do this conversion.
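For the record, the usual shortcut relies on the fact that the Unix epoch (1970-01-01 00:00:00 UTC) falls on Julian day 2440587.5, so the whole conversion fits in one expression. A sketch, with julian_col and my_table as hypothetical names:
SELECT to_timestamp((julian_col - 2440587.5) * 86400.0) AS ts_with_tz
FROM   my_table;  -- to_timestamp(double precision) returns timestamp with time zone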
I'm trying to convert an epoch timestamp to a date in Pentaho Spoon. I use a text file input step to extract fields, and I want to load the fields into a database, but there is a timestamp field that contains epoch values like "1480017396". The datatype is set to Integer and the field is named timestamp. I want to convert it with a Select values step.
So in the next step I use Select values to pick the field and change the datatype to Date with a format of dd/MM/yyyy, but the result gives me all kinds of dates in the 18-01-1970 range. I have tried everything (different formats etc.) but I just can't seem to solve it.
Any guesses?
Pentaho treats the incoming integer as an epoch in milliseconds, not seconds, so take your number, multiply it by 1000, and then convert it to a Date.
Notice that with the raw value the date collapses back to early 1970; multiplied by 1000 it becomes a proper millisecond timestamp and you get the correct date.
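To make that concrete with the value from the question: 1480017396 read as milliseconds is only about 17 days after the epoch, which lands in the 18-01-1970 range you are seeing, while 1480017396000 milliseconds (the original value times 1000) corresponds to 24-11-2016, the date the seconds-based timestamp actually represents.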
In my current project I need to work with data from a certain logger. Unfortunately, I have run into a problem with how this logger encodes dates. I believed it was a standard timestamp format, but it is not. Can you tell me what format this value is in?
Real date:
04-Jan-2018 16:43:16
Date format:
568399396
That date/time format is called epoch: the date is stored as an integer giving the number of seconds (or milliseconds) since 1 January 1970, 00:00:00 UTC. This is how most computers and devices store date/time internally.
You can use any number of libraries to convert an epoch value to a date/time.
You can read more about epoch here:
https://en.wikipedia.org/wiki/Unix_time
P.S.: Are you sure that you're getting this number for 4th Jan 2018? The epoch value for 04-Jan-2018 16:43:16 would be 1515064396000 (in ms).
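Follow-up worth checking: 568399396 does not match a Unix epoch for that date, but interpreted as seconds since 2000-01-01 00:00:00 it works out to exactly 6578 days plus 60196 seconds, i.e. 04-Jan-2018 16:43:16, so the logger may be using a 2000-based epoch rather than the 1970-based one.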
The data has an Hour field (String datatype) holding an epoch timestamp in milliseconds. This is working:
DATEADD('second',INT(INT([Hour])/1000),DATETIME('1970-01-01'))
However, this is NOT working:
DATEADD('hour',-7,(Date("1/1/1970") + (INT(INT([Hour])/(1000*86400))))
The above returns NULL. The -7 is to adjust for my time zone.
I was able to get it done; it may help someone. I changed the second formula to this:
DATEADD('hour',0,(DATETIME("1970-01-01") + INT((INT([Hour])/(86400*1000)))))
For reference, an extract from Tableau's own knowledge base:
To convert the field to UTC time, use the following calculation:
DATEADD('second', [Unix time field], #1970-01-01#)
To convert the field in Unix time to a different time zone, use the following calculation:
DATEADD('minute', INT([Unix time field]/60 + <UTC offset in minutes>), #1970-01-01#)
For example, to convert the field in Unix time to India Standard Time
(IST), use the following calculation:
DATEADD('minute', INT([Unix time field]/60 + 330), #1970-01-01#)
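If the field holds milliseconds rather than seconds (as in the question above), the same pattern applies after dividing by 1000; a hedged variant of the knowledge base formula:
DATEADD('second', INT([Unix time field]/1000), #1970-01-01#)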
I'd like to import some data into a Redshift database using COPY. For reasons passing understanding one of the columns in the data is a timestamp that's given in seconds since 2000-01-01 00:00:00. Is there any way to turn these into proper timestamps on import?
Unfortunately, you cannot transform data during a Redshift COPY load. I think you will have to stage these rows in a load table and then do the transform in the INSERT into the final table.
Worth noting though that you could do this if they had used the standard Unix epoch (seconds since 1970-01-01 00:00:00) by adding TIMEFORMAT 'epochsecs' to your COPY.
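A rough sketch of that staging approach, with all names (staging_events, events, raw_ts, the S3 path, the IAM role) as hypothetical placeholders, assuming the raw value really is whole seconds since 2000-01-01:
CREATE TEMP TABLE staging_events (id BIGINT, raw_ts BIGINT);

COPY staging_events FROM 's3://my-bucket/events.csv'  -- placeholder path
IAM_ROLE 'arn:aws:iam::111111111111:role/my-redshift-role'  -- placeholder role
CSV;

INSERT INTO events (id, event_time)
SELECT id,
       DATEADD(second, raw_ts::INT, TIMESTAMP '2000-01-01 00:00:00')  -- shift from the 2000-01-01 base
FROM   staging_events;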