Microsoft hex dates - PostgreSQL

I have the following date/time value from a Microsoft SQL Server database:
0x00009CEF00A25634
I found this post:
Help me translate long value, expressed in hex, back in to a date/time
It seemed to be on the right track, but using that code I didn't get the right dates. Are my hex dates in a different format? How would I convert them to a normal date? I am using PHP/PostgreSQL.

select CAST (0x00009CEF00A25634 as datetime) gives 2009-12-30 09:51:03.000
This is two integers: one for the date part, 0x00009CEF (decimal 40175), and one for the time part, 0x00A25634 (decimal 10638900). The date part is a signed integer giving the number of days since 1 Jan 1900. The time part is an integer giving the number of ticks since midnight.
There are 300 ticks in a second.
It can be seen that the following also returns the same result
SELECT DATEADD(MILLISECOND,10638900*10/3.0, DATEADD(DAY,40175, '19000101'))
You will need to figure out how to apply this to postgres.
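For reference, here is the same two-integer arithmetic as a small Python sketch (purely illustrative, not the Postgres solution itself):
from datetime import datetime, timedelta

raw = 0x00009CEF00A25634
days  = raw >> 32            # high 4 bytes: 0x00009CEF = 40175 days since 1900-01-01
ticks = raw & 0xFFFFFFFF     # low 4 bytes:  0x00A25634 = 10638900 ticks of 1/300 second
print(datetime(1900, 1, 1) + timedelta(days=days, seconds=ticks / 300))  # 2009-12-30 09:51:03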
Edit: an answer here apparently does this. I haven't tested it myself.

This worked for me while migrating from SQL Server to MySQL:
SELECT CAST('1900-01-01 00:00:00'
            + INTERVAL CAST(CONV(SUBSTR(HEX(0x0000A249004576D0), 1, 8), 16, 10) AS SIGNED) DAY
            + INTERVAL CAST(CONV(SUBSTR(HEX(0x0000A249004576D0), 9, 8), 16, 10) AS SIGNED) * 10000/3 MICROSECOND
       AS DATETIME) AS newdate

Related

how to keep datatype when subtracting day from date/time column in SAS

My question is really simple, I hope someone deigns to answer!
Being very new to SAS, dates and their formats are really confusing me.
I have a timestamp column from which I need to subtract 2 days while keeping its datatype.
The value of the column is "2022-04-20-19.37.57.714699"
What I need is "2022-04-18-19.37.57.714699"
When I try this I just get a number instead of a datetime:
PROC SQL;
CREATE TABLE my_table AS
SELECT
cust_id,query_date, (query_date)-2 as calc_date
FROM other_table
;quit;
I tried format and the datetime function, but ended up with "Statement is not valid or it is used out of proper order"
Thanks
Assuming that the QUERY_DATE variable is numeric and has datetime values in it (the number of seconds since 1960) then you can use the INTNX() function with the DTDAY interval to adjust the value by two days. To keep the same time of day use SAME for the alignment parameter.
intnx('dtday',query_date,-2,'same')
Alternatively you could just subtract 48 hours worth of seconds from the value.
query_date -2*'24:00:00't
If you want the values to display in a human-readable way then attach any of the many datetime formats, such as DATETIME, to the new variable.
CREATE TABLE my_table AS
SELECT cust_id,query_date
, intnx('dtday',query_date,-2,'same') as calc_date format=datetime20.
FROM other_table
;
If the variable is just a string then you cannot subtract from it directly. You will have to convert the strings into numbers to perform arithmetic. You probably have too many decimal places for SAS datetime informats/formats to replicate (and perhaps too many to be stored uniquely in a floating-point value), so just convert the date part and then append back the rest to keep the same time of day. Since dates are stored as the number of days, you can just subtract the 2 days using normal subtraction.
put(input(query_date,yymmdd10.)-2,yymmdd10.)||substr(query_date,11)
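The same split-shift-reattach idea, sketched in Python purely as an illustration of the string handling (not SAS code):
from datetime import datetime, timedelta

raw = "2022-04-20-19.37.57.714699"
# shift only the date part by two days, then re-attach the original time text unchanged
new_date = (datetime.strptime(raw[:10], "%Y-%m-%d") - timedelta(days=2)).strftime("%Y-%m-%d")
print(new_date + raw[10:])  # 2022-04-18-19.37.57.714699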
This should solve your problem. The first issue to tackle is converting the date string into a sas datetime. The input() function with the anydtdtm. informat accomplishes that with a small caveat as seen in the output.
data test;
date_txt = '2022-04-20-19.37.57.714699';
query_date = input(date_txt, anydtdtm.); * convert string into sas datetime;
calc_date = intnx('dtday', query_date, -2, 's'); * backup 2 days preserving the time;
format query_date calc_date e8601dt26.6;
run;
date_txt                     query_date                  calc_date
2022-04-20-19.37.57.714699   2022-04-20T19:37:57.000000  2022-04-18T19:37:57.000000
The default width of the informat is 19 characters, which excludes the fractional seconds, but the informat correctly converted the string into a datetime.
To attempt to capture the full width of the date string, I modified the informat to anydtdtm26.. However, that change resulted in an error and missing values for query_date and calc_date. Although the anydtdtm informat is robust at converting a wide variety of date and time formats, I suspected that the problem lay with the periods used to delimit the hours and minutes.
To correct that problem I used the prxchange() function to replace the periods after the hours and minutes with colons, which are standard time component delimiters. That change allows the anydtdtm informat to properly convert the fractional seconds.
data test2;
date_txt = '2022-04-20-19.37.57.714699';
date_mod = prxchange('s/\./:/', 2, date_txt); * replace first 2 periods w/ colons;
query_date = input(date_mod, anydtdtm26.); * convert string into sas datetime;
calc_date = intnx('dtday', query_date, -2, 's'); * backup 2 days preserving the time;
format query_date calc_date e8601dt26.6;
run;
date_txt                     date_mod                     query_date                  calc_date
2022-04-20-19.37.57.714699   2022-04-20-19:37:57.714699   2022-04-20T19:37:57.714699  2022-04-18T19:37:57.714699
Although I used a data step to illustrate the solution, the functions can also be used in a SQL statement.
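Outside SAS, the same normalize-the-delimiters-then-parse approach can be sketched like this in Python (illustration only, assuming the fixed input layout shown above):
from datetime import datetime, timedelta

raw = "2022-04-20-19.37.57.714699"
# swap the first two periods of the time part for colons so a standard parser accepts it
normalized = raw[:11] + raw[11:].replace(".", ":", 2)   # 2022-04-20-19:37:57.714699
dt = datetime.strptime(normalized, "%Y-%m-%d-%H:%M:%S.%f")
print(dt - timedelta(days=2))  # 2022-04-18 19:37:57.714699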

Converting DateTime to Epoch milliseconds using Cypher in Neo4J

I'm running a query using Cypher in Neo4j where I have to compare a createdAt property of a node against a given time value in epoch milliseconds. This createdAt property is a string in the DateTime format, which is defined as -
DateTime
A date with a precision of milliseconds, encoded as a string with the following format: yyyy-mm-ddTHH:MM:ss.sss+0000, where yyyy is a four-digit integer representing the year, mm is a two-digit integer representing the month, dd is a two-digit integer representing the day, HH is a two-digit integer representing the hour, MM is a two-digit integer representing the minute, and ss.sss is a five-digit fixed-point number representing the seconds to millisecond precision. Finally, the +0000 at the end represents the timezone, which in this case is always GMT.
Here are a couple of values of this property - 2011-03-21T19:32:38.295+0000, 2012-03-09T17:59:05.367+0000.
I came across the Temporal Values documentation on Neo4j, but couldn't find a way to perform the conversion.
When I execute some of the given examples, like this -
RETURN datetime('2015-06-24T12:50:35.556+0100') AS theDateTime
I get the error -
Neo.ClientError.Statement.SyntaxError: Unknown function 'datetime' (line 1, column 16 (offset: 15))
Would appreciate any help!
The temporal functions were added in neo4j version 3.4.0, and I have verified that your query works in that version.
Make sure you are using an appropriately recent version of neo4j.
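On 3.4+ the datetime value also exposes epoch components, so the comparison can stay in Cypher once you upgrade. If you need to do the conversion outside the database in the meantime, here is a minimal Python sketch of turning the string into epoch milliseconds (illustrative only):
from datetime import datetime

created_at = "2011-03-21T19:32:38.295+0000"   # sample value from the question
dt = datetime.strptime(created_at, "%Y-%m-%dT%H:%M:%S.%f%z")
print(round(dt.timestamp() * 1000))           # 1300735958295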

Converting string timestamp into date

I have dates in a postgres database. The problem is they are stored in a string field and have values similar to: "1187222400000" (which would correspond to 07.08.2007).
I would like to convert them into readable dates using some SQL to_date() expression or something similar, but I can't come up with the correct syntax to make it work.
There really isn't enough information here for a conclusion, so I propose this 'scientific-wild-ass-guess' to resolve your puzzle. :)
It appears this number is UNIX 'epoch time' in milliseconds. I'll show this example as if your string field had the arbitrary name 'epoch_milli'. In PostgreSQL you can convert it to a timestamp using this statement:
SELECT TIMESTAMP WITH TIME ZONE 'epoch' + epoch_milli * INTERVAL '1 millisecond';
or using this built-in postgresql function:
SELECT to_timestamp(epoch_milli / 1000)
either of which, for the example '1187222400000', produces the result
"2007-08-15 17:00:00-07"
You can do some of your own sleuthing with quite a few values selected similarly to this:
SELECT to_timestamp(epoch_milli/1000)::DATE
FROM (VALUES (1187222400000),(1194122400000)) AS val(epoch_milli);
"Well, bollocks, man. I just want the date." Point taken.
Simply cast the timestamp to a date to discard the excess bits:
SELECT to_timestamp(epoch_milli / 1000)::DATE
Of course it's possible that this value is a conversion or is relative to some other value, hence the request for a second example data point.
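If you want to sanity-check that interpretation outside the database, the same arithmetic in Python (illustrative) gives the same instant, just rendered in UTC rather than -07:00:
from datetime import datetime, timezone

epoch_milli = 1187222400000   # the sample value from the question
print(datetime.fromtimestamp(epoch_milli / 1000, tz=timezone.utc))  # 2007-08-16 00:00:00+00:00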

Converting dates and timestamps when inserting data into Teradata

I am block-inserting data from Stata (a statistics package) into a Teradata database. I am having trouble converting dates and timestamps from Stata's native format to Teradata's.
Stata stores dates as days since 01/01/1960, so that 01jan1960 is 0 and 02jan1960 is 1. Timestamps are stored as milliseconds since 01jan1960 00:00:00.000, so that 1000 is 01jan1960 00:00:01. Here are some examples:
timestamp             Stata's tstamp   date         Stata's date
2015-04-13 03:07:08   1744513628000    2015-04-13   20191
2015-04-14 19:55:43   1744660543000    2015-04-14   20192
2015-04-08 11:41:39   1744112499000    2015-04-08   20186
2015-04-15 06:53:34   1744700014000    2015-04-15   20193
I tried 2 approaches. The first involves converting the dates/timestamps to strings in Stata before inserting and then doing something like this once the data is inserted:
ALTER TABLE mytable ALTER date_variable DATETIME
However, I cannot figure out how to do the second part from the documentation I have and after searching the various fora.
The second approach is leaving the dates and timestamps as integers, and then doing some sort of conversion once the integers are inserted. Perhaps I can also pre-convert dates in Stata to TD's internal format with:
gen td_date = ((year(stata_dt)-1900)*10000 + month(stata_dt)*100 + day(stata_dt))
However, I am not sure what the formula for timestamps would be. I am also not sure how to do the second part (making the integers into dates/timestamps).
You can't change the datatype of a column in Teradata from string to date/timestamp.
But when you insert a string into a date/timestamp column there will be an automatic typecast. So simply convert to a string with 'yyyy-mm-dd' or 'yyyy-mm-dd hh:mi:ss' format.
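The string would normally be built in Stata before the insert, but as a sketch of the underlying arithmetic (Python, purely illustrative, using the first sample row from the question):
from datetime import datetime, timedelta

stata_epoch = datetime(1960, 1, 1)
ts_string   = (stata_epoch + timedelta(milliseconds=1744513628000)).strftime("%Y-%m-%d %H:%M:%S")
date_string = (stata_epoch + timedelta(days=20191)).strftime("%Y-%m-%d")
print(ts_string, date_string)   # 2015-04-13 03:07:08 2015-04-13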
You could also do the conversion during load on Teradata using calculations, but IMHO the 1st solution is preferable:
-- add the number of days to the start date
DATE '1960-01-01' + stata_dt
-- I use a similar approach for Unix Timestamps starting 1970 :-)
-- split into days and seconds
CAST(DATE '1960-01-01' + (stata_ts / 86400000) AS TIMESTAMP(0))
+ ((stata_ts MOD 86400000 / 1000) * INTERVAL '00:00:01' HOUR TO SECOND)
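For what it's worth, the same days/seconds split can be checked against the second sample row with a few lines of Python (illustration only):
from datetime import date, timedelta

stata_ts = 1744660543000                   # ms since 1960-01-01
days     = stata_ts // 86400000            # whole days since 1960-01-01
seconds  = (stata_ts % 86400000) // 1000   # seconds past midnight
print(date(1960, 1, 1) + timedelta(days=days), timedelta(seconds=seconds))  # 2015-04-14 19:55:43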

Parsing "date" field of iPhone SMS file from backup

While this isn't a programming question per se, it IS related.
So I'm trying to figure out how to parse the SMS DB that gets backed up from the iPhone. I'm looking at the "messages" table, specifically the "date" field. I noticed that the more recent messages are using a different numbering system to indicate the date/time. I've narrowed it down to the switch to iMessage, as I have a message sent at 1318470904, with a reply sent at 340164736. I know for a fact that these messages were sent less than an hour apart, yet they're indicating > 30 years' difference.
Anybody know how to accurately calculate the date using this newer system? Is it using a different epoch or is there some crazy math I need to do?
Edit: Recent messages are affected as well. Texts (green bubbles) are stored with the date set normally, and anything through iMessage (blue bubbles) is stored with the different date representation.
Since the backup is exported to SQLite database format, here's how to convert the number to a real date in SQLite:
select
datetime(date + strftime('%s', '2001-01-01 00:00:00'),
'unixepoch', 'localtime') as date,
*
from message
I don't know about getting the correct date given two versions present, but when I did this today, I noticed the date column was not the standard unix time but a longer number with seemingly nine zeros at the end, like 444548608000000000. This is what I did to get the correct date:
select
datetime(substr(date, 1, 9) + 978307200, 'unixepoch', 'localtime') as f_date,
text
from message
It is in seconds since 1/1/2001 instead of the others, which are Unix-based off of 1/1/1970. So to convert it to, say, an Excel time, your formula would be =Cell/(60*60*24) + "1/1/2001".
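Applying the two epochs to the values from the question (a quick Python illustration) shows the two messages really are only minutes apart:
from datetime import datetime, timedelta, timezone

apple_epoch = datetime(2001, 1, 1, tzinfo=timezone.utc)
unix_msg  = datetime.fromtimestamp(1318470904, tz=timezone.utc)   # plain Unix seconds
apple_msg = apple_epoch + timedelta(seconds=340164736)            # seconds since 2001-01-01
print(unix_msg, apple_msg)   # 2011-10-13 01:55:04+00:00  2011-10-13 02:12:16+00:00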
Apple uses Mac Absolute Time (MacTime). This is counted from 01-01-2001. The other timestamp you see is UnixTime. This starts from 01-01-1970.
You have to add 31 years to MacTime to get UnixTime. This is a PHP snippet:
$macTime = $d['ZMESSAGEDATE']; // MacTime column (from Whatsapp)
$unixTime = $macTime + 978307200;
echo date("Y-m-d H:i:s", $unixTime);
The time difference is calculated using this website:
https://www.timeanddate.com/date/durationresult.html?d1=1&m1=1&y1=1970&d2=1&m2=1&y2=2001&h1=0&i1=0&s1=0&h2=0&i2=0&s2=0
There may be another answer.
=Cell/(60*60*24) + "1/1/1970"
works with my current version of the iPhone/iOS => 4.3.3
Formula with time of messages:
=Cell/(60*60*24) + "1/1/2001 7:00"
Since Mac dates are counted from 2001 and not 1970, we have to add an offset to the Mac date.
978307200000 is the number of milliseconds from 1970-01-01 to 2001-01-01.
Multiplying the Mac seconds by 1000 is also required to convert them to milliseconds.
macDate * 1000 + 978307200000
Bohemian♦ is right, but there's a little typo in his answer:
use %S (capitals) instead of %s, since the time is represented in seconds since 2001 and not 1970!
Doc from https://www.sqlite.org/lang_datefunc.html
%s seconds since 1970-01-01
%S seconds: 00-59
select
datetime(date + strftime('%S', '2001-01-01 00:00:00'),
'unixepoch', 'localtime') as date,
*
from message