I want to convert an epoch time, say 1639514232, to a time UUID and save it to Cassandra. UUIDs.endOf() is not working; it generates an indecipherable UUID: 5e23b68f-2cbb-11b2-7f7f-7f7f7f7f7f7f.
Can you suggest how I can achieve this?
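One likely cause, worth checking first: the DataStax helpers UUIDs.startOf()/UUIDs.endOf() take a timestamp in milliseconds, so passing epoch seconds (1639514232) yields a UUID dated in January 1970; the fixed filler bytes in the node part (the 7f7f… tail) are expected for these range-query helpers, which the driver documentation says are not meant to be unique identifiers. To show the underlying arithmetic, here is a Python sketch (not the driver's code): a version-1 UUID timestamp is a count of 100-nanosecond intervals since 1582-10-15, so converting an epoch is a multiply and an offset. The node and clock_seq defaults below are placeholders for illustration.

```python
import uuid

# 100-ns intervals between the Gregorian epoch (1582-10-15) and the
# Unix epoch (1970-01-01); this constant appears in RFC 4122.
GREGORIAN_OFFSET = 0x01B21DD213814000

def time_uuid_from_epoch(epoch_seconds, node=0, clock_seq=0):
    """Build a version-1 (time-based) UUID from Unix epoch seconds.

    node and clock_seq are placeholders; a real generator would use a
    MAC address and a random clock sequence.
    """
    ns100 = epoch_seconds * 10_000_000 + GREGORIAN_OFFSET
    time_low = ns100 & 0xFFFFFFFF
    time_mid = (ns100 >> 32) & 0xFFFF
    time_hi_version = ((ns100 >> 48) & 0x0FFF) | (1 << 12)  # version 1
    clock_seq_hi = ((clock_seq >> 8) & 0x3F) | 0x80         # RFC 4122 variant
    clock_seq_low = clock_seq & 0xFF
    return uuid.UUID(fields=(time_low, time_mid, time_hi_version,
                             clock_seq_hi, clock_seq_low, node))

u = time_uuid_from_epoch(1639514232)
```

In Java with the DataStax driver, the equivalent for a range query would be UUIDs.startOf(1639514232L * 1000) / UUIDs.endOf(1639514232L * 1000) (milliseconds); for inserting rows you normally let the driver or CQL's now() generate the timeuuid instead.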
I'm receiving a long value from an API and I'm trying to create a type converter to convert long values to a datetime or date. How can I do it?
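The question doesn't say which framework the converter is for, so as a language-neutral sketch in Python: API longs of this kind are usually epoch milliseconds, and the conversion is just dividing by 1000 and handing the result to the datetime constructor.

```python
from datetime import datetime, timezone

def long_to_datetime(epoch_millis):
    """Convert an epoch value in milliseconds (a common API format)
    to a timezone-aware datetime."""
    return datetime.fromtimestamp(epoch_millis / 1000, tz=timezone.utc)

long_to_datetime(1639514232000)  # -> 2021-12-14 20:37:12+00:00
```

If the API sends epoch seconds rather than milliseconds, drop the division; a 1970-era result is the usual symptom of treating milliseconds as seconds (or vice versa).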
I already have an SO question and answer on how to convert a BSON Timestamp in a MongoDB aggregation, but now I have a situation where I would like to do the conversion in Node.js.
So just to repeat: my goal is to convert a Timestamp datatype to a JavaScript Date without doing it in an aggregation. Is this possible?
If the BSON library you are using provides a Timestamp type, you can use its getTime method to return the seconds, and create a Date from that (multiply by 1000, since the JavaScript Date constructor expects milliseconds, and use new — calling Date() without new returns a string):
new Date(object.clusterTime.getTime() * 1000)
If you don't have that function available, the timestamp is a 64-bit value where the high 32-bits are the seconds since epoch, and the low 32-bits are a counter.
Divide by 2^32 to get the seconds (in JavaScript, prefer division over the >> operator, which truncates its operands to 32 bits), then multiply by 1000 for the Date constructor:
new Date(Math.floor(object.clusterTime / Math.pow(2, 32)) * 1000)
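The high/low split described above can be checked with plain integer arithmetic; a Python sketch (the bit layout is language-independent, and the sample value is illustrative):

```python
from datetime import datetime, timezone

def bson_timestamp_to_datetime(ts_64bit):
    """A BSON Timestamp packs seconds-since-epoch in the high 32 bits
    and an ordinal counter in the low 32 bits."""
    seconds = ts_64bit >> 32          # high 32 bits
    counter = ts_64bit & 0xFFFFFFFF   # low 32 bits (ignored for the date)
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

# pack seconds=1571994774, counter=1 the way MongoDB does
ts = (1571994774 << 32) | 1
```

Python can shift 64-bit values directly; the JavaScript answer divides by 2^32 instead because JavaScript's bitwise operators work only on 32-bit integers.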
I have a column in Timestamp format that includes milliseconds.
I would like to reformat my timestamp column so that it does not include milliseconds. For example, if my Timestamp column has values like 2019-11-20T12:23:13.324+0000, I would like my reformatted Timestamp column to have values of 2019-11-20T12:23:13.
Is there a straightforward way to perform this operation in Spark/Scala? I have found lots of posts on converting a string to a timestamp, but none on changing the format of a timestamp.
You can try date_trunc (available since Spark 2.3), e.g. date_trunc("second", $"timestampCol"). Note that the similarly named trunc only truncates dates to month or year granularity; date_trunc is the one that works down to seconds.
See more examples: https://sparkbyexamples.com/spark/spark-date-functions-truncate-date-time/
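Outside Spark, the reformatting amounts to parsing the string and printing it back without the fractional-seconds field. A plain-Python sketch of the transformation (this is an illustration, not Spark code; the format string matches the 2019-11-20T12:23:13.324+0000 example above):

```python
from datetime import datetime

def drop_millis(ts: str) -> str:
    """Parse an ISO-ish timestamp with milliseconds and a +0000 offset,
    then reformat it without the fractional part."""
    parsed = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")
    return parsed.strftime("%Y-%m-%dT%H:%M:%S")

drop_millis("2019-11-20T12:23:13.324+0000")  # -> "2019-11-20T12:23:13"
```

In Spark itself, date_trunc("second", $"ts") keeps a timestamp column, while date_format($"ts", "yyyy-MM-dd'T'HH:mm:ss") produces the formatted string directly.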
I exported data from an SQLite table to a CSV file. The data includes a timestamp with at least one-minute resolution: "2019-11-15 01:30:06". The data is actually stored as a Julian date, in this case 2458802.35424295. I imported the data into a double-precision field. I need to convert that number into a timestamp with time zone. I tried casting the double-precision number to text and then using to_timestamp(), but that appears to work only with integer days. I can get a timestamp, but it is always at midnight of the correct date. I tried using to_timestamp() passing in my number, but that returns an epoch (number of milliseconds since 1/1/1970).
I could try to take the fractional part of my Julian date value, calculate the number of milliseconds since midnight that represents, use the to_timestamp(text,text) method to get the date I need, and then add the epoch since midnight to that date. But that's awfully cumbersome. Isn't there a better way?
I'm using PostgreSQL 9.3.
NOTE: The simple answer to my problem, which occurred to me just before I clicked the Post button, is to export the data in the form I want, using SQLite's datetime() function to convert the number to a date string during export. But I remain curious: I would have thought there would be a standard way to do this conversion.
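There is a standard way, and it is pure arithmetic: Julian day 2440587.5 corresponds to the Unix epoch (1970-01-01 00:00 UTC), so seconds-since-epoch is (jd - 2440587.5) * 86400, which in PostgreSQL becomes to_timestamp((julian_col - 2440587.5) * 86400) — no casting to text and no integer-day limitation. A Python sketch of the same arithmetic:

```python
from datetime import datetime, timezone

UNIX_EPOCH_JD = 2440587.5  # Julian day of 1970-01-01 00:00:00 UTC

def julian_to_datetime(jd: float) -> datetime:
    """Convert a Julian date (fractional days; Julian days start at
    noon UTC, hence the .5) to a UTC datetime. Equivalent to
    PostgreSQL's to_timestamp((jd - 2440587.5) * 86400)."""
    return datetime.fromtimestamp((jd - UNIX_EPOCH_JD) * 86400, tz=timezone.utc)
```

At roughly 2.4 million days the double-precision value still has sub-second resolution, so the fractional time of day survives the conversion.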
Converting epoch time to SQL datetime format. I am trying to convert 35,000 records received from another table (in another database) with an epoch timestamp to a new table with the SQL datetime format. I will also need this updated on a daily basis, so a one-time conversion is good, but I am also open to other suggestions.
I have never worked with epoch time and I am not sure how to go about it.
Also, later on I want to use it in SSRS with the correct datetime, so I am not sure whether I should convert it before the transfer to the new table or do it in SSRS.
Any ideas?
Presuming that the epoch timestamp you have is in seconds:
DATEADD(SECOND, epoch_col, '19700101')
This will add the epoch seconds to the start of 'epoch time' (01-01-1970 00:00:00) and give you a DATETIME.
Example with output:
SELECT DATEADD(SECOND, 1571994774, '19700101')
2019-10-25 09:12:54.000
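The same arithmetic can be sanity-checked outside SQL Server; a Python sketch mirroring the DATEADD example above (add the epoch seconds to 1970-01-01):

```python
from datetime import datetime, timedelta

def epoch_to_datetime(epoch_seconds: int) -> datetime:
    """Mirror DATEADD(SECOND, epoch_col, '19700101'): add the epoch
    seconds to the start of epoch time."""
    return datetime(1970, 1, 1) + timedelta(seconds=epoch_seconds)

epoch_to_datetime(1571994774)  # -> 2019-10-25 09:12:54
```

Like the T-SQL version, this produces a naive (timezone-unaware) value; epoch timestamps are defined in UTC, so apply any timezone conversion afterwards.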
If you have an epoch timestamp in milliseconds, be aware that DATEADD takes an INT increment, so a millisecond epoch (well above 2^31) will overflow. Split it into whole seconds plus a millisecond remainder:
DATEADD(MILLISECOND, epoch_col % 1000, DATEADD(SECOND, epoch_col / 1000, '19700101'))
In terms of your other question about when to convert the value: my view is that it would be preferable to store the value in a DATETIME column at the point of insertion, rather than storing the epoch value and converting it on every use.
This is just an opinion though and not a recommendation.