I need to convert
2023-01-31T14:11:36-05:00 to 2023-01-31 19:11:36
I am able to do this in Presto with CAST(from_iso8601_timestamp(timestamp) AS TIMESTAMP).
I need to replicate this in my PySpark job; I would appreciate it if we could convert the string to a datetime in EST hours.
https://spark.apache.org/docs/3.1.3/api/python/reference/api/pyspark.sql.functions.to_utc_timestamp.html
It doesn't say what to do when the format is '2018-03-13T06:18:23+00:00'.
I am trying the method below, but it is still failing:
spark.sql("select cast(to_date(current_timestamp(),'yyyymmddhhmmss') as varchar(15))")
I have two string values which contain a Start Time and an End Time. They are in the following format:
StartTime 07:00 AM
EndTime 10:00 PM
I need to check whether the current date-time is within these Start and End times.
Therefore, I need to convert the above two strings to ISO datetime.
What is the best way to convert an [HH:MM AM/PM] string into an ISO datetime in MongoDB?
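The question asks about MongoDB, but as a rough illustration of the conversion and range check themselves, here is a small Python sketch that parses the 12-hour strings, attaches today's date, and tests whether now falls between them; this is a client-side approach, not an in-database conversion.
from datetime import datetime

start_str, end_str = "07:00 AM", "10:00 PM"

# Parse the 12-hour clock strings and put them on today's date
today = datetime.now().date()
start = datetime.combine(today, datetime.strptime(start_str, "%I:%M %p").time())
end = datetime.combine(today, datetime.strptime(end_str, "%I:%M %p").time())

now = datetime.now()
print(start.isoformat(), end.isoformat())   # ISO date-time strings, e.g. ...T07:00:00
print(start <= now <= end)                  # True while the current time is in the window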
While converting from CSV to Parquet using an AWS Glue ETL job, the following fields in the CSV are read as strings and mapped to date and time types.
This is the actual CSV file.
After mapping and converting, the date field is empty and the time is concatenated with today's date.
How do I convert these with the proper date and time formats?
It uses Presto data types, so the data should be in the correct format:
DATE: Calendar date (year, month, day).
Example: DATE '2001-08-22'
TIME: Time of day (hour, minute, second, millisecond) without a time zone. Values of this type are parsed and rendered in the session time zone.
Example: TIME '01:02:03.456'
TIMESTAMP: Instant in time that includes the date and time of day without a time zone. Values of this type are parsed and rendered in the session time zone.
Example: TIMESTAMP '2001-08-22 03:04:05.321'
You may use:
from pyspark.sql.functions import to_timestamp, to_date, date_format
df = df.withColumn(col, to_timestamp(col, 'dd-MM-yyyy HH:mm'))
df = df.withColumn(col, to_date(col, 'dd-MM-yyyy'))
df = df.withColumn(col, date_format(col, 'HH:mm:ss'))
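For example, a quick end-to-end sketch of the idea above; the column names and the 'dd-MM-yyyy HH:mm' input layout are assumptions:
from pyspark.sql import SparkSession
from pyspark.sql.functions import to_timestamp, to_date, date_format, col

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("22-08-2001 03:04",)], ["raw"])

# Parse once into a timestamp, then derive a DATE column and an HH:mm:ss time string
df = df.withColumn("ts", to_timestamp(col("raw"), "dd-MM-yyyy HH:mm"))
df = df.withColumn("date", to_date(col("ts")))
df = df.withColumn("time", date_format(col("ts"), "HH:mm:ss"))
df.show(truncate=False)
# raw: 22-08-2001 03:04 | ts: 2001-08-22 03:04:00 | date: 2001-08-22 | time: 03:04:00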
I am making a query to convert a bigint to a timestamp, and the value is '1494257400'.
I am using a Presto query,
but Presto does not give the correct result for the from_unixtime() function.
Hive version:
select from_unixtime(1494257400) result: '2017-05-09 00:30:00'
Presto version:
select from_unixtime(1494257400) result: '2017-05-08 08:30:00'
Hive gives the correct result, but Presto does not. How can I solve this?
The Presto from_unixtime returns a date at UTC, while the one from Hive returns a date in your local time zone.
According to https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF, from_unixtime:
Converts the number of seconds from unix epoch (1970-01-01 00:00:00
UTC) to a string representing the timestamp of that moment in the
current system time zone in the format of "1970-01-01 00:00:00".
The output of Hive is not ideal, because ISO-formatted strings should show the GMT offset whenever it is anything other than GMT+00.
With Hive, you can use to_utc_timestamp({any primitive type} ts, string timezone) to convert your timestamp to the proper time zone. Take a look at the manual linked above.
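To see the time-zone effect concretely, here is a small Python check of the same epoch value; the Asia/Seoul zone is an assumption that happens to match the Hive output quoted in the question (any UTC+9 zone would).
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

epoch = 1494257400
# The same instant rendered in two different time zones
print(datetime.fromtimestamp(epoch, tz=timezone.utc))            # 2017-05-08 15:30:00+00:00
print(datetime.fromtimestamp(epoch, tz=ZoneInfo("Asia/Seoul")))  # 2017-05-09 00:30:00+09:00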
I am trying to convert a numeric to a timestamp in PostgreSQL. However, it always converts it in the EST time zone. Before running the query, I try the following:
set time zone 'UTC+10';
select to_timestamp(r.date/1000) as current_email_timestamp
FROM email r;
However, the timestamp always seems to be in a US time zone. The emails are mostly sent during working hours, but when I convert the numeric back to a timestamp with the above query, it shows all the emails at night time.
Could it be that the numeric timestamp was stored in a US time zone, or could it be that when converting back from numeric to timestamp, it is not converting the time zone correctly?
Can someone please tell me how to fix this?
Regards
Arif
You can use the at time zone modifier:
select to_timestamp(1411738200),
to_timestamp(1411738200) at time zone 'America/Chicago',
to_timestamp(1411738200) at time zone 'UTC+10'
Your Postgres installation probably defaults to EST.