Convert Date to Timestamp using Shopify Liquid

I'm building a template and I want to convert a date to a timestamp on Shopify.
I know I can convert a timestamp to a date using this:
{{ '1560523384' | date: "%d-%m-%Y" }} // result is 14-06-2019
But I would like to do the opposite:
{{ '14-06-2019' | timestamp }} // does not work
Is there any way to do this using Shopify Liquid?

What you are looking for is an epoch timestamp. To get it, use the code below in Liquid templates:
var timestamp = "{{ 'now' | date: "%s"}}";
This gives the current timestamp in epoch format. Here is a link to read more about date formatting in Liquid files: Link
Update
Here is an updated answer showing how to convert a specific date to an epoch timestamp in Shopify Liquid:
var timestamp = "{{ "March 14, 2016" | date: "%s"}}";
# timestamp: 1457928000
var timestamp = "{{ "14-06-2019" | date: "%s"}}";
# timestamp: 1560484800
And vice versa, from timestamp to date:
var _date = "{{ "now" | date: "%Y-%m-%d %H:%M" }}";
# _date: 2021-08-24 01:14
var _date = "{{ "1560484800" | date: "%Y-%m-%d %H:%M" }}";
# _date: 2019-06-14 00:00
You can likewise convert any timestamp into any of the date formats described in the documentation link above.

Related

PySpark convert Unix epoch to "yyyy-MM-dd HH:mm:ss" in UTC

I have a bigint column representing a Unix epoch. I'd like to convert it to a string of format "yyyy-MM-dd HH:mm:ss" in UTC time. I tried a few approaches, but they all return local time, not UTC.
DataFrame time_df:
time_df = spark.createDataFrame(
    [
        (1651886168, ),
    ],
    ["epoch"]
)
root
|-- epoch: long (nullable = true)
+----------+
| epoch|
+----------+
|1651886168|
+----------+
In UTC, 1651886168 is 2022-05-07 01:16:08
Incorrect approach 1:
time_df.select('*', F.from_unixtime(F.col('epoch'))).show()
+----------+-----------------------------------------+
| epoch|from_unixtime(epoch, yyyy-MM-dd HH:mm:ss)|
+----------+-----------------------------------------+
|1651886168| 2022-05-06 18:16:08|
+----------+-----------------------------------------+
Incorrect approach 2:
time_df.select(
    '*',
    F.to_utc_timestamp(F.col('epoch').cast("timestamp"), tz='UTC').alias('timestamp'),
).show()
+----------+-------------------+
| epoch| timestamp|
+----------+-------------------+
|1651886168|2022-05-06 18:16:08|
+----------+-------------------+
Incorrect approach 3:
time_df = time_df.select(
    '*',
    F.to_utc_timestamp(F.col('epoch').cast("timestamp"), tz='UTC').alias('timestamp'),
)
time_df.select(
    '*',
    F.from_utc_timestamp(F.col('timestamp'), tz='UTC').alias('timestamp2'),
).show()
+----------+-------------------+-------------------+
| epoch| timestamp| timestamp2|
+----------+-------------------+-------------------+
|1651886168|2022-05-06 18:16:08|2022-05-06 18:16:08|
+----------+-------------------+-------------------+
Your help will be greatly appreciated!
Thanks @samkart for the comment. The following worked for me:
# set the session's timezone to UTC
timezone = spark.conf.get("spark.sql.session.timeZone")
spark.conf.set("spark.sql.session.timeZone", "UTC")
# convert unix epoch to time in UTC
time_df.select('*', F.from_unixtime(F.col('epoch'))).show()
# set the session's timezone back
spark.conf.set("spark.sql.session.timeZone", timezone)
+----------+-----------------------------------------+
| epoch|from_unixtime(epoch, yyyy-MM-dd HH:mm:ss)|
+----------+-----------------------------------------+
|1651886168| 2022-05-07 01:16:08|
+----------+-----------------------------------------+
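Building on that answer, here is a minimal sketch that wraps the timezone swap in a try/finally so the session setting is restored even if the conversion fails; it assumes the spark session and time_df defined above.
from pyspark.sql import functions as F

# Remember the current session timezone so it can be restored afterwards.
original_tz = spark.conf.get("spark.sql.session.timeZone")
try:
    # Format timestamps in UTC for the duration of the conversion.
    spark.conf.set("spark.sql.session.timeZone", "UTC")
    time_df.select(
        "epoch",
        F.from_unixtime(F.col("epoch")).alias("utc_time"),
    ).show()
finally:
    # Restore the original session timezone no matter what happened above.
    spark.conf.set("spark.sql.session.timeZone", original_tz)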

to_date in selectExpr of PySpark is truncating the date to the year by default. How can I avoid this?

I have a requirement where I derive the year to be loaded, and then have to load the first and last day of that year, in date format, into a table.
Here is what I'm doing:
boy = str(nxt_yr)+'01-01'
eoy = str(nxt_yr)+'12-31'
df_final = df_demo.selectExpr("to_date('{}','yyyy-MM-dd') as strt_dt".format(boy),"to_date('{}','yyyy-MM-dd') as end_dt".format(eoy))
spark.sql("set spark.sql.legacy.timeParserPolicy = LEGACY")
df_final.show(1)
This gives me 2023-01-01 in both fields, as date datatype.
Is this expected behavior, and if yes, is there any workaround?
Note: I also tried hardcoding the date as 2022-11-30 in the code, but still received the beginning of the year in the output.
It's working as expected; you are just missing a - within the dates you build for conversion:
nxt_yr = 2022
boy = str(nxt_yr)+'-01-01'   # note the added '-'
eoy = str(nxt_yr)+'-12-31'   # note the added '-'
spark.sql("set spark.sql.legacy.timeParserPolicy = LEGACY")
spark.sql(f"""
SELECT
    to_date('{boy}','yyyy-MM-dd') as strt_dt
    ,to_date('{eoy}','yyyy-MM-dd') as end_dt
"""
).show()
+----------+----------+
| strt_dt| end_dt|
+----------+----------+
|2022-01-01|2022-12-31|
+----------+----------+
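As a side note, here is a hedged alternative sketch that avoids string building entirely by passing Python date literals; it assumes spark and nxt_yr as above.
import datetime
from pyspark.sql import functions as F

# Build the year's boundaries as real dates; F.lit turns them into DateType
# columns, so no string parsing (and no parser policy setting) is involved.
boy = datetime.date(nxt_yr, 1, 1)
eoy = datetime.date(nxt_yr, 12, 31)

df_final = spark.range(1).select(
    F.lit(boy).alias("strt_dt"),
    F.lit(eoy).alias("end_dt"),
)
df_final.show(1)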

How should I store Go's time.Location in Postgres?

In Postgres I store data given to me by a user with:
Column | Type | Collation | Nullable | Default
------------+--------------------------+-----------+----------+---------
id | uuid | | not null |
value | numeric | | |
date | timestamp with time zone | | |
Now I'm presented with the requirement of maintaining the original time zone in which the data was produced. The timestamp with time zone is normalized to the database's time zone and the original time zone is lost, so I must manually restore date from the normalized value before serving it back to the user.
Most solutions suggest adding an extra column to the table and storing the original timezone information together with the timestamp:
Column | Type | Collation | Nullable | Default
------------+--------------------------+-----------+----------+---------
id | uuid | | not null |
value | numeric | | |
date | timestamp with time zone | | |
tz | text | | |
So given that I'm using Go, which information should I extract from time.Time to store in tz for the most precise and seamless restoration?
date.Location().String() doesn't seem right as it might return the value Local which is relative.
And how should I restore the information from tz back into to time.Time?
Is the result of time.LoadLocation(tz) good enough?
Upon save, I would obtain the zone name and offset using Time.Zone():
func (t Time) Zone() (name string, offset int)
Then when querying such a timestamp from the database, you can construct a time.Location using time.FixedZone():
func FixedZone(name string, offset int) *Location
And switch to this location using Time.In().
Word of caution! This will restore a timestamp that is "seemingly" in the same time zone, but if you need to apply operations on it (such as adding days to it), the results might not be the same. The reason is that time.FixedZone() returns a time zone with a fixed offset, which does not know anything about daylight saving time, for example, while the original timestamp you saved might have a time.Location which does know about these things.
Here's an example of such a deviation. There is a daylight saving day in March, so we'll use a timestamp pointing to March 1, and add 1 month to it, which results in a timestamp being after the daylight saving.
cet, err := time.LoadLocation("CET")
if err != nil {
    panic(err)
}
t11 := time.Date(2019, time.March, 1, 12, 0, 0, 0, cet)
t12 := t11.AddDate(0, 1, 0)
fmt.Println(t11, t12)
name, offset := t11.Zone()
cet2 := time.FixedZone(name, offset)
t21 := t11.UTC().In(cet2)
t22 := t21.AddDate(0, 1, 0)
fmt.Println(t21, t22)
now := time.Date(2019, time.April, 2, 0, 0, 0, 0, time.UTC)
fmt.Println("Time since t11:", now.Sub(t11))
fmt.Println("Time since t21:", now.Sub(t21))
fmt.Println("Time since t12:", now.Sub(t12))
fmt.Println("Time since t22:", now.Sub(t22))
This will output (try it on the Go Playground):
2019-03-01 12:00:00 +0100 CET 2019-04-01 12:00:00 +0200 CEST
2019-03-01 12:00:00 +0100 CET 2019-04-01 12:00:00 +0100 CET
Time since t11: 757h0m0s
Time since t21: 757h0m0s
Time since t12: 14h0m0s
Time since t22: 13h0m0s
As you can see, the wall-clock time after the 1-month addition is the same, but the zone offset is different, so the two results designate different instants in time (which is proven by showing the time difference from an arbitrary time). The original has a 2-hour offset, because it knows about the daylight saving change that happened in the month we skipped, while the "restored" timestamp's zone doesn't know about that, so its result keeps the 1-hour offset. After the addition, even the zone name changes in real life, from CET to CEST, and again, the restored timestamp's zone doesn't know about this either.
A more wasteful solution, still prone to error but still valid, is to also store the original timestamp in ISO 8601 format, like 2019-05-02T17:24:37+01:00, in a separate column datetext:
Column | Type | Collation | Nullable | Default
------------+--------------------------+-----------+----------+---------
id | uuid | | not null |
value | numeric | | |
date | timestamp with time zone | | |
datetext | text | | |
Then query using date for the strengths of the native timestamp column, and return datetext to the user, which is the exact value that was originally sent.

Subtract the current date from another date in a DataFrame in Scala

First of all, thank you for the time in reading my question :)
My question is the following: in Spark with Scala, I have a DataFrame containing a string column with a date in the format dd/MM/yyyy HH:mm, for example df:
+----------------+
|date |
+----------------+
|8/11/2017 15:00 |
|9/11/2017 10:00 |
+----------------+
I want to get the difference between the current date and the date column of the DataFrame, in seconds, for example:
df.withColumn("difference", currentDate - unix_timestamp(col(date)))
+----------------+------------+
|date | difference |
+----------------+------------+
|8/11/2017 15:00 | xxxxxxxxxx |
|9/11/2017 10:00 | xxxxxxxxxx |
+----------------+------------+
I tried
val current = current_timestamp()
df.withColumn("difference", current - unix_timestamp(col(date)))
but get this error
org.apache.spark.sql.AnalysisException: cannot resolve '(current_timestamp() - unix_timestamp(date, 'yyyy-MM-dd HH:mm:ss'))' due to data type mismatch: differing types in '(current_timestamp() - unix_timestamp(date, 'yyyy-MM-dd HH:mm:ss'))' (timestamp and bigint).;;
I also tried
val current = BigInt(System.currentTimeMillis / 1000)
df.withColumn("difference", current - unix_timestamp(col(date)))
and
val current = unix_timestamp(current_timestamp())
but then the difference column is null.
Thanks
You have to use the correct format for unix_timestamp (note MM for months, not mm, which means minutes):
df.withColumn("difference", current_timestamp().cast("long") - unix_timestamp(col("date"), "dd/MM/yyyy HH:mm"))
or, with a recent version:
to_timestamp(col("date"), "dd/MM/yyyy HH:mm") - current_timestamp()
to get an Interval column.
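For comparison, here is a hedged PySpark sketch of the same fix; the single-letter d/M/H pattern letters are used so Spark 3's stricter parser also accepts single-digit days and hours, and the DataFrame values are illustrative.
from pyspark.sql import functions as F

df = spark.createDataFrame(
    [("8/11/2017 15:00",), ("9/11/2017 10:00",)],
    ["date"],
)

# Difference in seconds between now and the parsed date column.
df.withColumn(
    "difference",
    F.unix_timestamp(F.current_timestamp())
    - F.unix_timestamp(F.col("date"), "d/M/yyyy H:mm"),
).show(truncate=False)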

How do I convert Epoch time to Date in Open Refine?

I don't care which language I use (as long as it's one of the three available in OpenRefine), but I need to convert a timestamp returned from an API from epoch time to a regular date. Not too picky about the output date format, just that it retains the date down to the second. Thanks!
Can use: GREL, Jython, or Clojure.
If you have to stick to GREL you can use the following one-liner:
inc(toDate("01/01/1970 00:00:00","dd/MM/YYYY H:m:s"),value.toNumber(),"seconds").toString('yyyy-MM-dd HH:mm:ss')
Breaking it down:
inc(date d, number value, string unit), as defined in the GREL documentation: returns a date changed by the given amount in the given unit of time. The unit defaults to 'hour'.
toDate(o, string format): returns o converted to a date object. (More complex uses of toDate() are shown in the GREL documentation.)
We use the string "01/01/1970 00:00:00" as input for toDate() to get the start of the UNIX Epoch (January 1st 1970 midnight).
We pass the newly created date object into inc(), with the result of value.toNumber() as the second parameter (assuming value is a string representation of the number of seconds since the start of the Unix epoch), and the string "seconds" as the third parameter, which tells inc() the unit of the second parameter.
We finally convert the resulting date object into a string using the format: yyyy-MM-dd HH:mm:ss
Test Data
Following is a result of using the function described above to turn a series of timestamps grabbed from the Timestamp Generator into string dates.
| Name | Value | Date String |
|-----------|------------|---------------------|
| Timestamp | 1491998962 | 2017-04-09 12:09:22 |
| +1 Hour | 1492002562 | 2017-04-09 13:09:22 |
| +1 Day | 1492085362 | 2017-04-10 12:09:22 |
| +1 Week | 1492603762 | 2017-04-16 12:09:22 |
| +1 Month | 1494590962 | 2017-05-09 12:09:22 |
| +1 Year | 1523534962 | 2018-04-09 12:09:22 |
Unfortunately, I do not think you can do it with a GREL statement like this or some such, but I might be pleasantly surprised by someone who can make it work somehow:
value.toDate().toString("dd/MM/yyy")
So in the meantime, use this Jython / Python code:
import time

# This is a comment.
# We change 'value' to an integer, since time needs to work with numbers.
# If we needed to, we could divide by 1000 if we had a Unix epoch time in milliseconds instead of seconds.
# We also have no idea what the local time zone is for this, which could affect the date. But we digress...
epochlong = int(float(value))
datetimestamp = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(epochlong))
return datetimestamp
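If the local time zone caveat above matters, here is a hedged variant of the same Jython snippet that renders the date in UTC instead, assuming value holds seconds since the Unix epoch.
import time

# time.gmtime interprets the epoch seconds as UTC rather than local time.
epochlong = int(float(value))
return time.strftime('%Y-%m-%d %H:%M:%S', time.gmtime(epochlong))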