I'm developing a Clio integration with access to the calendar, but I've run into an issue with dates. The documentation says the API expects ISO-8601 timestamps, but something seems to be applying an extra timezone offset to the dates I send to the system.
For example, if I send a date 2018-05-17T23:59:59.999999-04:00 on both start_at and end_at properties when creating a calendar entry for an all day event, the value returned when fetching this entry through the API is 2018-05-17T17:00:00-07:00, which is clearly wrong. Am I missing something here?
The expected result should be something like either 2018-05-17T23:59:59-04:00 or 2018-05-18T03:59:59Z if milliseconds are ignored.
All dates are based on the UTC timezone. Could it be that your site/server/script is set to a local timezone, so the dates are off for part of the day?
Try setting your scripting environment to UTC time before making any date/time-based queries.
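For example, here is a minimal sketch in JavaScript of building the values in UTC before sending them (assuming the payload is assembled in a script; only the start_at/end_at field names come from the question above):
// Build the all-day range directly in UTC so no local offset leaks in.
// Date.UTC() and toISOString() always work in UTC, regardless of the
// environment's local time zone.
const start = new Date(Date.UTC(2018, 4, 17, 0, 0, 0));    // months are 0-based, 4 = May
const end   = new Date(Date.UTC(2018, 4, 17, 23, 59, 59));
const payload = {
  start_at: start.toISOString(),   // "2018-05-17T00:00:00.000Z"
  end_at:   end.toISOString(),     // "2018-05-17T23:59:59.000Z"
};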
Related
I save my date as a local date, but when I read it back it is treated as if it were a UTC date, so it slips by several hours.
The dates are passed in as strings in the form '2020-03-05 09:05:23' as query parameters, but when they are retrieved they might look like '2020-03-04 10:05:23' because I am 13 hours ahead of Greenwich.
For MariaDB (or MySQL):
Use DATETIME as a picture of the clock on the wall: the literal value you store is the value you get back, with no conversion.
Use TIMESTAMP when you want conversion: the value is converted from the session's timezone to UTC on storage and back to the session's timezone on retrieval.
Set the system's timezone according to where the server lives in the world.
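A quick sketch of that difference (assuming a throwaway table; the session offsets are only examples):
-- DATETIME stores the literal value; TIMESTAMP converts via the session time zone.
CREATE TABLE tz_demo (dt DATETIME, ts TIMESTAMP);
SET time_zone = '+13:00';   -- session 13 hours ahead of Greenwich, as in the question
INSERT INTO tz_demo VALUES ('2020-03-05 09:05:23', '2020-03-05 09:05:23');
SET time_zone = '+00:00';   -- read the same row back from a UTC session
SELECT dt, ts FROM tz_demo;
-- dt: 2020-03-05 09:05:23   (unchanged: the "clock on the wall")
-- ts: 2020-03-04 20:05:23   (stored as UTC, re-displayed in the session's time zone)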
Is there a standard for encoding a date as a timestamp? My thoughts:
This should be 12:00pm UTC expressed in local time, e.g. 9:00am at UTC-3; that way anyone consuming the timestamp, regardless of their -12/+12 offset, will resolve it to the same date whether or not they parse it as UTC
It could be 12:00pm at UTC
It could be the start of the day (12:00am) at UTC
It could be the start of the day (12:00am UTC) expressed in local time, e.g. 9:00pm at UTC-3
Is there an official spec or standard to adhere to?
It would be easy to point to this document and say 'this is the standard' as opposed to being unaware and having to change our logic down the line.
There isn't a standard for this, because a date and a timestamp are logically two very different concepts.
A date covers the entire range of time on that day, not a specific point in time.
It may be a different date for a person in another time zone at any given point in time, but dates themselves do not have any association with time zones. Visualize a date as just a square on a calendar, not a point on a timeline.
Many APIs will use midnight (00:00) as the default time when a date-only value is assigned to a date+time value. However:
Whether it is UTC based or local-time based is very dependent on that particular API. There is no standard for this, nor is one answer necessarily better than the other.
Assigning a local-time midnight can be problematic for time zones with transitions near midnight. For example, in Santiago, Chile on 2019-09-08, the day started at 01:00 due to the start of DST. There was no 00:00 on that day.
Also, you tagged your question with momentjs. Since a Moment object is basically a timestamp (not a date), Moment.js will generally assign the start of the day when given a date-only value. The time zone involved is key to deciding which moment that actually is, which illustrates my prior points.
For example:
// Parsing as UTC
moment.utc('2019-09-08').format() //=> "2019-09-08T00:00:00Z"
// Parsing as Local Time (my local time zone is US Pacific Time)
moment('2019-09-08').format() //=> "2019-09-08T00:00:00-07:00"
// Parsing in a specific time zone (on a day without midnight)
moment.tz('2019-09-08', 'America/Santiago').format() //=> "2019-09-08T01:00:00-03:00"
Also keep in mind that sometimes APIs can be misnamed. The JavaScript Date object is not a date-only value, but actually a timestamp, just like a moment.
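For instance, in plain JavaScript (relying on the ECMAScript rule that a date-only ISO string is parsed as midnight UTC):
// Despite the name, a Date is a timestamp: a "date-only" string still
// resolves to one specific instant on the timeline.
new Date('2019-09-08').toISOString();   //=> "2019-09-08T00:00:00.000Z"
new Date('2019-09-08').getTime();       //=> 1567900800000 (a point in time, not a square on a calendar)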
I'm using this repo:
https://github.com/remirobert/Tempo
Can someone help me understand how to grab the current time zone of the device, and then tell Tempo about it? I am using Tempo's timeAgoNow() function to display how long ago the post was made, but the timezone difference is messing it up. My data source is using UTC time.
Cocoa uses UTC internally for all of its date/time calculations.
If you create an NSDate for now:
NSDate()
You get a date that represents the number of seconds elapsed since midnight on January 1, 1970, UTC.
Dates only have time zones when you display them.
By default logging a date to the console logs it in UTC, which can be confusing.
If I'm working on a project that does a lot of date/time calculations, I'll create a debugging method that converts an NSDate to a date/time string in the current locale and time zone; that is easier to read and debug because you don't have to mentally convert from UTC back to local time.
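A minimal sketch of such a helper in Swift (using the modern Date/DateFormatter names; the function name is just illustrative):
import Foundation
// Formats a Date in the device's current locale and time zone, which is
// easier to read than the UTC output you get from print/NSLog by default.
func localDebugString(for date: Date) -> String {
    let formatter = DateFormatter()
    formatter.dateStyle = .medium
    formatter.timeStyle = .medium
    formatter.timeZone = TimeZone.current   // local zone instead of UTC
    formatter.locale = Locale.current
    return formatter.string(from: date)
}
print(Date())                         // logged in UTC, e.g. "2019-09-08 19:30:00 +0000"
print(localDebugString(for: Date()))  // e.g. "Sep 8, 2019 at 12:30:00 PM" in local time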
I have never used Tempo, so I don't know if it is using date strings, NSDate, or "internet dates" (which are also in UTC, but use a different "zero date" or "epoch date").
I'm using com.google.gwt.i18n.client.timezone to try and display a date (as it is at the server), but GWT automatically adds the current timezone to the date when formatting it, meaning the wrong date is shown in different timezones.
To combat this, I'm sending the server's timezone offset to the client and using that when formatting.
I live in Australia and the current timezone is +11 GMT/UTC, but the default timezone being displayed when I format the date is -11 GMT.
The offset from the server is +11 hours (as it should be), but when I try and format the date with this offset, I get the wrong date, and so I need to use the negative offset instead.
Why is the default timezone wrong?
When you are getting a date (particularly if you're parsing a date) make sure you specify the timezone. GWT's DateTimeFormat.parse only supports "RFC format" timezones, something like -0800 for Pacific time. If your server is sending dates in strings to the client, make sure it includes the timezone in this format.
Then when you convert the date to a string to present it to the user, make sure you use the overload of DateTimeFormat.format that specifies a TimeZone and pass the timezone that you want the date to be presented in (the timezone of the server, in your case.)
By default dates are presented in the timezone that the user's system is set to. Setting the default timezone in GWT (so you can ignore timezones and do everything in the server's timezone) is an open issue (3489) at the time I write this.
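A rough sketch of both steps in client-side GWT code (the pattern, string literal, and offset are only examples; note that, as the question found, the offset passed to createTimeZone is inverted relative to the RFC sign, i.e. minutes behind GMT):
// imports: com.google.gwt.i18n.client.DateTimeFormat, com.google.gwt.i18n.client.TimeZone
// Parse a server string that carries an RFC-style offset, then format it
// back in the server's zone rather than the browser's.
DateTimeFormat fmt = DateTimeFormat.getFormat("yyyy-MM-dd HH:mm:ss Z");
Date date = fmt.parse("2020-03-05 09:05:23 +1100");
// GMT+11 is expressed as -660 minutes here (the sign flip described above).
TimeZone serverZone = TimeZone.createTimeZone(-11 * 60);
String display = fmt.format(date, serverZone);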
I just want to clarify if I am understanding how dates & time zones work.
Basically, I have a date string @"2008-07-06 12:08:49" that I want to convert to an NSDate. I want this date and time to be in whatever time zone the current user is set to. So if they are in GMT or HST, it's still 12:08:49.
If I have date in unix form 1215382129 (UTC) and my time zone is set to London (GMT), the outputted date from NSLog() is:
2008-07-06 12:08:49 +0100
If I then change my time zone to Hawaii (HST) and output the same date, I get:
2008-07-06 12:08:49 -1000
This seems to work fine, but I was under the impression that to get the time in Hawaii, I'd have to physically add the time difference (-10 hrs) to the unix timestamp. Is this not required, then?
Does that mean, whatever date and time a unix time is pointing to, it always points to the same date and time in whatever time zone a user is in?
Hope this makes sense!
Edit
I've just realised (thanks to Kevin Conner!) that in fact NSDateFormatter is creating different unix timestamps for that date string depending on the current timezone! So I was totally wrong!! :-)
Disclaimer: I'm mostly a Java guy, but Cocoa seems to work like the Java library in this regard: dates are zoneless timestamps, and time zones are in the domain of formatting dates for display. In other words, the internal format doesn't consider time zones; it's all in UTC. Time zones are a convenience for humans, so they belong on the display/parsing side.
I noticed there is a setTimeZone: method on NSDateFormatter. Try calling that on your formatter before performing the format.
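For example, a minimal sketch in Swift (where setTimeZone: is the timeZone property), assuming you want the string pinned to a fixed zone, UTC here:
import Foundation
// Pin the formatter to an explicit zone so the string-to-date conversion
// no longer depends on whatever zone the device happens to be in.
let formatter = DateFormatter()                     // NSDateFormatter in Objective-C
formatter.dateFormat = "yyyy-MM-dd HH:mm:ss"
formatter.timeZone = TimeZone(secondsFromGMT: 0)    // the setTimeZone: call from above
if let date = formatter.date(from: "2008-07-06 12:08:49") {
    print(date)                        // the same instant no matter the device's zone
    print(date.timeIntervalSince1970)  // so the unix timestamp is stable too
}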