The DateTime created by DateTime.parse seems to always return 0 for "timeZoneOffset"
I create an ISO 8601 string here in a non-UTC time zone: https://timestampgenerator.com/1610010318/+09:00
I pass that string to DateTime.parse:
DateTime date = DateTime.parse("2021-01-07T18:05:18+0900");
Issue: timeZoneOffset is 0 when I was expecting +0900.
print(date.timeZoneOffset); --> 0:00:00.000000
print(date.timeZoneName); --> UTC
print(date); --> 2021-01-07 09:05:18.000Z
Dart DateTime documentation (https://api.dart.dev/stable/2.10.4/dart-core/DateTime/parse.html):
The result is always in either local time or UTC. If a time zone offset other than UTC is specified, the time is converted to the equivalent UTC time.
Why is timeZoneOffset 0? What string do I need to pass to DateTime.parse to get it to store the time in local time instead of converting it to the equivalent UTC time?
The Dart SDK does not really handle arbitrary time zones, which is why parse only returns times in either the local time zone (the time zone of the system running the program) or UTC.
If you try to parse a timestamp without any time zone information, Dart will assume the time is in the local time zone (I am in Denmark, which uses Romance Standard Time):
void main() {
  print(DateTime.parse("2021-01-07T18:05:18").timeZoneName); // Romance Standard Time
  print(DateTime.parse("2021-01-07T18:05:18+0900").timeZoneName); // UTC
}
You can convert your UTC timestamp into local time by calling .toLocal() on it. But again, this just converts it into the time zone of your own system, not the one from the timestamp you parsed:
void main() {
  print(DateTime.parse("2021-01-07T18:05:18+0900").toLocal().timeZoneName); // Romance Standard Time
}
If you want to handle time together with time zone data, you should look into the timezone package: https://pub.dev/packages/timezone
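As a rough sketch (the location Asia/Tokyo and the printed values below are only illustrative, not from the question), parsing to UTC and then projecting that instant into a named location with the timezone package could look like this:

import 'package:timezone/data/latest.dart' as tzdata;
import 'package:timezone/timezone.dart' as tz;

void main() {
  tzdata.initializeTimeZones(); // load the bundled time zone database

  // DateTime.parse still normalizes the offset away and gives us UTC...
  final utc = DateTime.parse("2021-01-07T18:05:18+0900"); // 2021-01-07 09:05:18.000Z

  // ...but a TZDateTime can express that same instant in a concrete location.
  final tokyo = tz.getLocation('Asia/Tokyo');
  final inTokyo = tz.TZDateTime.from(utc, tokyo);

  print(inTokyo);              // e.g. 2021-01-07 18:05:18.000+0900
  print(inTokyo.timeZoneName); // e.g. JST
}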
Some notes about saving timezone offset
You should be aware that in most cases it does not really make sense to save the time in a form where you can get the original offset back again. The problem is that most countries have rules like DST, or the user is traveling and expects the system to handle the time correctly.
So in a lot of cases the user does not really expect to get the same offset back, but wants the time expressed with the current offset for their location at the moment they look at it.
The timezone package, for example, does not allow you to parse a timestamp and save the offset together with it (because an offset is not the same as a location). Instead, it wants you to specify the location the timestamp belongs to, so it can calculate the correct offset for that location.
So in general, I recommend you always save time as UTC in storage. When the data is going to be used, you should have some way to know the location of the receiver (e.g. ask the user in some form of profile) and convert the time to that location using the timezone package. If the application is running on a device owned by the receiver of the data, you can convert the UTC time to local time using .toLocal().
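A minimal sketch of that flow (the variable names are only illustrative):

void main() {
  // On the sender: store the time as UTC.
  final utcForStorage = DateTime.now().toUtc().toIso8601String();

  // Later, on the receiver's own device: parse and convert to local time for display.
  final shown = DateTime.parse(utcForStorage).toLocal();
  print(shown); // rendered with the device's current time zone offset
}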
Related
I save my date as a local date, but when I read it back, it treats it as if it was a UTC date so it slips by several hours.
The dates are passed in as strings in the form '2020-03-05 09:05:23' as query parameters but when they are retrieved they might look like '2020-03-04 10:05:23' because I am 13 hours ahead of Greenwich.
For MariaDB (or MySQL):
Use DATETIME as a picture of the clock on the wall.
Use TIMESTAMP to adjust to the system's timezone.
Set the system's timezone according to where it lives in the world.
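A small illustration of the difference (hypothetical table and values, assuming a MariaDB/MySQL session where you can change time_zone):

CREATE TABLE t (dt DATETIME, ts TIMESTAMP);

SET time_zone = '+13:00';
INSERT INTO t VALUES ('2020-03-05 09:05:23', '2020-03-05 09:05:23');

SET time_zone = '+00:00';
SELECT dt, ts FROM t;
-- dt: 2020-03-05 09:05:23   (the wall-clock value, stored and returned verbatim)
-- ts: 2020-03-04 20:05:23   (converted to UTC on insert, shown in the new session time zone)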
Is there a standard for encoding a date as a timestamp? My thoughts:
This should be 12:00pm UTC, expressed in local time, e.g. 9:00am at UTC-3; therefore anyone consuming the timestamp, regardless of their -12/+12 offset, will recognize the same date, regardless of whether they parse it in UTC
It could be 12:00pm at UTC
It could be the start of the day (12:00am) at UTC
It could be the start of the day (12:00am UTC) expressed in local time, e.g. 9:00pm at UTC-3
Is there an official spec or standard to adhere to?
It would be easy to point to this document and say 'this is the standard' as opposed to being unaware and having to change our logic down the line.
There isn't a standard for this, because a date and a timestamp are logically two very different concepts.
A date covers the entire range of time on that day, not a specific point in time.
It may be a different date for a person in another time zone at any given point in time, but dates themselves do not have any association with time zones. Visualize a date as just a square on a calendar, not a point on a timeline.
Many APIs will use midnight (00:00) as the default time when a date-only value is assigned to a date+time value. However:
Whether it is UTC based or local-time based is very dependent on that particular API. There is no standard for this, nor is one answer necessarily better than the other.
Assigning a local-time midnight can be problematic for time zones with transitions near midnight. For example, in Santiago, Chile on 2019-09-08, the day started at 01:00 due to the start of DST. There was no 00:00 on that day.
Also, you tagged your question with momentjs. Since a Moment object is basically a timestamp (not a date), Moment.js will generally assign the start of the day if provided a date-only value. The time zone involved is key to deciding which moment that actually is, which illustrates my prior points.
For example:
// Parsing as UTC
moment.utc('2019-09-08').format() //=> "2019-09-08T00:00:00Z"
// Parsing as Local Time (my local time zone is US Pacific Time)
moment('2019-09-08').format() //=> "2019-09-08T00:00:00-07:00"
// Parsing in a specific time zone (on a day without midnight)
moment.tz('2019-09-08', 'America/Santiago').format() //=> "2019-09-08T01:00:00-03:00"
Also keep in mind that sometimes APIs can be misnamed. The JavaScript Date object is not a date-only value, but actually a timestamp, just like a moment.
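A quick illustration of that point (plain JavaScript; the values shown are what I would expect rather than quoted output):

// A Date built from a date-only string is still just a point in time.
const d = new Date('2019-09-08');   // date-only ISO strings are parsed as UTC midnight
console.log(d.getTime());           // 1567900800000 -- milliseconds since the epoch, i.e. a timestamp
console.log(d.toISOString());       // "2019-09-08T00:00:00.000Z"
console.log(d.toString());          // rendered in the local time zone; may show Sep 7 in zones west of UTC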
I am working on an application where I am facing an issue with time zones.
I want to convert UTC milliseconds to a UTC date object.
I already tried
TimeZone utcZone = TimeZone.getTimeZone("UTC");
Calendar date = Calendar.getInstance(utcZone);
date.setTimeInMillis(utcMillisecond);
date.getTime();
date.getTime() is still returning my local time zone, which is EST. I know that the milliseconds I am getting from the UI are UTC milliseconds.
The old java.util.Calendar and java.util.Date classes silently apply your JVM's current default time zone when you print them. You assumed the output would be in UTC, but it is not.
java.time
You are using old troublesome date-time classes that have been supplanted by the java.time framework in Java 8 and later.
I assume that by "UTC millisecond" you mean a count of milliseconds since the first moment of 1970 in UTC, 1970-01-01T00:00:00Z. That can be used directly to create a java.time.Instant, a moment on the timeline in UTC.
By the way, be aware that java.time has nanosecond resolution, much finer than milliseconds.
Instant instant = Instant.ofEpochMilli( yourMillisNumber );
Call toString to generate a String as a textual representation of the date-time value in a format compliant with the ISO 8601 standard. For example:
2016-01-23T12:34:56.789Z
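Putting it together, a minimal example (the millisecond value is chosen just to produce the output above):

import java.time.Instant;

public class UtcFromMillis {
    public static void main(String[] args) {
        long utcMillisecond = 1_453_552_496_789L; // count of milliseconds since 1970-01-01T00:00:00Z
        Instant instant = Instant.ofEpochMilli(utcMillisecond);
        System.out.println(instant); // 2016-01-23T12:34:56.789Z -- always rendered in UTC
    }
}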
I'm using this repo:
https://github.com/remirobert/Tempo
Can someone help me understand how to grab the current time zone of the device, and then notify Tempo? I am using the timeAgoNow() function of Tempo to display how long ago the post was made, but the time zone difference is messing it up. My data source is using UTC time.
Cocoa uses UTC internally for all of its date/time calculations.
If you create an NSDate for now:
NSDate()
You get a date that represents the number of seconds elapsed since midnight on January 1, 1970, UTC.
Dates only have time zones when you display them.
By default logging a date to the console logs it in UTC, which can be confusing.
If I'm working on a project that does a lot of date/time calculations, I'll create a debugging method that converts an NSDate to a date/time string in the current locale, which is easier to read and debug without having to mentally convert from UTC back to local time.
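Such a helper might look roughly like this (illustrative only; Tempo is not involved, and Swift's Date bridges to NSDate):

import Foundation

// Convert a Date to a human-readable string in the device's current time zone.
func localDebugString(_ date: Date) -> String {
    let formatter = DateFormatter()
    formatter.dateStyle = .medium
    formatter.timeStyle = .medium
    formatter.timeZone = TimeZone.current // the device's time zone, not UTC
    return formatter.string(from: date)
}

print(localDebugString(Date())) // e.g. "Jan 7, 2021 at 6:05:18 PM" in the local zone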
I have never used Tempo, so I don't know if it is using date strings, NSDate, or "internet dates" (which are also in UTC, but use a different "zero date" or "epoch date")
I'm using com.google.gwt.i18n.client.timezone to try and display a date (as it is on the server), but GWT automatically adds the current time zone to the date when formatting it, meaning the wrong date is shown in different time zones.
To combat this, I'm sending the server's timezone offset to the client and using that when formatting.
I live in Australia and the current timezone is +11 GMT/UTC, but the default timezone being displayed when I format the date is -11 GMT.
The offset from the server is +11 hours (as it should be), but when I try and format the date with this offset, I get the wrong date, and so I need to use the negative offset instead.
Why is the default timezone wrong?
When you are getting a date (particularly if you're parsing a date) make sure you specify the timezone. GWT's DateTimeFormat.parse only supports "RFC format" timezones, something like -0800 for Pacific time. If your server is sending dates in strings to the client, make sure it includes the timezone in this format.
Then when you convert the date to a string to present it to the user, make sure you use the overload of DateTimeFormat.format that specifies a TimeZone and pass the timezone that you want the date to be presented in (the timezone of the server, in your case.)
By default dates are presented in the timezone that the user's system is set to. Setting the default timezone in GWT (so you can ignore timezones and do everything in the server's timezone) is an open issue (3489) at the time I write this.
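A rough sketch of that combination (the pattern, sample offset, and class name are illustrative, not from the question); note that createTimeZone takes the offset in minutes using the JavaScript Date.getTimezoneOffset() convention (positive means behind GMT), which is the sign flip described above:

import java.util.Date;
import com.google.gwt.i18n.client.DateTimeFormat;
import com.google.gwt.i18n.client.TimeZone;

public class ServerDateRenderer {
    // Format a server-supplied date string in the server's +11:00 zone,
    // instead of the browser's default time zone.
    public static String renderInServerZone(String serverDateString) {
        DateTimeFormat fmt = DateTimeFormat.getFormat("yyyy-MM-dd HH:mm:ss Z");
        Date date = fmt.parse(serverDateString); // e.g. "2021-01-07 18:05:18 +1100"

        TimeZone serverZone = TimeZone.createTimeZone(-11 * 60); // +11:00 expressed as -660 minutes
        return fmt.format(date, serverZone);
    }
}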