Beginning in Chrome 67 (released on 29 May 2018), the handling of timezones in the JavaScript Date object changed. Historical dates now have a historically accurate timezone offset applied to them, which means that if you were supplying a UTC date/time to one of the new Date() overloads and then retrieving local time, the value you get back may have changed from Chrome 66 to Chrome 67.
For example, with the machine timezone set to London, if you evaluate...
new Date(0).getHours()
- 0 ... on Chrome 66
- 1 ... on Chrome 67
The 0 here is treated as a millisecond offset from the Unix epoch (1970-01-01T00:00:00Z), so the Date is holding a value of 1970-01-01T00:00:00Z.
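You can confirm what the Date is holding in any timezone, because toISOString() always reports UTC:

new Date(0).toISOString() // "1970-01-01T00:00:00.000Z" in any timezone and any Chrome version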
getHours() returns that value adjusted to the local timezone. Chrome 66 applied today's timezone offset rules to the historical date, and because London is at UTC+0 on 1st January under today's rules, the result was still 0.
Chrome 67 instead applies the offset that was actually in force on that date in the current system timezone. Unusually, in 1969 and 1970 the UK didn't switch back from daylight saving time and stayed at UTC+1 for the whole year, so the local time was 1970-01-01T01:00:00+01:00 and getHours() returns `1`.
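You can see the historical rules being applied directly via getTimezoneOffset(), which returns the number of minutes you'd add to local time to reach UTC (so UTC+1 shows as -60). With the machine timezone still set to London:

new Date(0).getTimezoneOffset()
- 0 ... on Chrome 66 (today's rules applied to 1970)
- -60 ... on Chrome 67 (the UTC+1 actually in force on 1st January 1970)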
A common use of the new Date(milliseconds) constructor overload is formatting a duration from a small number of milliseconds, e.g. "01:30:38 remaining", discarding the date part altogether. A similar problem to this was highlighted by Rik Driever in his post on the change [1].
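Here's a sketch of that fragile pattern (the formatDuration name and the 5438000 ms figure are mine, for illustration, not from Rik's post):

```js
// Fragile duration formatter: reads the Date back with the local-time getters
const pad = n => String(n).padStart(2, '0')
const formatDuration = ms => {
  const d = new Date(ms) // holds 1970-01-01T00:00:00Z plus the duration
  return `${pad(d.getHours())}:${pad(d.getMinutes())}:${pad(d.getSeconds())} remaining`
}

formatDuration(5438000)
// London, Chrome 66: "01:30:38 remaining"
// London, Chrome 67: "02:30:38 remaining", the historical UTC+1 leaks into the duration
```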
It was Rik's post that led me to the Chromium change which introduced this new behaviour [2] and the related change to the ECMA standard [3].
In his post Rik concludes that his issue is down to an incompatibility between JavaScript and .NET's JavaScriptSerializer, and he attempts various workarounds to account for the offset being applied to the Date object, without much success.
In fact, JavaScript and .NET are working together fine, and there are two easy ways to get your intended value back out of the Date.
Option 1 - Use the getUTC* methods instead
The millisecond value we're passing in is UTC, and what we really expect to get out is also UTC, so we should use the getUTC* methods instead, e.g. getUTCHours(), getUTCMinutes(). The fact that getHours() returned the value we expected in Chrome 66 and earlier was a coincidence; we should never have been using getHours() in the first place.
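Applied to the hypothetical formatter sketched above, the fix is a one-line change per getter:

```js
const pad = n => String(n).padStart(2, '0')
const formatDuration = ms => {
  const d = new Date(ms)
  // Read the value back in UTC, matching the UTC-based input
  return `${pad(d.getUTCHours())}:${pad(d.getUTCMinutes())}:${pad(d.getUTCSeconds())} remaining`
}

formatDuration(5438000) // "01:30:38 remaining" in any timezone and any Chrome version
```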
Code coincidentally giving you the right value, so that you assume it's correct, is a very common cause of bugs, and this is a great example. It's also a great example of why you should use constants as the expected values in unit tests: if you used a value returned by new Date(blah).getHours() as your expected value, your test would still pass.
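To make that concrete, here's a minimal test sketch using Node's assert against the getHours()-based formatDuration from earlier (again, the names and figures are my own illustration):

```js
const assert = require('assert')
// (formatDuration here is the getHours()-based sketch from earlier)

// Bad: the expected value is computed with the same API as the code under
// test, so both shift together across the Chrome change and the test
// passes on Chrome 66 and Chrome 67 alike, hiding the bug.
const computed = `${String(new Date(5438000).getHours()).padStart(2, '0')}:30:38 remaining`
assert.strictEqual(formatDuration(5438000), computed)

// Good: a hard-coded constant fails under the new historical-offset rules
// (in a timezone with a history like London's) and exposes the incorrect
// use of getHours().
assert.strictEqual(formatDuration(5438000), '01:30:38 remaining')
```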
Option 2 - Initialise the Date with a local time
If you want to keep using the local-offset methods of the Date, e.g. getHours(), getMinutes(), then you can initialise the date slightly differently to get the result you expect:
new Date(1970, 0, 1, 0, 0, 0, milliseconds)
This overload of new Date() expects a local time, so doing this instead initialises the date to midnight local time, and the constructor gracefully handles a millisecond value greater than 999 by incrementing the other parts of the date by the correct amount. In this case the Date being held is 1970-01-01T00:00:00±<offset> plus the duration, so getHours() returns the hour component of the duration for any system timezone and any Chrome version.
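With the same illustrative duration as before:

```js
const pad = n => String(n).padStart(2, '0')
const milliseconds = 5438000 // 01:30:38 as a duration
const d = new Date(1970, 0, 1, 0, 0, 0, milliseconds) // local midnight plus the duration

console.log(`${pad(d.getHours())}:${pad(d.getMinutes())}:${pad(d.getSeconds())} remaining`)
// "01:30:38 remaining" for any system timezone and any Chrome version
```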
- [1] Rik Driever's post
  https://medium.com/@rikdriever/javascript-date-issue-since-chrome-67-50aa555799d0
- [2] Implement a new spec for timezone offset calculation
  https://chromium-review.googlesource.com/c/v8/v8/+/572148
- [3] The ECMA spec change
  https://github.com/tc39/ecma262/pull/778