Reputation: 12129
When I try to Date.parse() an integer or string 0, it returns 946681200000, which translates to a date of:
Sat Jan 01 2000 00:00:00 GMT+0100 (CET)
I would assume that the parser interprets the single zero as the year 2000, but the specs say nothing about a single-character year: both RFC 2822 and ISO 8601 require a four-character year in the string.
I would like to better understand how the string '0' is parsed into a Date: why is it accepted as a valid Date (should it not be NaN or some such?), and why is the year 2000 chosen instead of, for example, 1900?
Update
After some trial & error, I discovered that the single number is in fact interpreted differently in different numeric ranges.
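For reference, this range-dependent behavior can be reproduced in a V8-based environment (Chrome's console or Node); the results below are engine-specific, since other engines such as Firefox return NaN for the bare numeric strings:

```javascript
// Engine-specific (V8): bare numeric strings fall back to a legacy parser.
new Date('0').getFullYear();    // 2000 in V8
new Date('70').getFullYear();   // 1970 in V8
new Date('2015').getFullYear(); // 2015 everywhere (four-digit ISO year)
```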
Upvotes: 6
Views: 771
Reputation: 415
As Bergi correctly points out, the spec leaves it up to the implementation to return a date when the string is not in one of the standard formats.
Here's how it's implemented in Chromium's V8 engine:
if (!is_iso_date_) {
  // Two-digit years: 0-49 map to 2000-2049, 50-99 map to 1950-1999.
  if (Between(year, 0, 49)) year += 2000;
  else if (Between(year, 50, 99)) year += 1900;
}
On Chrome 41.0.2272.76:
Date.parse(0) returns 946665000000, which is Sat Jan 01 2000 00:00:00
Date.parse(49) returns 2493052200000, which is Fri Jan 01 2049 00:00:00
Date.parse(50) returns -631171800000, which is Sun Jan 01 1950 00:00:00
(The translated times are GMT+5:30; the millisecond values will change depending on your timezone.)
Firefox returns NaN for all of these cases.
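Given this inconsistency, the only portable approach is to pass a full ISO 8601 string with an explicit offset, which every conforming engine must parse identically (the value below is the UTC epoch time for the year-2000 boundary):

```javascript
// ISO 8601 with an explicit UTC offset parses the same in all engines.
Date.parse('2000-01-01T00:00:00Z'); // 946684800000 in every conforming engine
Date.parse('0');                    // engine-specific: a number in V8, NaN in Firefox
```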
Upvotes: 0
Reputation: 665122
the specs say nothing about single-character year definition
The spec says:
If the String does not conform to that format the function may fall back to any implementation-specific heuristics or implementation-specific date formats.
For V8 specifically, see this bug report on unpredictable results when called with a single number. You can also read the source directly (dateparser.cc, dateparser.h, dateparser-inl.h).
Upvotes: 8