Rachel

Reputation: 132618

Why does JavaScript evaluate a 2-digit year of 00 as 1900 instead of 2000?

I have an old web app where JavaScript is used to validate some dates. Users usually enter 2-digit years, and I recently discovered that it was evaluating 00 as 1900 instead of 2000:

// tb[0] and tb[1] are the start-date and end-date text boxes
if (new Date(tb[0].value) > new Date(tb[1].value)){
    alert('Starting date must come before the ending date');
    tb[0].focus();
    return false;
}

Entering 1/1/99 in the first box and 1/1/00 in the second triggers the error message saying the start date has to come before the end date, because 99 evaluates to 1999 while 00 evaluates to 1900.
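
The mapping is easy to reproduce. The numeric Date constructor always treats years 0 through 99 as 1900-1999 (that part is specified), while string parsing varies by engine, which is presumably what my users are hitting:

new Date(99, 0, 1).getFullYear();  // 1999
new Date(0, 0, 1).getFullYear();   // 1900

// String parsing is engine-dependent: some engines read '1/1/00' as 1900,
// others read it as 2000.
new Date('1/1/00').getFullYear();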

Of course, users can work around this by entering 4-digit years, but I still want to know what can be done to get JavaScript to evaluate 2-digit years correctly.

So my question is: how can I get JavaScript to evaluate 00 as 2000 and not 1900?

Upvotes: 22

Views: 19262

Answers (5)

tstrand66

Reputation: 968

Is there a reason you couldn't do something along these lines? The big assumption is that if the user is entering a 2-digit year, it's probably not intended to be more than 100 years in the past.

myDate('2-1-00');

function myDate(date) {
  let today = new Date();
  // convert '2-1-00' into '2/1/00' so the Date parser accepts it
  date = new Date(date.split('-').join('/'));
  // if the parsed year is a century or more in the past, bump it forward
  if ((today.getFullYear() - date.getFullYear()) >= 100) {
    date.setFullYear(date.getFullYear() + 100);
  }
  alert(date);
}

Upvotes: 0

ERR

Reputation: 51

Chrome actually handles this correctly, but IE and Firefox (at least) do not. Here's my solution:

var x = new Date(input);                // parse the date initially

if (x != "Invalid Date") {
    var n = input.split(/[/-]/);        // split on / or - delimited dates

    if (n[2].length == 2) {             // if the input has a 2-digit year
        var y = x.getFullYear();
        if (y < 1950)                   // and the parser decided it's before 1950
            x.setFullYear(y + 100);     // add a century
    }
}

output = dateToShortString(x);          // custom function to standardize formatting

Upvotes: 5

Matt H

Reputation: 6530

The way I've done this in the past is to pick an arbitrary pivot year: two-digit years at or above the pivot are assumed to be in the 1900s, while years below it are in the 2000s. For an accounting app I had to make Y2K compliant, if I recall correctly, I chose 1940, so transactions in 40-99 were treated as 1940-1999 and transactions in 00-39 as 2000-2039.
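
A minimal sketch of that windowing, with a hypothetical expandYear helper and the pivot of 40 from the example:

// Sketch of the pivot-year windowing described above; expandYear is a
// hypothetical name and 40 is the pivot from the accounting app example.
function expandYear(twoDigitYear, pivot) {
  pivot = pivot === undefined ? 40 : pivot;
  return twoDigitYear >= pivot ? 1900 + twoDigitYear : 2000 + twoDigitYear;
}

expandYear(99); // 1999
expandYear(40); // 1940
expandYear(39); // 2039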

Upvotes: 1

Vala

Reputation: 5674

The simplest way is just to accept that it parses dates this way and check for it:

if (date.getFullYear() < 1970) {
    date.setFullYear(date.getFullYear() + 100);
}

1970 is of course just an example value; you have to pick a sensible break point. You may want to compute it as the current year minus some offset instead of using a constant.
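
For example, a sliding cutoff based on the current year might look like this (just a sketch; the 50-year window and the fixTwoDigitYear name are assumptions, not anything standard):

// Sketch: a date parsed more than 50 years in the past is assumed to be a
// two-digit year that really belongs in the following century.
function fixTwoDigitYear(date) {
    var cutoff = new Date().getFullYear() - 50;
    if (date.getFullYear() < cutoff) {
        date.setFullYear(date.getFullYear() + 100);
    }
    return date;
}

fixTwoDigitYear(new Date(0, 0, 1)).getFullYear(); // 1900 becomes 2000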

Upvotes: 23

Pointy

Reputation: 413915

It does that because the language was created in the 1990s (and in a hurry). You can use getFullYear() and setFullYear() to handle years in a non-goofy way.

What I've done is write some code to check for year values less than 100; if the value is greater than 90 (or something similarly appropriate, depending on the situation), assume it's in the 20th century, otherwise assume the 21st.
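
In rough form, that check looks something like this (a sketch; normalizeYear and the cutoff of 90 are just illustrative):

// Sketch: values under 100 are treated as two-digit years; anything above
// the cutoff (90 here) lands in the 1900s, the rest in the 2000s.
function normalizeYear(year) {
  if (year < 100) {
    return year > 90 ? 1900 + year : 2000 + year;
  }
  return year;
}

normalizeYear(95);   // 1995
normalizeYear(5);    // 2005
normalizeYear(1987); // 1987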

And @Rachel, no, there's no way to tell the runtime library to behave differently, at least not in any standardized way. That's just how the Date code works.

Upvotes: 22
