Reputation: 47290
Looking over the previous questions and answers, it appeared this should work:
var palindrome = new Date('2011-11-11');
var december = new Date('2011-11-11');
december.setDate(palindrome.getDate()+20);
// should be December, but in fact it loops back over to Nov 1st
my jsFiddle
Is there a simple way to ensure that months are incremented correctly, or have I missed something obvious?
Upvotes: 1
Views: 14808
Reputation: 79032
Your code is correct; however, you are converting it to a string incorrectly.
getMonth()
starts with 0 as January, and ends with 11 as December. So all you need to do is add 1 to the month like this:
alert(endDate.getFullYear() + "-" + (endDate.getMonth()+1) +"-"+ endDate.getDate());
Notice the additional parentheses: you are performing a math operation while concatenating strings, and you don't want to end up with "101" as the month.
To see whether you got the date correct, use endDate.toDateString()
to display the date with its fully qualified month name (January through December).
alert(endDate.toDateString());
For more info on the Date object, check out this section in w3schools
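To make the off-by-one concrete, here is a small sketch of both versions of the string concatenation described above (variable names are illustrative, not from the original fiddle):

```javascript
// Months are zero-based, so 11 means December.
var endDate = new Date(2011, 11, 1);

// Without parentheses, + is evaluated left to right as string
// concatenation, so the "+ 1" is appended rather than added.
var wrong = endDate.getFullYear() + "-" + endDate.getMonth() + 1 + "-" + endDate.getDate();

// With parentheses, the month is corrected to 1-based first.
var right = endDate.getFullYear() + "-" + (endDate.getMonth() + 1) + "-" + endDate.getDate();

console.log(wrong); // "2011-111-1"
console.log(right); // "2011-12-1"
```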
Upvotes: 1
Reputation: 214
You could do it like this:
var dayOffset = 20;
var millisecondOffset = dayOffset * 24 * 60 * 60 * 1000;
december.setTime(december.getTime() + millisecondOffset);
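Putting that together with the dates from the question, a runnable sketch might look like this (the noon start time is my addition, to keep the result on the same calendar day even if a DST transition falls inside the 20-day window):

```javascript
// Start at noon on Nov 11, 2011 (months are zero-based, so 10 = November).
var december = new Date(2011, 10, 11, 12);

// Advance by 20 days' worth of milliseconds.
var dayOffset = 20;
var millisecondOffset = dayOffset * 24 * 60 * 60 * 1000;
december.setTime(december.getTime() + millisecondOffset);

console.log(december.toDateString()); // December 1, 2011
```

One caveat with this approach: a day is not always exactly 24 hours when clocks change for DST, so pure millisecond arithmetic can land an hour off; setDate() works in calendar days and avoids that.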
Upvotes: 6
Reputation: 532465
The getMonth()
call returns a value between 0 and 11, where 0 is January and 11 is December, so 10 means November. You need to increment the value by 1 when using it in a string. If you simply output the date itself as a string, you'll see that it holds the correct date. Note that I also had to change the starting date format: it didn't seem to like 2011-11-11
, so I made it 11/11/2011
. http://jsfiddle.net/9HLSW/
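The two formats are also parsed differently, which may explain the original behaviour. In modern engines the ISO form 'YYYY-MM-DD' is parsed as UTC midnight, while the slash form is parsed as local time; in a timezone behind UTC, the ISO date's local day is therefore Nov 10, not Nov 11. A quick sketch (assuming a current engine; at the time of this answer some engines rejected the ISO form entirely):

```javascript
var iso = new Date('2011-11-11');   // parsed as UTC midnight
var slash = new Date('11/11/2011'); // parsed as local midnight

// The ISO date is Nov 11 in UTC, but may be Nov 10 locally.
console.log(iso.getUTCMonth(), iso.getUTCDate());  // 10 11
console.log(slash.getMonth(), slash.getDate());    // 10 11
```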
Upvotes: 4