Reputation: 245
I am currently trying to get familiar with JavaScript by solving Project Euler tasks. In the following snippet, I am trying to get the sum of all even Fibonacci numbers below 4,000,000.
var sum = 0;
var fibNums = [1, 2];
for (var i = 2; fibNums[i] < 4000000; i++) {
    fibNums[i] = fibNums[i - 1] + fibNums[i - 2];
    sum += fibNums[i] % 2 == 0 ? fibNums[i] : 0;
}
console.log(sum);
My problem is that fibNums[i] < 4000000 evaluates to false, even on the first run. Why is that?
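For reference, a quick log of what the condition sees on the first iteration (using the snippet above) shows:

console.log(fibNums[2]);           // undefined
console.log(fibNums[2] < 4000000); // false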
Upvotes: 0
Views: 67
Reputation: 3437
var sum = 2; // start with 2: the loop below never visits the initial even Fibonacci number
var fibNums = [1, 2, 0]; // trailing 0 placeholder so fibNums[2] is defined on the first check
for (var i = 2; fibNums[i] < 4000000; i++) {
    fibNums[i] = fibNums[i - 1] + fibNums[i - 2]; // overwrite the placeholder with the next Fibonacci number
    sum += fibNums[i] % 2 == 0 ? fibNums[i] : 0;
    fibNums.push(fibNums[i]); // duplicate the latest value so the next condition check is defined; it is overwritten next iteration
}
alert(sum); // 4613732
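If you don't actually need the whole array, the same sum can be computed with two rolling variables; a minimal alternative sketch (not part of the answer above):

var a = 1, b = 2, sum = 0;
while (b < 4000000) {
    if (b % 2 === 0) sum += b; // only even Fibonacci numbers count toward the sum
    var next = a + b;          // advance the pair (a, b) one Fibonacci step
    a = b;
    b = next;
}
console.log(sum); // 4613732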
Upvotes: 0
Reputation: 145458
In short: fibNums[2] === undefined, whereas undefined < 4000000 === false.

Longer: You start your for loop with var i = 2, while the fibNums array contains only 2 elements and its indexes start at 0. Hence, fibNums[2] will be undefined. In JavaScript, undefined is never less than a number: in a numeric comparison undefined is coerced to NaN, and every comparison involving NaN evaluates to false.
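This is easy to verify in the console; a quick illustrative check (not from the original answer):

console.log(Number(undefined));   // NaN: undefined coerces to NaN in numeric comparisons
console.log(NaN < 4000000);       // false
console.log(NaN > 4000000);       // false: every comparison involving NaN is false
console.log(undefined < 4000000); // false, which is why the loop body never runs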
Upvotes: 4