Reputation: 1146
I was looking at Number.isInteger() on MDN. I came upon this example:
Number.isInteger(5.000000000000001); // false
I changed the value to 5.0000000000000001, and when I execute the same method in the console, the result is:
Number.isInteger(5.0000000000000001); // true
Also, 5.0000000000000001 === 5 comes out to be true.
Can someone please explain what's happening here?
Upvotes: 3
Views: 311
Reputation: 222273
JavaScript uses the IEEE-754 basic 64-bit binary floating-point format. In this format, every finite number is expressed as an integer M times a power of two, 2^E. The magnitude of M must be less than 2^53, and E may be from −1074 to 971. (The more common description scales M and E differently, so that M has fraction bits instead of being an integer, but this is mathematically equivalent.)
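As a quick sanity check of that 2^53 limit, you can run the following in any JavaScript console (these are standard built-ins, nothing engine-specific):

Number.MAX_SAFE_INTEGER === 2 ** 53 - 1; // true — the largest integer M the format can hold is 2^53 − 1
2 ** 53 === 2 ** 53 + 1; // true — beyond 2^53, consecutive integers are no longer all representable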
For the number 5.000000000000001, the most precise representation uses E = −50, because this makes M as large as will fit, about 5•2^50 = 1.25•2^52. If we made E one smaller, M would have to be about 1.25•2^53, which exceeds the 2^53 limit.
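For illustration, that comparison can be checked directly in the console (both products below are exact, since they fit well within a double):

5 * 2 ** 50 < 2 ** 53; // true  — M fits when E = −50
5 * 2 ** 51 < 2 ** 53; // false — with E one smaller, M would be too large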
So, given E = −50, we want 5.000000000000001 = M•2^−50, which means M = 5.000000000000001•2^50 = 5•2^50 + .000000000000001•2^50. Obviously the first term, 5•2^50, is an integer. The second term, .000000000000001•2^50, is about 1.1259. So, when converting 5.000000000000001 to JavaScript’s Number format, we need to make M an integer, so we have to round 1.1259 to 1. The result is M = 5•2^50 + 1, so the number is (5•2^50 + 1)•2^−50 = 5.00000000000000088817841970012523233890533447265625.
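One way to see both M and the exact stored value is in the console: multiplying by 2^50 only changes the exponent, so it is exact, and toFixed(50) has enough digits to print the full decimal expansion.

5.000000000000001 * 2 ** 50; // 5629499534213121, i.e. 5•2^50 + 1 — this is M
(5.000000000000001).toFixed(50); // "5.00000000000000088817841970012523233890533447265625"
Number.isInteger(5.000000000000001); // false — the stored value is not an integer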
On the other hand, when we have 5.0000000000000001, with one more zero, we have 5.0000000000000001•2^50 = 5•2^50 + .0000000000000001•2^50. In this case, the second term, .0000000000000001•2^50, is about .11259. So, when converting 5.0000000000000001 to JavaScript’s Number format, we have to round .11259 to 0, so the number is (5•2^50)•2^−50 = 5.
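Again, a quick console check confirms that the rounded result is exactly the integer 5:

5.0000000000000001 === 5; // true — both literals produce the same stored value
Number.isInteger(5.0000000000000001); // true
(5.0000000000000001).toFixed(20); // "5.00000000000000000000"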
Upvotes: 3