Reputation: 148744
I know there is a trick to turn a decimal number into an integer using x|0, for example:

3.14|0
--> 3

There is no rounding involved here, only truncation:

3.999|0
--> 3
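For reference, a quick console sketch of the truncation behavior (note that |0 truncates toward zero, and only works for values that fit in a signed 32-bit integer):

```javascript
// |0 truncates toward zero for values in the int32 range
console.log(3.14 | 0);        // 3
console.log(3.999 | 0);       // 3
console.log(-3.999 | 0);      // -3 (toward zero, not floor)
// Outside the signed 32-bit range the result wraps around:
console.log(2147483648 | 0);  // -2147483648
```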
Question
However: why does

9.99999999999|0

yield 9, while

9.9999999999999999|0

yields 10?
Upvotes: 0
Views: 83
Reputation: 5105
I would need to look at the Firefox source code to be sure; nonetheless, it is interesting to try some samples:
>>> 9.99999999999999999.toString(2)
"1010"
>>> 9.9999999999999999.toString(2)
"1010"
>>> 9.999999999999999.toString(2)
"1001.1111111111111111111111111111111111111111111111111"
>>> 9.999999999999999.toString(2).length
54
>>> 9.99999999999999.toString(2)
"1001.111111111111111111111111111111111111111111111101"
>>> 9.99999999999999.toString(2).length
53
Rough estimate from the lengths above: about 53 significant bits fit in the mantissa (the string of length 54 includes the binary point), which leaves 11 bits for the exponent and 1 for the sign in a 64-bit double.
Update: I found an interesting entry in the comp.lang.javascript FAQ, see http://www.jibbering.com/faq/#FAQ4_7 It confirms that the precision of a double is limited to 53 bits.
I think the precision is already exhausted when the literal is parsed into a 64-bit double, before the bitwise OR converts the value to a 32-bit int.
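The parse-time rounding can be checked directly; this sketch assumes a standard engine using IEEE 754 doubles:

```javascript
// The literal is rounded to the nearest representable double at parse time,
// before the bitwise OR ever runs:
console.log(9.99999999999 === 10);       // false: still below 10
console.log(9.9999999999999999 === 10);  // true: rounded up to exactly 10
console.log(9.99999999999 | 0);          // 9: ToInt32 truncates toward zero
console.log(9.9999999999999999 | 0);     // 10: the double was already 10
```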
Upvotes: 0
Reputation: 27853
9.9999999999999999
has too many significant digits to be stored exactly in JavaScript's number representation (an IEEE 754 double), so the literal is rounded to the nearest representable value, which is exactly 10. You can test this:
9.9999999999999999 === 10
will be true
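One way to see the rounding explicitly, again assuming IEEE 754 doubles, is to print more digits than the literal can actually hold:

```javascript
// Doubles near 10 are spaced about 2^-49 ≈ 1.8e-15 apart, so any literal
// closer to 10 than half that gap is parsed as exactly 10.
console.log(9.9999999999999999.toPrecision(20)); // same digits as 10
console.log(9.999999999999999.toPrecision(20));  // nearest double just below 10
console.log(9.9999999999999999 === 10);          // true
console.log(9.999999999999999 === 10);           // false
```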
Upvotes: 2