Mutu Yolbulan

Reputation: 1052

How many bits does JavaScript use to represent a number?

Upvotes: 2

Views: 1776

Answers (3)

HBP

Reputation: 16033

From the referenced spec:

The Number type has exactly 18437736874454810627 (that is, 2⁶⁴−2⁵³+3) values, representing the double-precision 64-bit format IEEE 754 values as specified in the IEEE Standard for Binary Floating-Point Arithmetic, except that the 9007199254740990 (that is, 2⁵³−2) distinct "Not-a-Number" values of the IEEE Standard are represented in ECMAScript as a single special NaN value. (Note that the NaN value is produced by the program expression NaN.) In some implementations, external code might be able to detect a difference between various Not-a-Number values, but such behaviour is implementation-dependent; to ECMAScript code, all NaN values are indistinguishable from each other.
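A quick illustration of that last point: even though IEEE 754 defines many NaN bit patterns, ECMAScript code only ever sees a single NaN value, and it never compares equal to anything, including itself:

NaN === NaN;          // false -- NaN never compares equal, even to itself
Number.isNaN(NaN);    // true
Object.is(NaN, NaN);  // true -- Object.is treats all NaN values as the same value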

That said, be aware that when using the bitwise operators &, ^, >>, <<, etc., only the least significant 32 bits of each operand are used, and the result is converted to a signed 32-bit value.
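For example (a quick sketch you can paste into a browser or Node console):

(Math.pow(2, 32) + 5) | 0;   // 5  -- only the low 32 bits survive the ToInt32 conversion
0xFFFFFFFF | 0;              // -1 -- the result is interpreted as a signed 32-bit integer
Math.pow(2, 31) | 0;         // -2147483648
-1 >>> 0;                    // 4294967295 -- >>> is the one operator that yields an unsigned result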

Upvotes: 1

Gabe

Reputation: 86718

Generally JS implementations use 64-bit double-precision floating-point numbers. Bitwise operations are performed on 32-bit integers.
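A rough sketch of both representations in action (the exact console output may vary by engine, but the values themselves follow from the 64-bit double and 32-bit integer behaviour described above):

0.1 + 0.2;                                  // 0.30000000000000004 -- 64-bit double rounding
Number.MAX_SAFE_INTEGER;                    // 9007199254740991 (2^53 − 1), largest exactly representable integer (ES2015)
Math.pow(2, 53) + 1 === Math.pow(2, 53);    // true -- integers above 2^53 lose precision
Math.pow(2, 31) | 0;                        // -2147483648 -- bitwise operators work on signed 32-bit integers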

Upvotes: 4

Quentin

Reputation: 943569

That depends on the specific implementation, not the language itself.

If you want to know what range of numbers is supported, then see section 8.5 (The Number Type) of the specification.
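For reference, the limits described in that section are also exposed as properties of Number:

Number.MAX_VALUE;          // 1.7976931348623157e+308 -- largest representable magnitude
Number.MIN_VALUE;          // 5e-324 -- smallest positive representable value
Number.MAX_SAFE_INTEGER;   // 9007199254740991 -- largest integer with an exact representation (ES2015)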

Upvotes: 3
