Reputation: 2860
I have the following code:
const uint8 = new Uint8Array(buffer); // buffer is an existing ArrayBuffer
let uint8String = "";
for (const n of uint8) uint8String += n.toString(2).padStart(8, "0");
console.log(uint8String);
const uint32 = new Uint32Array(uint8.buffer);
let uint32String = "";
for (const n of uint32) uint32String += n.toString(2).padStart(32, "0");
console.log(uint32String);
Which logs:
10111001111000011100010000100001001111110101100001010101101010001011100001011101011000101000001000111110010110000101010110101000
00100001110001001110000110111001101010000101010101011000001111111000001001100010010111011011100010101000010101010101100000111110
These are different binary strings. What's the reason for the difference?
Upvotes: 0
Views: 251
Reputation: 180808
Because JavaScript typed arrays use the platform's native byte order, and Intel (x86) machines are little-endian.
Consider the first four bytes of your two arrays:
10111001 11100001 11000100 00100001 <-- Four unsigned bytes
00100001 11000100 11100001 10111001 <-- One unsigned int
Notice that the bytes are swapped end to end? The first array shows the bytes in the order they sit in memory, but the second array interprets each group of four bytes as one unsigned 32-bit integer in little-endian order, which means the Least Significant Byte (LSB) is stored at the lowest memory address.
Further Reading
Endianness on Wikipedia
Upvotes: 2