Reputation: 6806
Something weird is going on with uint32 math:
a = new Uint32Array([1103515245, 1103527590, 0]);
a[2] = a[0] * a[1];
console.log(a[2]);
This gives 2524872960, which I think is wrong, because in C:
#include <stdio.h>

int main() {
    unsigned a[3] = { 1103515245, 1103527590, 0 };
    a[2] = a[0] * a[1];
    printf("%u\n", a[2]);
    return 0;
}
this gives 2524872878 (and so does Windows Calculator).
So what's up with that? I'm using Firefox 45.0.1
EDIT: Oh and if that is expected, then how do I duplicate the "C" result?
Upvotes: 2
Views: 544
Reputation: 288280
That's because JS doesn't have an integer type, so * does the multiplication in 64-bit floats. But those don't have enough precision for this result:
/* Exact math */ 1103515245 × 1103527590 = 1217759518843109550
/* JavaScript */ 1103515245 * 1103527590 === 1217759518843109600
Number.MAX_SAFE_INTEGER === 9007199254740991
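As a sketch of how to check this yourself in modern JavaScript (BigInt did not exist in Firefox 45, so this is only for illustration): BigInt does exact integer arithmetic, so you can compute the true product and reduce it modulo 2^32 by hand:

```javascript
// Exact integer math with BigInt (modern JS only, not Firefox 45):
const exact = 1103515245n * 1103527590n;
console.log(exact);                       // 1217759518843109550n, the exact product
console.log(Number(exact % 4294967296n)); // 2524872878, the C / uint32 result
```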
If you want to do 32-bit integer multiplication, use Math.imul:

The Math.imul() function returns the result of the C-like 32-bit multiplication of the two parameters.
a[2] = Math.imul(a[0], a[1]); // returns the signed 32-bit value -1770094418
a[2]; // 2524872878, because the Uint32Array stores the bit pattern as unsigned
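If you are not storing into a Uint32Array, the same reinterpretation can be done with the unsigned right-shift operator. A minimal sketch, with no typed array involved:

```javascript
// Math.imul truncates the product to a signed 32-bit integer; ">>> 0"
// reinterprets that bit pattern as unsigned, which is exactly what
// assigning into a Uint32Array element does implicitly.
var signed = Math.imul(1103515245, 1103527590); // -1770094418
var unsigned = signed >>> 0;                    // 2524872878, matches the C result
console.log(unsigned);
```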
Upvotes: 4