Reputation: 149
I am trying to make a typed array from a SharedArrayBuffer.
let count = 4500;
let sab = new SharedArrayBuffer(count);
let arr = new Int16Array(sab);
Now, while running the code, I noticed that not all of the elements of the array were being processed. When I console logged the lengths of the array, I got the following output:
// when arr = new Int8Array(sab);
>> 4500
// when arr = new Int16Array(sab);
>> 2250
//when arr = new Int32Array(sab);
>> 1125
If it's not an Int8Array, it gives the wrong length. I also tried this with different values of count and got the wrong lengths for all of them. I am not getting any errors in the console either.
Upvotes: 0
Views: 377
Reputation: 6119
This is exactly how the Int8Array, Int16Array, Int32Array, and Float64Array classes are designed to work. You can see in the image on MDN's docs on Typed Arrays that these arrays are all views over what is called an ArrayBuffer (in your case, a SharedArrayBuffer).
The length you pass to the SharedArrayBuffer constructor is a number of bytes, and each step up in element type uses elements of twice the size, 16 bits vs. 8 bits and so on, and therefore produces half the length. An Int8Array element is exactly one byte, so when you create a buffer of 4500 bytes, Int8Array will reflect that same number, but each larger element type (Int16Array, Int32Array, and Float64Array) halves the count with each step, because their elements take up 2, 4, and 8 bytes respectively.
You can see this in action, including the Float64Array case, here:
let count = 9000;
let sab = new SharedArrayBuffer(count); // 9000 bytes of shared memory
let sab_depths = [
  ['Int8Array', new Int8Array(sab)],      // 1 byte per element  -> length 9000
  ['Int16Array', new Int16Array(sab)],    // 2 bytes per element -> length 4500
  ['Int32Array', new Int32Array(sab)],    // 4 bytes per element -> length 2250
  ['Float64Array', new Float64Array(sab)] // 8 bytes per element -> length 1125
];
console.log(`Count: ${count}`);
sab_depths.forEach(([depth, array]) => console.log(`${depth}: ${array.length}`));
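If the goal is for the view to hold count elements rather than count bytes, a minimal sketch (assuming a single Int16Array view over the buffer) is to size the buffer from the element count using the constructor's BYTES_PER_ELEMENT property:
// Sketch: allocate enough bytes for `count` 16-bit elements
// (assumes the buffer is only ever viewed as an Int16Array).
let count = 4500;
let sab = new SharedArrayBuffer(count * Int16Array.BYTES_PER_ELEMENT); // 4500 * 2 = 9000 bytes
let arr = new Int16Array(sab);
console.log(arr.length); // 4500
The same pattern works for the other element types, since Int32Array.BYTES_PER_ELEMENT is 4 and Float64Array.BYTES_PER_ELEMENT is 8.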
Upvotes: 2