Radek Kysely

Reputation: 110

Node.js: 100M elements in 1D vs 2D Array — Unexpected Memory Issue

Let's say I have 100 million random floats I'm trying to save in an array.

At first, I was saving the numbers in a 2D array (basically a matrix), but then I thought maybe I could make the whole thing faster if I stored all 100M elements in a single linear 1D array.

To my surprise, this happened:

10 000 × 10 000 2D Array

Creation and manipulation of the array is quite fast; Node runs without any trouble, even on the default memory limit.
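A minimal sketch of what the 2D version looks like (a reconstruction, since I haven't included my original code):

    // 10,000 × 10,000 matrix of random floats; each row is its own small array.
    const SIZE = 10000;
    const matrix = [];
    for (let i = 0; i < SIZE; i++) {
      const row = [];
      for (let j = 0; j < SIZE; j++) {
        row.push(Math.random());
      }
      matrix.push(row);
    }
    console.log(matrix.length, matrix[0].length); // 10000 10000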

100M-element 1D Array

When trying to generate 100M random floats, Node throws

    FATAL ERROR: invalid array length Allocation failed - JavaScript heap out of memory

even when run with --max_old_space_size=8192.
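Roughly the 1D equivalent (again a reconstruction, not my exact code):

    // One flat array meant to hold all 100,000,000 floats.
    const flat = [];
    for (let i = 0; i < 1e8; i++) {
      flat.push(Math.random()); // aborts partway through with the error above
    }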

Why is that?

It seems very counter-intuitive to me. My guess was that a single long Array object should be much more memory-efficient than storing 10k Arrays inside another Array.

So, yeah, my question is: why is that?

Upvotes: 1

Views: 572

Answers (1)

a p

Reputation: 3208

The exact numbers change between implementations and from version to version, but in general there are hard limits on the size of an individual object. See https://bugs.chromium.org/p/v8/issues/detail?id=3505 for details.

In the first case, you essentially have an array of 10,000 pointers to other arrays, each of which is only 10,000 elements long. Those rows don't need to be contiguous in memory, unlike the second case, where you need an actual block of 100,000,000 elements in a single object. So you hit a (totally arbitrary) per-object cap in the runtime (as the link above points out, this is a facet of V8, not of Node specifically).
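A back-of-the-envelope sketch of why that matters, assuming 8 bytes per double and ignoring per-object overhead (actual V8 representations vary by version):

    const ROWS = 10000, COLS = 10000;
    const BYTES_PER_DOUBLE = 8;

    // 2D: each row is an independent heap object the GC can place anywhere.
    const perRowBytes = COLS * BYTES_PER_DOUBLE;       // ≈ 80 KB per row
    // 1D: a single object needs one contiguous backing store for everything.
    const flatBytes = ROWS * COLS * BYTES_PER_DOUBLE;  // ≈ 800 MB in one object

    console.log(`${perRowBytes / 1024} KB per row vs ${flatBytes / 1024 ** 2} MB flat`);

Ten thousand ~80 KB allocations are easy to satisfy; one ~800 MB allocation trips the per-object cap no matter how large the overall heap is.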

Upvotes: 2
