Reputation: 131
So basically I'm trying to find a memory-efficient way to copy an array. Suppose we have arr1, which contains 500000 elements of value true:
var arr1 = [];
for (var i = 0; i < 500000; i++) {
arr1[i] = true;
}
In Node.js, that occupies about 32.3 MiB (of which roughly 18.7 MiB is what Node itself occupies at startup). Now, obviously, when you make a reference to arr1, no memory will be allocated:
var arr2 = arr1;
Now, when I perform a copy of arr1 into arr2:
var arr2 = arr1.concat();
The process now occupies 36.2 MiB, so the copy adds about 4 MiB.
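(Readings like these can be taken with Node's process.memoryUsage(); a minimal sketch, just to show one way of getting rough before/after numbers:)
// Print the resident set size (RSS) and V8 heap usage in MiB.
// Exact figures vary by Node version and platform.
function logMemory(label) {
    var mem = process.memoryUsage();
    console.log(label + ': rss=' + (mem.rss / 1048576).toFixed(1) +
        ' MiB, heapUsed=' + (mem.heapUsed / 1048576).toFixed(1) + ' MiB');
}

logMemory('baseline');
var arr1 = [];
for (var i = 0; i < 500000; i++) {
    arr1[i] = true;
}
logMemory('after building arr1');
var arr2 = arr1.concat();
logMemory('after concat copy');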
Here's the thing: no matter what I do to empty or wipe the original array, the memory allocated to that array won't get freed or picked up by the garbage collector. Suppose I have:
arr1.length = 0;
delete arr1;
arr1 = undefined;
arr1 = arr2.concat();
After all that, the process now occupies 39.8 MiB.
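(Worth noting: V8 collects garbage lazily, so the process size right after wiping an array doesn't prove the memory leaked. If Node is started with the --expose-gc flag, global.gc() becomes available and a collection can be forced before measuring; a minimal sketch:)
// Run with: node --expose-gc script.js
arr1 = undefined;
if (global.gc) {
    global.gc(); // force a full garbage collection
}
// Only now is the reading meaningful:
console.log((process.memoryUsage().heapUsed / 1048576).toFixed(1) + ' MiB heap used after GC');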
So what is really happening here? Is there some secret reference to the original array that Node (or whatever JS engine is out there) is trying to hide from me? Here's further code:
arr2.length = 0;
delete arr2;
arr2 = undefined;
arr2 = arr1.concat();
Which will simply "empty" arr2 so it can hold a copy of arr1. As you may have figured out, I'm attempting to transfer the array's contents back and forth, but now the process occupies 43.5 MiB. If this were a large array, the memory overhead would be huge. Is there a way to do this with memory efficiency in mind?
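(For comparison: if the goal is only to move the contents back and forth rather than to keep two independent copies alive, swapping the references is a zero-allocation alternative; a minimal sketch:)
// Zero-copy "transfer": swap which variable points at which array.
// No elements are copied, so no extra memory is allocated.
var tmp = arr1;
arr1 = arr2;
arr2 = tmp;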
Upvotes: 1
Views: 385
Reputation: 254916
Your profiling technique is not correct.
I created an array the same way you do and created "a clone" with the same .concat() method as you do, and here are the results:
[heap profiler screenshot not reproduced here]
So as you can see, it's the same array (which takes just ~2.06 MB) retained by 2 references.
The corresponding jsfiddle: http://jsfiddle.net/6o0h0r1j/
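If you'd rather verify this in Node than in the browser, newer versions of Node can write the same kind of snapshot via the built-in v8 module; a sketch (the output file can be loaded in Chrome DevTools' Memory tab):
var v8 = require('v8');

var arr1 = [];
for (var i = 0; i < 500000; i++) {
    arr1[i] = true;
}
var arr2 = arr1.concat();

// Writes a .heapsnapshot file next to the script; open it in
// DevTools to inspect the arrays and their retainers.
var file = v8.writeHeapSnapshot();
console.log('snapshot written to ' + file);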
To summarize: your assumptions are wrong from the very beginning.
Upvotes: 1