Reputation: 2633
I have two arrays like this:
const a = ['a', 'b', 'c', 'd'];
const b = ['1', '2', '3', '4'];
I'm trying to make a new array like this:
const c = ['a1', 'b2', 'c3', 'd4'];
I tried it this way:
const c = [];
c.push([`${a[0]}${b[0]}`, `${a[1]}${b[1]}`, `${a[2]}${b[2]}`, `${a[3]}${b[3]}`]);
When I actually loop through my data and do this, it takes 17400ms.
I took out the c.push([........]);
and it dropped to 1250ms.
Why does this take so long to do?
And what is the best way to do this?
Upvotes: 4
Views: 2602
Reputation: 11502
You can use .map to achieve that: map over a, then use the index on each iteration to get the corresponding element of b.
const a = ['a', 'b', 'c', 'd'];
const b = ['1', '2', '3', '4'];
var c = a.map(function (d, i) {
  return d + String(b[i]);
});
console.log(c)
// ["a1", "b2", "c3", "d4"]
Cleaner code using ES6:
var c = a.map((d, i) => `${d}${b[i]}`)
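If you do this in more than one place, the same idea can be wrapped in a tiny helper (the name zipConcat below is just for illustration, not a standard function):
const zipConcat = (xs, ys) => xs.map((x, i) => `${x}${ys[i]}`);
console.log(zipConcat(a, b)); // ["a1", "b2", "c3", "d4"]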
Upvotes: 3
Reputation: 287990
As I suspected and you confirmed, the real problem is that you call push too many times.
push modifies the length of the array. The specification does not enforce any particular data structure for arrays, but for non-sparse ones, implementations usually use lists that store the values consecutively in memory. That's problematic when you change the length of the array, because the additional data may not fit in the place in memory where the data currently sits, so all the data must be moved. push therefore ends up being amortized constant time instead of just constant time.
However, if you know the length of the resulting array beforehand, it's usually better to preallocate.
Then, instead of
var array = [];
for (var i = 0; i < 2e4; ++i)
  array.push([a[0] + b[0], a[1] + b[1], a[2] + b[2], a[3] + b[3]]);
I would use
var array = new Array(2e4);
for (var i = 0; i < 2e4; ++i)
  array[i] = [a[0] + b[0], a[1] + b[1], a[2] + b[2], a[3] + b[3]];
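If you want to verify the difference yourself, a rough measurement with console.time looks something like this (a sketch only; absolute numbers depend on the engine and machine, so compare them relative to each other):
const a = ['a', 'b', 'c', 'd'];
const b = ['1', '2', '3', '4'];

console.time('push');
var pushed = [];
for (var i = 0; i < 2e4; ++i)
  pushed.push([a[0] + b[0], a[1] + b[1], a[2] + b[2], a[3] + b[3]]);
console.timeEnd('push');

console.time('preallocated');
var preallocated = new Array(2e4);
for (var j = 0; j < 2e4; ++j)
  preallocated[j] = [a[0] + b[0], a[1] + b[1], a[2] + b[2], a[3] + b[3]];
console.timeEnd('preallocated');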
Upvotes: 1
Reputation: 18555
A simple loop.
const a = ['a', 'b', 'c', 'd'];
const b = ['1', '2', '3', '4'];
var result = [];
for (var i = 0; i < a.length; i++) {
  result[i] = a[i] + b[i];
}
console.log(result); // ["a1", "b2", "c3", "d4"]
Upvotes: 2