Juan Garcia

Reputation: 855

Why is the second operation faster than the first?

I'm trying to see the performance difference between `for` and `reduce` over a numeric array, and I can see that the second function I measure (whichever it is, `for` or `reduce`) is always faster than the first. I'm guessing it is something related to data caching or the thread pool size Node uses. This is the code:

process.env.UV_THREADPOOL_SIZE = 1;

let array = [
  1, 23, 4, 5, 6, 7, 8, 7, 65, 4, 3, 23, 43, 2, 23, 32, 23, 23,
  234, 243, 423, 432, 43, 23, 2, 23, 2, 23,
];

let sum = 0;
console.time('reduce');
sum = array.reduce((s, p) => (s += p), 0);
console.timeEnd('reduce');

sum = 0;
console.time('for');
for (let i = 0; i < array.length; i++) {
  sum += array[i];
}
console.timeEnd('for');

And this code shows different results:

process.env.UV_THREADPOOL_SIZE = 1;

let array = [
  1, 23, 4, 5, 6, 7, 8, 7, 65, 4, 3, 23, 43, 2, 23, 32, 23, 23,
  234, 243, 423, 432, 43, 23, 2, 23, 2, 23,
];

let sum = 0;
console.time('for');
for (let i = 0; i < array.length; i++) {
  sum += array[i];
}
console.timeEnd('for');

sum = 0;
console.time('reduce');
sum = array.reduce((s, p) => (s += p), 0);
console.timeEnd('reduce');

I mean, if you reverse the order of execution, the measured results are different.

For the test I'm using Node v11.11.0.

Any idea about it?

EDIT: I'm not looking for an explanation of why `reduce` is faster than `for` or anything like that. I want to know why Node.js produces these results for this sequence of operations.

Upvotes: 2

Views: 384

Answers (5)

F.bernal

Reputation: 2694

After testing both, there is no great time difference between them.

In the link, you can test it modifying the number of executions.

https://repl.it/@statefull/TrustworthyDefiniteArraylist

After some tests, the problem turns out to be in the console.time function.

See this:

https://repl.it/@statefull/WrathfulCostlyIrc

The first time console.time is called, it takes more time; in the example it is compared against Date.now on each execution.

Some more tests reveal that the time reported by the first console.timeEnd is not the real one until console.timeEnd has been called at least once.

See: https://repl.it/@statefull/SoggyLimegreenUser
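One way to sidestep any first-call overhead in console.time is to measure with a high-resolution timer directly. A minimal sketch (not from the answer; it assumes Node >= 10.7, where `process.hrtime.bigint()` is available, so it works on the asker's v11.11.0):

```javascript
// Time both approaches with process.hrtime.bigint() instead of
// console.time, so the measurement does not depend on console.time's
// internal bookkeeping.
const array = [1, 23, 4, 5, 6, 7, 8, 7, 65, 4, 3, 23, 43, 2];

function timeNs(fn) {
  const start = process.hrtime.bigint();
  fn();
  return process.hrtime.bigint() - start; // elapsed nanoseconds (BigInt)
}

const reduceNs = timeNs(() => array.reduce((s, p) => s + p, 0));
const forNs = timeNs(() => {
  let sum = 0;
  for (let i = 0; i < array.length; i++) sum += array[i];
});

console.log(`reduce: ${reduceNs} ns, for: ${forNs} ns`);
```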

Upvotes: 1

David

Reputation: 10708

This is a common occurrence in scripted or JIT-compiled languages, and it has to do with compiler overhead slowing down your operation, but only the first time. After that first time you're calling compiled code rather than compiling a script, though this depends on how the execution engine is implemented. This is why benchmarking generally demands that you do something not once but many (ideally thousands of) times.
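A minimal sketch of this warm-up idea (the function name and iteration counts are illustrative, not from the answer):

```javascript
// Warm-up sketch: run the hot function many times before measuring,
// so the engine has already compiled and optimized it.
const data = Array.from({ length: 1000 }, (_, i) => i);

function sumFor(arr) {
  let sum = 0;
  for (let i = 0; i < arr.length; i++) sum += arr[i];
  return sum;
}

// Warm-up phase: not measured, exists only to trigger JIT optimization.
for (let i = 0; i < 100000; i++) sumFor(data);

// Measured phase: the code being timed is now optimized, not cold.
const start = process.hrtime.bigint();
for (let i = 0; i < 100000; i++) sumFor(data);
const elapsedNs = process.hrtime.bigint() - start;
console.log(`100000 warmed-up runs took ${elapsedNs} ns`);
```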

Upvotes: 0

Jonas Wilms

Reputation: 138557

I mean, if you reverse the order of execution, the measured results are different.

That means your test is somehow flawed, or the results are so random that you can't make a judgment based on them.

Run the test more often (a few thousand times), then take the average time. That averages out the influence of other code (you are running on a multithreaded machine) and forces the engine to apply its most powerful optimizations.

Before that, no judgment can be made about whether one of them is faster. The result will most likely be: it does not matter, both are fast enough.
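The "run it a few thousand times and average" advice could be sketched like this (`benchmark` is a hypothetical helper, not an API from the answer):

```javascript
// Run fn many times and report the average duration per run, which
// smooths out one-off effects like JIT compilation and OS scheduling.
function benchmark(fn, runs = 5000) {
  let totalNs = 0n;
  for (let i = 0; i < runs; i++) {
    const start = process.hrtime.bigint();
    fn();
    totalNs += process.hrtime.bigint() - start;
  }
  return Number(totalNs) / runs; // average nanoseconds per run
}

const array = [1, 23, 4, 5, 6, 7, 8, 7, 65, 4, 3, 23, 43, 2];

const avgReduce = benchmark(() => array.reduce((s, p) => s + p, 0));
const avgFor = benchmark(() => {
  let sum = 0;
  for (let i = 0; i < array.length; i++) sum += array[i];
  return sum;
});

console.log(`reduce: ${avgReduce} ns/run, for: ${avgFor} ns/run`);
```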

Worth reading: Which is faster? - Eric Lippert

Upvotes: 2

colby brooks

Reputation: 37

Map/Reduce/Filter/Find are slower because they take callback functions; those extra function calls add overhead.

Upvotes: -2

Bradd

Reputation: 196

You add a function call to the stack for each iteration of reduce (the `(s, p) => ...` callback is a new function invocation every time). These function calls add a little extra time to the overall process.

As long as time constraints aren't very strict or the arrays very large, the increase to readability is generally worth it.
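The per-iteration callback the answer describes is easy to observe by counting invocations (a small illustrative sketch, not from the answer):

```javascript
// With an initial value, reduce invokes its callback once per element;
// each of those calls is the overhead a plain for loop avoids.
let calls = 0;
const nums = [1, 2, 3, 4];
const total = nums.reduce((s, p) => {
  calls += 1; // count every callback invocation
  return s + p;
}, 0);
console.log(total, calls); // → 10 4
```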

Upvotes: 0
