Reputation: 153
I found several articles on the Internet and questions here about how to measure code performance in Node, but I got results that differ by roughly a factor of two depending on the measurement tool. The target function is:
let arr = [75, 283, 361, 168, 23, 489, 421, 95, 72, 380, 384, 470, 235, 465, 276, 26, 364, 416, 373, 184, 211, 239, 485, 18, 19, 252, 447, 6, 291, 324, 497, 352,
458, 201, 238, 116, 333, 163, 207, 417, 340, 431, 5, 269, 258, 178, 182, 295, 257, 434, 37, 372, 154, 223, 313, 80, 71, 229, 379, 181, 396, 281, 491, 58, 254,
359, 79, 175, 143, 214, 217, 148, 393, 246, 34, 166, 251, 381, 413, 180, 338,
442, 494, 378, 123, 118, 395, 446, 459, 472, 457, 51, 127, 351, 389, 157, 260,
370, 405, 346]
const { performance } = require("perf_hooks")
function summBrute(arr, k) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] + arr[j] == k) {
        return true;
      }
    }
  }
  return false;
}
And the measurement methods are:
console.time('summBrute')
summBrute(arr, 394)
console.timeEnd('summBrute')

var t0 = performance.now()
summBrute(arr, 394)
var t1 = performance.now()
console.log("Call to summBrute took " + (t1 - t0) + " milliseconds.")
Here I want to find out whether the array contains two numbers that add up to the second argument of the function. I use these two methods independently of each other, just commenting out the corresponding section of the code. console.time() gives an average of about 0.300 ms, while performance.now() gives about 0.170 ms.
Please help me understand why these two particular methods give results that differ by almost a factor of two. I am using Node v15 on a Core i5 CPU with 8 GB RAM, running Windows 10.
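For completeness, here is a minimal sketch of a third timing option that I did not use for the measurements above, process.hrtime.bigint() (nanosecond resolution); the conversion to milliseconds is my own:
// Sketch only: timing the same call with process.hrtime.bigint(),
// which returns the current high-resolution time in nanoseconds as a BigInt.
const startNs = process.hrtime.bigint()
summBrute(arr, 394)
const endNs = process.hrtime.bigint()
// Convert the nanosecond difference to milliseconds for comparison.
console.log("Call to summBrute took " + Number(endNs - startNs) / 1e6 + " milliseconds.")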
Upvotes: 13
Views: 15135
Reputation: 13195
If you put the measurements in a small loop, you will see that neither method is even consistent with itself.
for (let i = 0; i < 5; i++) {
  console.time('summBrute')
  summBrute(arr, 394)
  console.timeEnd('summBrute')

  var t0 = performance.now()
  summBrute(arr, 394)
  var t1 = performance.now()
  console.log("Call to summBrute took " + (t1 - t0) + " milliseconds.")
}
Produces
summBrute: 0.234ms
Call to summBrute took 0.1938999891281128 milliseconds.
summBrute: 0.171ms
Call to summBrute took 0.13520002365112305 milliseconds.
summBrute: 0.107ms
Call to summBrute took 0.1332000494003296 milliseconds.
summBrute: 0.095ms
Call to summBrute took 0.10259997844696045 milliseconds.
summBrute: 1.385ms
Call to summBrute took 0.10839998722076416 milliseconds.
These are almost random numbers, useless on their own. However, if I call summBrute() 10 million times:
for (let i = 0; i < 5; i++) {
  console.time('summBrute')
  for (let j = 0; j < 10000000; j++)
    summBrute(arr, 394)
  console.timeEnd('summBrute')

  var t0 = performance.now()
  for (let j = 0; j < 10000000; j++)
    summBrute(arr, 394)
  var t1 = performance.now()
  console.log("Call to summBrute took " + (t1 - t0) + " milliseconds.")
}
The results stop depending on the mood of my computer and become much more consistent:
summBrute: 16.369s
Call to summBrute took 16184.267899990082 milliseconds.
summBrute: 15.643s
Call to summBrute took 15852.86260008812 milliseconds.
summBrute: 16.355s
Call to summBrute took 15942.392500042915 milliseconds.
summBrute: 16.190s
Call to summBrute took 16314.965299963951 milliseconds.
summBrute: 16.523s
Call to summBrute took 16744.983800053596 milliseconds.
There is still a spread of almost half a second, but it is clearly visible that 10 million calls take around 16 seconds on average, so one call takes around 0.0016 ms, which is nothing like what any of the individual calls measured.
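The same idea as a small reusable helper, just a sketch; the helper name, the arrow-function wrapper and the iteration count are my own choices, nothing standard:
const { performance } = require("perf_hooks")

// Sketch: run fn many times and return the average cost per call.
function benchmark(fn, iterations) {
  const t0 = performance.now()
  for (let i = 0; i < iterations; i++) {
    fn()
  }
  const t1 = performance.now()
  return (t1 - t0) / iterations // average milliseconds per call
}

console.log("Average per call: " + benchmark(() => summBrute(arr, 394), 10000000) + " ms")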
TL;DR: time is time; it matters far more what you measure than which timer API you pick.
Upvotes: 17