Reputation: 9854
What is an accurate way to measure the performance of a JavaScript engine like V8 or SpiderMonkey? It should at least produce low deviation from one run to the next, and ideally allow ranking different JavaScript engines across different operating systems and hardware configurations.
My first attempt was the code below, which I loaded in several web browsers inside an otherwise empty web page. I also tried executing it in Google Chrome's JavaScript console, and the results came out very different, as you'll see:
// Arithmetic mean of an array of numbers.
var mean = function (distr) {
    var sum = 0;
    for (var i = 0; i < distr.length; i++) {
        sum += distr[i];
    }
    return sum / distr.length;
};

// Population standard deviation of an array, given its mean.
var stdev = function (distr, mean) {
    var diffsquares = 0;
    for (var i = 0; i < distr.length; i++) {
        diffsquares += Math.pow(distr[i] - mean, 2);
    }
    return Math.sqrt(diffsquares / distr.length);
};
var OPs = 1000000;
var results = [];
for (var t = 0; t < 60; t++) {
    var start = (new Date()).getTime();
    // Start at 0.5 to keep the counter a floating-point value;
    // each pass performs two increments (one in the body, one in the update).
    for (var i = 0.5; i < OPs; i++) {
        i++;
    }
    var end = (new Date()).getTime();
    var took = end - start; // elapsed time in milliseconds
    var FLOPS = OPs / (took / 1000); // ops per second, not per millisecond
    results.push(FLOPS);
}
var average = mean(results);
var deviation = stdev(results, average);
console.log('Average: ' + average + ' FLOPS. Standard deviation: ' + deviation + ' FLOPS');
And it replied:
NodeJS 0.5.0
Chrome 13.0.782.112 (From the Console (Ctrl+Shift+J))
Chrome 13.0.782.112 (as a webpage)
Firefox 6.0
Opera 11.50
Something strange happened: the benchmark in Chrome's console took far longer than in the other browsers and in NodeJS, something like 30 seconds in Chrome versus 2 in the others. The standard deviations in the Chrome console were also very small compared to the others. Why is there such a huge difference between executing the code in the console and executing it in a web page?
If this is all too stupid, let me remind you that I "learned" JavaScript (and how to code in general) by myself and not very long ago, so I suck at a lot of things.
What is a good measure of this? I'd like to focus on the speed of math operations, not other things like regex speed. What do you recommend? I also tried generating 10x10 matrices of floating-point numbers and multiplying them lots of times; the result comes out at 7, 8 or 9 MFLOPS each time, but mostly 7 on Chrome. If that approach isn't completely stupid and someone wants the code, I'm happy to pastebin it.
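Roughly, the matrix test looks like this. This is a simplified sketch rather than my exact code, and the repetition count is just a placeholder; it counts two floating-point operations (one multiply, one add) per inner-loop step:

// Build an n x n matrix filled with random floats.
function randomMatrix(n) {
    var m = [];
    for (var i = 0; i < n; i++) {
        var row = [];
        for (var j = 0; j < n; j++) {
            row.push(Math.random());
        }
        m.push(row);
    }
    return m;
}

// Standard triple-loop matrix multiplication: c = a * b.
function multiply(a, b, n) {
    var c = [];
    for (var i = 0; i < n; i++) {
        var row = [];
        for (var j = 0; j < n; j++) {
            var sum = 0;
            for (var k = 0; k < n; k++) {
                sum += a[i][k] * b[k][j]; // one multiply and one add per step
            }
            row.push(sum);
        }
        c.push(row);
    }
    return c;
}

var n = 10;
var reps = 10000; // placeholder; pick something that runs for at least a second
var a = randomMatrix(n);
var b = randomMatrix(n);
var start = (new Date()).getTime();
for (var r = 0; r < reps; r++) {
    multiply(a, b, n);
}
var ms = (new Date()).getTime() - start;
// Each multiplication does roughly 2*n^3 floating-point operations.
var mflops = (2 * n * n * n * reps) / (ms / 1000) / 1e6;
console.log(mflops + ' MFLOPS');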
Upvotes: 4
Views: 1297
Reputation: 35064
The Chrome console has a "weird" execution environment that isn't quite the web page itself, and I would expect it to incur some performance cost because of that. That's certainly true for the console in Firefox.
To answer your original question... it really depends on what you want to measure. Different JS engines are good at different things, so depending on the test program you could have Chrome being 5x faster than Firefox, say, or vice versa.
Also, the optimizations browser JITs do can be very heavily dependent on the overall code flow, so the time it takes to do operation A followed by operation B is in general not the same as the sum of the times needed to do A and B separately (it can be much larger, or it can be smaller). As a result, benchmarking anything other than the code you actually want to run is of very limited utility. Running any single piece of code is nearly useless for "ranking web browsers according to performance".
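As a rough illustration of why one-shot timings can mislead, compare a cold run against runs taken after the JIT has had a chance to optimize the hot function. This is just a sketch; work() and the repetition counts are arbitrary stand-ins:

// An arbitrary numeric workload to time.
function work() {
    var sum = 0;
    for (var i = 0; i < 1000000; i++) {
        sum += Math.sqrt(i);
    }
    return sum;
}

// Time a single call in milliseconds.
function time(fn) {
    var start = (new Date()).getTime();
    fn();
    return (new Date()).getTime() - start;
}

// First timed call: may include parsing and JIT warm-up costs.
console.log('cold: ' + time(work) + ' ms');

// Let the engine observe and optimize the hot loop, then time again.
for (var w = 0; w < 50; w++) {
    work();
}
console.log('warm: ' + time(work) + ' ms');

On most engines the warm run will be noticeably faster, and it's the warm number that a benchmark usually ends up measuring, whether or not that's what you actually care about.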
Upvotes: 0
Reputation: 8472
JS performance optimization is a huge area in general, and it's rather ambitious to start from scratch.
If I were you, I'd take a look at some of the existing projects in this space.
Upvotes: 7