Reputation: 85792
PHPBench.com runs quick benchmark scripts on each pageload. On the foreach test, when I load it, foreach takes anywhere from 4 to 10 times as long to run as the third example.
Why is it that a native language construct is apparently slower than performing the logic oneself?
Upvotes: 3
Views: 12618
Reputation: 13427
I'm sorry, but the website got it wrong. Here's my own script that shows the two are almost the same in speed, and in fact, foreach is faster!
<?php
function start() {
    global $aHash;

    // Initial configuration: build a 10,000-character string
    $i = 0;
    $tmp = '';
    while ($i < 10000) {
        $tmp .= 'a';
        ++$i;
    }
    $aHash = array_fill(100000000000000000000000, 100, $tmp);
    unset($i, $tmp);
    reset($aHash);
}

/* The Test: repopulate $aHash before every pass */
$t = microtime(true);
for ($x = 0; $x < 500; $x++) {
    start();
    $key  = array_keys($aHash);
    $size = sizeof($key);
    for ($i = 0; $i < $size; $i++) {
        $aHash[$key[$i]] .= "a";
    }
}
print (microtime(true) - $t);
print ('<br/>');

$t = microtime(true);
for ($x = 0; $x < 500; $x++) {
    start();
    foreach ($aHash as $key => $val) {
        $aHash[$key] .= "a";
    }
}
print (microtime(true) - $t);
?>
If you look at the source code of the tests: http://www.phpbench.com/source/test2/1/ and http://www.phpbench.com/source/test2/3/ , you can see that $aHash isn't repopulated with the initial data after each iteration. It is created once at the beginning, then each test is run X times. In other words, every iteration works with an ever-growing $aHash... in pseudocode:
iteration 1: $aHash[10000000000000]=='aaaaaa....10000 times...a';
iteration 2: $aHash[10000000000000]=='aaaaaa....10001 times...a';
iteration 3: $aHash[10000000000000]=='aaaaaa....10002 times...a';
Over time, the data for all the tests grows with every iteration, so of course by iteration 100 the array_keys method is faster: it always works with the same list of keys, whereas the foreach loop has to contend with an ever-growing data set and copy its values on every pass!
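You can see the flaw in miniature with a short sketch (the array size and pass count here are made up for illustration): without repopulating the array between passes, each pass appends to the previous pass's strings, so every pass operates on longer data than the one before.

```php
<?php
// Sketch of the flaw: no repopulation between passes, so the
// strings keep growing from one pass to the next.
$aHash = array_fill(0, 3, 'a');          // three 1-character strings
for ($pass = 1; $pass <= 3; $pass++) {
    foreach ($aHash as $key => $val) {
        $aHash[$key] .= 'a';             // every value grows by one 'a'
    }
    echo strlen($aHash[0]), ' ';         // prints 2 3 4: ever-longer data
}
```

Calling something like start() before each pass, as in the script above, is what resets the data and makes the comparison fair.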
If you run my code provided above on your server, you'll see clearly that foreach is faster AND neater AND clearer.
If the author of the site intended his test to do what it does, then it certainly is not made clear; otherwise, it's simply an invalid test.
Upvotes: 3
Reputation: 31813
Benchmark results for such micro measurements, coming from a live, busy webserver that is subject to extreme amounts of varying load and other influences, should be disregarded. This is not an environment to benchmark in.
Upvotes: 0
Reputation: 401002
Maybe it has to do with the fact that foreach works on a copy of the array?
Or maybe it has to do with the fact that, when looping with foreach, on each iteration the internal array pointer is changed to point to the next element?
Quoting the relevant portion of foreach's manual page:
Note: Unless the array is referenced, foreach operates on a copy of the specified array and not the array itself. foreach has some side effects on the array pointer.
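That copy-on-iterate behaviour is easy to observe. In this sketch (illustrative values), the array is modified from inside the loop, yet $val keeps coming from the unmodified copy:

```php
<?php
// foreach by value iterates over a copy: writes to $arr inside the
// loop do not change the values that $val receives.
$arr = array(1, 2, 3);
foreach ($arr as $key => $val) {
    $arr[$key] = $val * 10; // modifies the original array...
    echo $val, ' ';         // ...but prints 1 2 3, from the copy
}
print_r($arr);              // the original now holds 10, 20, 30
```

Iterating with foreach ($arr as $key => &$val) instead makes the loop operate on the array itself, per the "unless the array is referenced" caveat in the quote above.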
(I would also say that this kind of micro-optimization will not matter at all in a real application -- but I guess you already know that, and just asked out of curiosity)
There is also one thing that doesn't feel right in this test: it only runs each snippet once. For a "better" test, it would be useful to run each of them many times; with timings on the order of 100 microseconds, it doesn't take much to make a huge difference.
(Considering the first test varies between 300% and 500% over a few refreshes...)
For reference, here is the first test, the foreach one:
foreach ($aHash as $key => $val) {
    $aHash[$key] .= "a";
}
And the third one (100%):
$key = array_keys($aHash);
$size = sizeof($key);
for ($i = 0; $i < $size; $i++) {
    $aHash[$key[$i]] .= "a";
}
Upvotes: 9