Reputation: 89
I executed a program, and when I calculated the elapsed time I found that it is not constant; it varies within some range. I want to know why this is so.
Upvotes: 2
Views: 166
Reputation: 881653
You often find this sort of behaviour when you measure elapsed times. That's because elapsed time depends on all of the other things that your computer may be doing at the same time.
See for example:
pax> time sleep 1
real 0m1.012s
user 0m0.004s
sys 0m0.000s
pax> time sleep 1
real 0m1.002s
user 0m0.004s
sys 0m0.000s
pax> time sleep 1
real 0m1.007s
user 0m0.004s
sys 0m0.000s
In all those cases, the elapsed (real) time varies, but the actual use of the processor (user+sys) is remarkably consistent.
For timing, you should use the most accurate measurement you can (such as user+sys) to remove external influences. You should also use statistical techniques to get a more accurate picture.
For example, I tend to do twelve runs, throw away the outliers (fastest and slowest), then average the remaining ten.
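A minimal sketch of that approach, assuming bash with awk available, and with ./myprog as a placeholder for whatever program you are timing:

for i in $(seq 1 12); do
    # 'time -p' prints real/user/sys in POSIX format on stderr;
    # the program's own output is discarded so only timings reach awk
    { time -p ./myprog > /dev/null; } 2>&1 |
        awk '/^(user|sys)/ { t += $2 } END { print t }'
done |
    sort -n |
    awk 'NR > 1 && NR < 12 { sum += $1; n++ } END { print sum / n }'

The inner awk sums user+sys for a single run, sort -n orders the twelve results, and the outer awk drops the first and last (fastest and slowest) before averaging the remaining ten.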
Upvotes: 2
Reputation: 85468
Because there are a number of processes running concurrently with your application.
Even if you managed to isolate your application completely, there is no guarantee that the same code will run at the same speed every time. That's why you should rely on averages over multiple runs if you are testing performance (assuming that's what you are doing here).
If you are measuring efficiency, there are more objective/formal ways of defining it:
See: Big-O notation
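As a quick reminder of what that means (the standard definition, not anything specific to this question): f(n) = O(g(n)) says there exist constants c > 0 and n0 such that f(n) <= c*g(n) for all n >= n0. In other words, it bounds how running time grows with input size, independently of the machine-level noise described above.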
Upvotes: 6