Skrishna

Reputation: 41

Why is there a difference in execution time while running the same program multiple times?

Probably a dumb question. I'm noticing a difference in execution time when running a simple Hello World program in C on a Linux machine (it isn't language-specific, though).

Program:

#include<stdio.h>
#include<time.h>

int main()
{
    clock_t begin, end;
    double time_spent;

    begin = clock();

    printf("%s", "Hello World\n");
    end = clock();
    time_spent = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("%f\n", time_spent);
    return 0;
}

Output:

$ ./hello 
Hello World
0.000061
$ ./hello 
Hello World
0.000057
$ ./hello 
Hello World
0.000099 

This was tested on a quad-core machine with a load average of 0.4 and plenty of free memory. Even though the difference is quite small, what could be the reason behind it?

Upvotes: 4

Views: 1500

Answers (3)

user3344003

Reputation: 21607

Two major causes:

  1. Disk caching: Once your executable has been loaded the first time, it may remain in memory, so subsequent runs do not require a fetch from disk.

  2. System activity: Whatever else your system is doing at the same time is consuming resources (memory, CPU, disk access).

Upvotes: 0

John Burger

Reputation: 3672

The easy answer is: what is happening in the rest of the system.

There are all of these background processes that do 'stuff': processing network packets, saving or logging data to disk, deciding to wake up and check the current network time, who knows! For a time interval as short as yours, those tiny things can make a large difference. Try running the loop 1,000 times and checking those results. Of course, output to the screen involves graphics, updates, other programs... so maybe you should just do a:

unsigned i, j;
...
// Wait a LONG time!
for (i = 0; i < 5u; ++i) {      // 5 is about a minute on my machine
    for (j = 0; j < ~0u; ++j) {
        // Twiddle thumbs!
    } // for
} // for

inside your timing.
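
For reference, a minimal sketch of how such a loop might sit inside the timing code from the question; the volatile qualifier on the inner counter is my addition, to keep an optimizing compiler from removing the otherwise empty loop:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin, end;
    unsigned i;
    volatile unsigned j;   /* volatile so the compiler keeps the empty loop */

    begin = clock();
    for (i = 0; i < 5u; ++i) {
        for (j = 0; j < ~0u; ++j) {
            /* Twiddle thumbs! */
        }
    }
    end = clock();

    printf("%f\n", (double)(end - begin) / CLOCKS_PER_SEC);
    return 0;
}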

Upvotes: 1

dbush

Reputation: 223689

Unless you're running a real-time operating system, you're going to see at least a slight variation in run times. This is due to OS scheduling, any I/O that might be happening at around that time, etc.

A difference of 0.04 ms is not a big difference at all.

If your program runs in a loop for at least several seconds, the percentage of variation should be reduced.
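
As a rough sketch of that idea (the iteration count of 100000 is just an illustrative value, and the timing is written to stderr so it isn't buried in the program's output):

#include <stdio.h>
#include <time.h>

#define ITERATIONS 100000L   /* illustrative; pick something that runs for seconds */

int main(void)
{
    clock_t begin, end;
    double total;
    long i;

    begin = clock();
    for (i = 0; i < ITERATIONS; ++i) {
        printf("%s", "Hello World\n");
    }
    end = clock();

    /* Timing the whole batch makes scheduling noise a smaller fraction of the total */
    total = (double)(end - begin) / CLOCKS_PER_SEC;
    fprintf(stderr, "total: %f s, per iteration: %f s\n", total, total / ITERATIONS);
    return 0;
}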

Upvotes: 6
