Reputation: 1
I'm doing some self-study for a difficult exam (difficult for me, at least), and I can't grasp the concept of the execution time of algorithms using the T(n) function.
For example:
i = 1;             // cost c1, executed 1 time
sum = 0;           // cost c2, executed 1 time
while (i <= n) {   // cost c3, executed n+1 times
    i = i + 1;     // cost c4, executed n times
    sum = sum + i; // cost c5, executed n times
}
Cost calculation:
Total cost = c1 + c2 + (n+1)·c3 + n·c4 + n·c5
           = (c3 + c4 + c5)·n + (c1 + c2 + c3)
so T(n) has the form T(n) = a·n + b for some constants a and b.
Is finding the total cost enough?
Please bear with me, I'm new to this; any resources would also be helpful.
Upvotes: 0
Views: 1061
Reputation: 178491
Finding the exact running time is usually not useful, since it depends on many things, including your compiler optimizations, your hardware architecture, how many other programs are running, your OS, and more.
A simple example: how much time will the following take?
for (int i = 0; i < 16; i++)
    c[i] = a[i] + b[i];
The answer is: it depends. For example, many modern machines allow vector additions in a single instruction, which takes considerably less time than iterating and adding element by element.
For the above reason, we seldom care about the exact theoretical time, and use the big O notation instead, or alternatively compare the running times of algorithms based on their actual performance, using statistical tests.
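For instance, here is a minimal sketch of such a measurement in C (not from the original answer; it uses the standard clock() from <time.h>, and the problem size is an arbitrary assumption):

#include <stdio.h>
#include <time.h>

/* A loop like the one in the question, just so there is something to time. */
static long long sum_up_to(long long n) {
    long long i = 1, sum = 0;
    while (i <= n) {
        i = i + 1;
        sum = sum + i;
    }
    return sum;
}

int main(void) {
    long long n = 100000000LL;              /* arbitrary problem size */
    clock_t start = clock();
    long long result = sum_up_to(n);
    clock_t end = clock();
    double seconds = (double)(end - start) / CLOCKS_PER_SEC;
    printf("sum = %lld, took %.3f seconds\n", result, seconds);
    return 0;
}

Running this several times and comparing the distribution of timings, rather than trusting a single run, is what the statistical comparison mentioned above amounts to, since any single run can be skewed by caching, the scheduler, and so on.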
Under the big O notation, the complexity of your code is O(n), since it iterates over n elements and does a constant-time amount of work for each.
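As a rough illustration of that linear growth (this is not part of the original answer; the ops counter is added purely for demonstration), you can instrument the loop from the question and watch the operation count scale with n:

#include <stdio.h>

int main(void) {
    for (long n = 1000; n <= 8000; n *= 2) {
        long ops = 0;              /* counts executed statements */
        long i = 1, sum = 0;
        ops += 2;                  /* the two initializations (c1, c2) */
        while (i <= n) {
            ops++;                 /* a successful loop test (c3) */
            i = i + 1;   ops++;    /* c4 */
            sum = sum + i; ops++;  /* c5 */
        }
        ops++;                     /* the final, failing loop test (c3) */
        printf("n = %5ld -> %ld operations\n", n, ops);
    }
    return 0;
}

Doubling n roughly doubles the count (here it is exactly 3n + 3), which is the linear behaviour that O(n) describes; the constants themselves depend on the machine and are usually not worth chasing.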
Upvotes: 2