Reputation: 21
I am confused about analyzing the time complexity of a 'for' loop. Are the individual parts of the loop not counted separately? For example:
for (i = 0; i < n; i++) {
    statement
    ...
}
Wouldn't i = 0 execute once, i < n execute (n+1) times, and i++ execute n times? So why is the time complexity of the 'for' loop counted as only (n+1)? Shouldn't the parts be summed, like 1 + (n+1) + n? I'm just confused, since my prof summed them up but the book only counted (n+1).
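To double-check my own counting, I tried instrumenting the loop in C (the counter names are mine, just for checking):

#include <stdio.h>

int main(void) {
    long n = 10;
    long init = 0, test = 0, inc = 0, body = 0; /* how often each part runs */
    long i;

    /* comma expressions bump a counter just before each part of the header runs */
    for (i = (init++, 0); (test++, i < n); inc++, i++) {
        body++; /* stands in for the statement(s) inside the loop */
    }

    /* expect: init = 1, test = n+1, inc = n, body = n */
    printf("init=%ld test=%ld inc=%ld body=%ld\n", init, test, inc, body);
    return 0;
}

For n = 10 this prints init=1 test=11 inc=10 body=10, which is the 1 + (n+1) + n I summed above, plus n for the body.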
Upvotes: 1
Views: 308
Reputation: 9525
What you are asking is exactly what asymptotic notation is all about. Say you have a program like the following:
for (i = 0; i < n; ++i) {
    print("hello")
}
that compiles into a fictional machine language in which each of the numbered items below is a single machine instruction:
1. let i -> 0
2. if (i >= n) goto 6.
3. print "hello"
4. increment i
5. goto 2.
6. ... (continue with the rest of the program)
then for a given n the total number of instructions executed by the loop is 4n + 2: one instruction to initialize i, the test at 2. which runs n + 1 times (once per iteration plus the final check that exits the loop), and instructions 3.–5. which run n times each. Note that the test running n + 1 times is exactly the (n + 1) your book counted.
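You can sanity-check that count with a small C translation of the fictional machine code (the steps counter and the labels are mine, purely illustrative):

#include <stdio.h>

int main(void) {
    long n = 10;
    long steps = 0; /* counts executed "machine instructions" */
    long i;

    steps++; i = 0;               /* 1. let i -> 0            */
top:
    steps++;                      /* 2. if (i >= n) goto done */
    if (i >= n) goto done;
    steps++; printf("hello\n");   /* 3. print "hello"         */
    steps++; i++;                 /* 4. increment i           */
    steps++; goto top;            /* 5. goto 2.               */
done:
    printf("steps=%ld, 4n+2=%ld\n", steps, 4 * n + 2);
    return 0;
}

For n = 10 this prints steps=42, 4n+2=42.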
Big-O et al. is about how a function like f(n) = 4n + 2 behaves as n grows. The definition of Big-O is basically that f(n) is O(g(n)) if there exists a constant k such that, for all sufficiently large n,
f(n) ≤ k g(n).
In this case, if g(n) = n, we can see that for all n ≥ 1 we have 4n + 2 ≤ 6n, so 6 is a valid k and we can say that f(n) is O(n).
This is the point of Big-O: it lets us talk about how functions grow while ignoring additive and multiplicative constants.
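As a quick numeric illustration (my own sketch, not from any book), you can watch the bound f(n) ≤ 6·g(n) hold as n grows:

#include <stdio.h>

int main(void) {
    /* f(n) = 4n + 2 is the instruction count above; g(n) = n and k = 6 */
    for (long n = 1; n <= 1000000; n *= 10) {
        long f = 4 * n + 2;
        printf("n=%-8ld f(n)=%-8ld 6n=%-8ld %s\n",
               n, f, 6 * n, f <= 6 * n ? "f(n) <= 6n" : "bound fails");
    }
    return 0;
}

The ratio f(n)/n settles toward 4, comfortably under k = 6, which is exactly why the additive +2 and the multiplicative 4 both disappear when we write O(n).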
Upvotes: 4