Reputation: 43
For code in a similar form to this:
for(int i = 0; i < n; i+=2){
(Code that executes in constant time)
}
I have heard the running time for this should be O(N). But since the loop executes n/2 times, shouldn't it be O(N/2)? Can anyone explain why increasing i by two each time wouldn't also decrease the running time by a factor of 2?
Upvotes: 3
Views: 175
Reputation: 564
Constants can often be factored out. The growth in N is what really eats up processing time, so constants aren't that big a deal; besides, a Big O time like this is an approximation and cannot really be exact, since the actual time depends on many more factors. It's the same reason that the initialization "int i = 0" in your for loop, which runs exactly once, doesn't give a Big O time of:
O(N+1)
So, for complexity purposes, O(N) is the same as O(N/2).
That said, it can be a mistake to ignore constants entirely in practice, since huge constants can really impact the run time. O(N) and O(N/2) denote the same complexity class, yet a constant factor can still dominate the actual running time (consider O(10^12 * N): the growth is still linear, but does the huge constant matter now?).
Upvotes: 0
Reputation: 955
If we go back to the definition of Big O notation, it states that f(x) = O(g(x))
if and only if |f(x)| <= C*|g(x)| for all x beyond some threshold,
where C is a constant. The constant C can be made as large as needed, and in your case C = 2 suffices. Constants and lower-order terms are not written in Big O notation because, by this definition, they can always be absorbed into C: for large inputs the highest-order term dominates them.
For example, N/c1 + c2
(c1 and c2 being positive constants) is O(N),
since N/c1 + c2 <= C*N for all N >= 1 if we take C = 1/c1 + c2.
Another example: if we take N^2 + N,
we can ignore the lower-order term and say that the complexity is O(N^2),
because we can take the constant C to be 2: |N^2 + N| <= |N^2 + N^2| = 2|N^2| for all N >= 1.
We can also say that N/2 is O(N^2);
however, that is not a tight upper bound. In complexity analysis we always strive to find the tightest bound, and since O(N)
is a much tighter upper bound for N/2, it is the one we normally use for single-variable, first-degree functions.
Upvotes: 4
Reputation: 2646
When dealing with time complexity, numerical constants are ignored. The reason for this is that if you look at the long-run behaviour of N and (1/2)N, the constant does not radically change the result, so the complexity is simply reduced to O(N).
So technically the work is reduced by a factor of two, but that reduction does not change the growth rate, and therefore the run time remains O(N).
Just to provide a picture example: plotted together, the lines for N and N/2 are basically the same in the long run, whereas a curve such as N*log(N) does matter by contrast, since in the long run its value grows far beyond the other two.
Please note: this answer is merely a reinforcement of why Big O notation ignores constants; for a precise definition, refer to @hrv's answer above.
Upvotes: 1
Reputation: 729
So, let me try to explain why that is:
Your piece of code :
for(int i = 0; i < n; i+=2){
(Code that executes in constant time)
}
It actually depends on the underlying hardware, but let us assume that each assignment, comparison, and arithmetic operation takes one unit of time.
So,
int i = 0
This executes only once. Time: 1 unit
i<n
This executes n/2 times. Time: n/2 units
i=i+2
Here there is an arithmetic operation as well as an assignment operation, both of which execute n/2 times.
Time: n/2 + n/2 = n units
At this point I am assuming there is nothing inside the for loop.
So the total number of time units required to run this loop is 1 + n/2 + n = 1 + (3n/2).
So, for small n, 1 + 3n/2 behaves essentially like n: the difference amounts to a fraction of a second at most on a modern processor.
For large n (in the millions or billions), however, the coefficient shows: 1 + 3n/2 is always greater than n, by roughly 50%,
i.e. for large test data every coefficient definitely has its importance and can significantly affect the total execution time of the respective piece of code, even though both counts are O(n).
Hope it helps.
Upvotes: 0
Reputation: 374
Big O notation does not specify how long a function takes to run. It is only an indication of how the function's completion time changes with an increase/decrease in values. O(N) indicates a linear growth in time; likewise, O(N/2) also indicates the exact same linear change. When writing the time complexity of code, you can ignore any coefficients, as these do not convey any additional meaning.
Upvotes: 2