Reputation: 11
If N*C*(log N + N) represents the computational time steps of Algorithm 1 and N*C*(log N + N*C) represents the computational time steps of Algorithm 2, then is it correct to say that both have computational complexity of O(N^2)?
*Where C is a constant value
Upvotes: 1
Views: 507
Reputation: 751
Yes, that is correct. f ∈ O(g) (Landau notation) means that f grows no faster than g, up to a constant factor. Since both of your expressions grow no faster than n^2, your conclusion is correct.
Regarding the constants: stating that the complexity is O(n^2) covers everything from n log(n) up to n^2 itself, and that is exactly where the constants get neglected. So your algorithm a can be far better in practice than algorithm b and still sit in the same Landau class, because Big-O only gives an upper bound.
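To make the "upper bound" point concrete, here is a minimal sketch (my own illustration, with an arbitrary constant C = 5 standing in for the unspecified constant in the question): if f is O(n^2), then f(n)/n^2 stays bounded by a constant as n grows.

```python
import math

C = 5  # arbitrary stand-in for the constant C from the question

def f1(n):
    # Algorithm 1: N*C*(log N + N)
    return n * C * (math.log2(n) + n)

def f2(n):
    # Algorithm 2: N*C*(log N + N*C)
    return n * C * (math.log2(n) + n * C)

# If f is O(n^2), the ratio f(n)/n^2 stays bounded as n grows.
for n in [10, 1_000, 100_000]:
    print(n, f1(n) / n**2, f2(n) / n**2)
# f1(n)/n^2 settles toward C and f2(n)/n^2 toward C^2: both ratios are
# bounded by a constant, so both expressions are O(n^2).
```

The constant the ratio settles toward differs between the two algorithms, which is precisely the information Big-O discards.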
Upvotes: 1
Reputation: 859
One thing to remember: the constant C can play an important role in practice.
Say an algorithm makes 32 passes over an array. The complexity of the algorithm is 32*n = C*n = O(n).
Now run it on an array of size 30: that is 32*30 steps, which is roughly n^2 operations for that input size.
We generally do Big-O analysis for big input sets, really big ones, because that is where it makes sense and constant factors stop mattering.
Coming to your question: both run in O(N^2) asymptotically for any constant C. Whether they perform comparably in practice depends on the constants, and therefore on the algorithms.
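The small-input effect above can be sketched quickly (my own illustration; I use n = 32 rather than 30 so the equality with n^2 is exact):

```python
def passes_cost(n, c=32):
    # Hypothetical algorithm making c passes over an array of size n:
    # cost = c*n, which is O(n) for constant c.
    return c * n

# For small inputs the constant dominates: at n = 32 the "linear"
# algorithm costs exactly as many steps as n^2 operations.
n = 32
print(passes_cost(n), n**2)  # 1024 1024
```

Only for n well beyond the constant does the O(n) algorithm actually pull ahead of an O(n^2) one.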
Upvotes: -1
Reputation: 486
Yes, multiplying the number of steps by a constant has no effect on the Big-Oh notation of complexity.
Also, the N^2 term dominates the N log(N) term as N gets large, which is exactly what Big-Oh notation is designed to communicate.
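A quick numeric sketch of that dominance (my own illustration): the ratio n^2 / (n log n) = n / log n grows without bound.

```python
import math

# n^2 dominates n*log(n): their ratio grows without bound as n increases.
for n in [10, 1_000, 1_000_000]:
    ratio = n**2 / (n * math.log2(n))
    print(n, round(ratio, 1))
# The ratio keeps growing, so for large N the N^2 term alone
# determines the Big-O class of N log(N) + N^2.
```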
Upvotes: 0
Reputation: 50149
Yes, here is my logic:
O(Cn(log n + Cn))
Remove constants
= O(n(log n + n))
Split multiplication
= O(n * log n + n^2)
Remove lesser term
= O(n^2)
It does not matter if C is "very large"; we only care about n in Big-O notation, as it is the growing term. When n gets large enough (approaching infinity, for example), C becomes meaningless. In practice C may make a huge impact, but this is abstracted away in Big-O.
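The derivation above can be checked numerically (a sketch of my own; C = 7 is an arbitrary choice): steps(n)/n^2 settles toward the constant C^2, confirming that C only shifts the bound, not the growth class.

```python
import math

C = 7  # arbitrary constant; even a "very large" C would not change the class

def steps(n):
    # C*n*(log n + C*n), the expression the derivation starts from
    return C * n * (math.log2(n) + C * n)

# steps(n)/n^2 approaches the constant C^2, so steps(n) is O(n^2).
for n in [100, 10_000, 1_000_000]:
    print(n, steps(n) / n**2)
```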
Upvotes: 1