Doug Smith

Reputation: 29326

How do I determine the time complexity of an algorithm when the constant seemingly does affect the complexity, even though I was taught constants are ignored?

Take this example: Wolfram|Alpha

If I have n * lg(n) / lg(lg(n))

compared against

5000n

For every value of n I check, 5000n is always higher. But if I took away the constant, the opposite would be true. I always thought that constant multipliers are ignored in the grand scheme of things (5000n would just be considered n), which would lead to 5000n being considered smaller, when it seems to be bigger.

When asked which algorithm has the worse time complexity, how should I answer?

Upvotes: 0

Views: 171

Answers (2)

Patricia Shanahan

Reputation: 26185

Constant multipliers are deliberately ignored in asymptotic complexity analysis. For the abstract analysis of algorithms it would be very difficult to count them: they depend on practical aspects of computer behavior that change over time.

For picking an algorithm, asymptotic complexity is only one measure to consider. It is also important to estimate or measure the actual time for expected problem sizes. Asymptotic complexity gives a warning if the algorithm is going to get drastically slower as the problem size increases.
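To illustrate the point about measuring actual cost at expected problem sizes, here is a small sketch (my own, not from the answer) that evaluates both expressions from the question at a realistic n. The function names are made up for illustration, and `lg` is taken as the base-2 logarithm, as in the question:

```python
import math

def cost_linear(n):
    # the 5000n expression from the question
    return 5000 * n

def cost_loglog(n):
    # the n * lg(n) / lg(lg(n)) expression, using base-2 logs
    return n * math.log2(n) / math.log2(math.log2(n))

# At a practical problem size, the asymptotically "worse" function
# is actually far cheaper:
n = 10**6
print(cost_linear(n))   # 5000000000
print(cost_loglog(n))   # roughly 4.6 million, about 1000x smaller
```

So for any problem size you are likely to meet, the algorithm with the higher asymptotic complexity here would actually run faster, which is why measuring real times matters.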

Upvotes: 0

Dennis Meng

Reputation: 5187

The thing to remember is that when people compare complexity, they almost always mean asymptotic complexity: which function is greater as n gets really big.

In this case, although 5000n > n * log(n) / log(log(n)) for basically any reasonable value of n, the latter still has the higher asymptotic complexity.

Plugging 5000 < log(n) / log(log(n)) into Wolfram|Alpha gave an answer around n = 2*10^23683 (click on "Approximate Forms"), and sure enough, 5000n < n * log(n) / log(log(n)) for n = 2*10^23683.
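As a rough sanity check of that crossover (my own sketch, not part of the answer), you can work with ln(n) directly instead of n itself, so the astronomically large numbers never need to be built. Natural logs are used here for convenience; the base cancels out of the ratio, so the comparison is unchanged:

```python
import math

def log_ratio(ln_n):
    # computes log(n) / log(log(n)) given ln(n),
    # so n itself never has to be represented
    return ln_n / math.log(ln_n)

# At n = 10**6 the ratio is tiny, so 5000n dominates...
print(log_ratio(6 * math.log(10)))       # about 5.3

# ...but by n = 10**24000, safely past the crossover near 2*10^23683,
# the ratio exceeds 5000, so n*log(n)/log(log(n)) finally overtakes 5000n.
print(log_ratio(24000 * math.log(10)))   # about 5060
```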

Upvotes: 3
