BuggerNot

Reputation: 448

Why are constants not considered in the analysis of algorithm efficiency?

Multiplicative constants are not considered in analysis of algorithm time efficiency because
A) they cancel out when computing efficiency functions
B) constant functions grow very slowly with input size growth
C) they have a small effect when input size is small
D) they can be overcome by faster machines
E) they do not affect the actual run time of the algorithm

My guess is "B", but I don't know the correct answer. Are all options incorrect?

Upvotes: 1

Views: 1085

Answers (2)

npostavs

Reputation: 5027

I think the answer is D:

Multiplicative constants are not considered in analysis of algorithm time efficiency because D) they can be overcome by faster machines

Machines are becoming faster, giving constant-factor speedups that overcome the multiplicative constants, hence we can ignore those constants in the analysis.

I'd rather say we ignore multiplicative constants because they depend on the particular machine, but for a multiple-choice question we have to pick the best answer offered.
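
A rough numeric sketch of that "constant factor" argument (the constant c, the speedup factor k, and the cost model below are made-up values, purely for illustration):

```python
import math

# Illustrative only: a machine that is k times faster simply divides the
# multiplicative constant by k, so a hardware speedup and a multiplicative
# constant play exactly the same role in the cost model.
def steps(n, c):
    # hypothetical cost model: c * n^2 elementary steps for an O(n^2) algorithm
    return c * n * n

c, k, n = 7, 3, 10_000
time_on_fast_machine = steps(n, c) / k        # same work, machine k times faster
time_with_smaller_constant = steps(n, c / k)  # same algorithm, constant c/k

print(math.isclose(time_on_fast_machine, time_with_smaller_constant))  # True
```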

Upvotes: 1

So here's my comment extended to an answer:

B) constant functions grow very slowly with input size growth

This doesn't make sense: a constant function doesn't grow at all. However, we are not talking about constant run-time functions here, but about constant coefficients that may occur when estimating the actual number of "steps" from the asymptotic complexity of an algorithm.

In asymptotic analysis, however, we do not care about the exact number of steps, only about the limit of the ratio of running times, viewed as a function of the input size, as the input size goes to infinity.

E.g. O(n^2) means that if you double the input size, the running time will be approximately 4 times the original; if you triple the input size, it will be 9 times the original, etc. It does not say that the execution will take exactly "4 steps" or "9 steps".
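
A minimal sketch of that ratio, counting the iterations of a toy quadratic loop instead of measuring actual running time (the step count is only a stand-in for real execution cost):

```python
# Count the "steps" of a toy O(n^2) algorithm (visiting every pair of indices)
# and see how the count changes when the input size doubles or triples.
def quadratic_steps(n):
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1          # one elementary "step" per pair (i, j)
    return steps

base = quadratic_steps(100)
print(quadratic_steps(200) / base)   # 4.0  (doubled input)
print(quadratic_steps(300) / base)   # 9.0  (tripled input)
```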

C) they have a small effect when input size is small

No, they have rather significant effects when the input size is small. Again, we are considering the limit as the input size approaches infinity. Any constant is asymptotically negligible compared to any non-constant monotonically growing function of n as n goes to infinity.

When n is small, constants can have a tremendous effect on execution times. For example, there are all sorts of interesting and clever data structures, but if we only have small amounts of data, we often prefer arrays over e.g. a binary tree or a linked list, even for frequent insertion. The good cache locality of the array makes its constant factor so small that the theoretically O(n) insertion may well be a lot faster than an O(log n) insertion into a tree.
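
A rough, machine-dependent illustration of that trade-off (the input size, the plain unbalanced BST, and the repetition counts below are my own assumptions, not anything from the question; exact timings will vary):

```python
# Hedged micro-benchmark: for small inputs, keeping a sorted Python list
# (array-backed, O(n) insertion done as a C-level shift) is often faster than
# a pointer-based binary search tree (O(log n) expected insertion for random
# keys), because the tree pays a much larger constant cost per step.
import bisect, random, timeit

class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    if root is None:
        return Node(key)
    cur = root
    while True:
        if key < cur.key:
            if cur.left is None:
                cur.left = Node(key)
                return root
            cur = cur.left
        else:
            if cur.right is None:
                cur.right = Node(key)
                return root
            cur = cur.right

def build_list(keys):
    out = []
    for k in keys:
        bisect.insort(out, k)       # O(n) shift, but on a contiguous array in C
    return out

def build_bst(keys):
    root = None
    for k in keys:
        root = bst_insert(root, k)  # O(log n) expected, but Python-level pointer chasing
    return root

keys = [random.random() for _ in range(200)]   # a "small" input
print("sorted list:", timeit.timeit(lambda: build_list(keys), number=200))
print("binary tree:", timeit.timeit(lambda: build_bst(keys), number=200))
```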

D) they can be overcome by faster machines

This answer completely misses the point: asymptotic analysis of algorithms has nothing to do with how fast physical machines are. Yes, machines are becoming faster over time, but again, that's just a constant factor. If you run a program for an O(n^2) algorithm on a faster machine, doubling the input size will still take roughly 4 times the CPU time.
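
A tiny sketch under an assumed cost model T(n) = c * n^2, where a faster machine only shrinks the constant c:

```python
# Whatever constant-factor speedup the machine provides, doubling the input
# still quadruples the running time under this assumed quadratic cost model.
def running_time(n, c):
    return c * n * n

for c in (10.0, 1.0, 0.1):          # slower and faster "machines"
    print(running_time(2000, c) / running_time(1000, c))   # always 4.0
```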

E) they do not affect the actual run time of the algorithm

That's also wrong; they absolutely do.

So the only remaining answer is A, which may be correct if interpreted as in my explanation above (relating to ratios of execution times), but I would have phrased it quite differently for sure.

Upvotes: 5
