Reputation: 101
Consider two algorithms, A and B. These algorithms both solve the same problem, and their time complexities (in terms of the number of elementary operations they perform) are given respectively by
a(n) = 9n+6
b(n) = 2(n^2)+1
(i) Which algorithm is the best asymptotically?
(ii) Which is the best for small input sizes n, and for what values of n is this the case? (You may assume where necessary that n>0.)
I think it's A. Am I right?
And what's the answer for part (ii)? What exactly do they want?
Upvotes: 6
Views: 866
Reputation: 108356
I think plotting those functions would be very helpful to understand what's going on.
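For instance, a minimal sketch in Python with matplotlib (the dashed line marks the crossover point discussed in the other answers):

```python
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(0, 10, 200)
plt.plot(n, 9 * n + 6, label="a(n) = 9n + 6")
plt.plot(n, 2 * n**2 + 1, label="b(n) = 2n^2 + 1")
plt.axvline(5, linestyle="--", color="gray")  # the two curves cross at n = 5
plt.xlabel("n")
plt.ylabel("elementary operations")
plt.legend()
plt.show()
```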
Upvotes: 7
Reputation: 39277
The answer to the first part is obviously (a), because O(n) grows more slowly than O(n*n).
The answer to the second part is that you cannot say "Which is the best for small input sizes" because you give no information about what the 'elementary operations' are in each case, nor how long each of those operations takes. The compiler or the CPU could apply some optimization which makes the so-called slower algorithm perform MUCH faster than the 'better' algorithm at small input sizes.
And by 'small' input sizes, that could mean 10, 100 or 1 million+! The crossover between a clever O(n) algorithm and a dumb O(n*n) algorithm can be huge, because compilers and CPUs are great at running dumb code really fast.
Many people make the mistake of 'optimizing' their code on the basis of O() without considering the size of the input, nor testing how well each performs with the data sets they will be using.
Of course that doesn't mean you should always write dumb code when there is a much better algorithm or data structure that can do it in O(n) time, but it does mean you should think carefully before investing work to create a much cleverer (but much harder to maintain) algorithm that you think is optimal because it has a better O().
Upvotes: 0
Reputation: 1
It's obvious. If we have:

* 9n+6 => n=5 => 9*5+6 = 45+6 = 51
* 2(n^2)+1 => n=5 => 2(5*5)+1 = 2*25+1 = 50+1 = 51

then both cost the same at n=5, and beyond that point 9n+6 is much better.
Upvotes: 0
Reputation: 4453
What I would do is run both algorithms several times for the cases you described (and on computers with different processors), recording the time when you start and the time you finish. Then look at the difference between those times for each of your cases.
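For example, here is a minimal sketch of that approach in Python; algo_a and algo_b are hypothetical stand-ins whose operation counts mimic 9n+6 and 2(n^2)+1, not real implementations:

```python
import time

def algo_a(n):
    # Hypothetical stand-in performing roughly 9n + 6 elementary operations.
    total = 0
    for _ in range(9 * n + 6):
        total += 1
    return total

def algo_b(n):
    # Hypothetical stand-in performing roughly 2n^2 + 1 elementary operations.
    total = 0
    for _ in range(2 * n * n + 1):
        total += 1
    return total

for n in (5, 100, 1000):
    start = time.perf_counter()   # time when you start A
    algo_a(n)
    mid = time.perf_counter()     # A finished, B starts
    algo_b(n)
    end = time.perf_counter()     # B finished
    print(f"n={n}: A took {mid - start:.6f}s, B took {end - mid:.6f}s")
```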
Upvotes: 0
Reputation: 34621
Asymptotically, a(n) is better since it's O(n) as opposed to O(n^2). Here is a plot of the running time as a function of n. As you can see, a(n) is only slower for small values of n.
Upvotes: 0
Reputation: 655599
Which algorithm is the best asymptotically?
To answer this question, you just need to take a look at the exponents of n in both functions: asymptotically, n^2 will grow faster than n. So A ∈ O(n) is asymptotically a better choice than B ∈ O(n^2).
Which is the best for small input sizes n, and for what values of n is this the case? (You may assume where necessary that n>0.)
To answer this question, you need to find the point of intersection where both functions have the same value. And for n=5 both functions evaluate to 51 (see 9n+6=2(n^2)+1 on Wolfram Alpha). And since A(4)=42 and B(4)=33, B is the better choice for n < 5.
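A quick numerical check of that crossover, as a sketch in Python:

```python
# Cost functions from the question.
a = lambda n: 9 * n + 6
b = lambda n: 2 * n**2 + 1

for n in range(1, 8):
    cheaper = "tie" if a(n) == b(n) else ("B" if b(n) < a(n) else "A")
    print(f"n={n}: a(n)={a(n)}, b(n)={b(n)} -> cheaper: {cheaper}")
# B is cheaper for n < 5, both cost 51 at n = 5, and A is cheaper for n > 5.
```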
Upvotes: 13
Reputation: 5667
Asymptotically, O(n) is better (cheaper) than O(n^2).
For small values of n, this is a simple algebra problem:
Find n for which 9n+6 = 2(n^2)+1. Cleaning it up, we get the quadratic equation 2(n^2)-9n-5 = 0. This yields n=5 (the other root is negative and can be discarded), which means that, for n=5, both processes have the same cost.
This means that B is better for n<5, they are equal for n=5, and A is better for n>5. If you expect n to be smaller than 5 in the vast majority of cases, then B may be a better choice, but that will only be relevant if the algorithm is used a lot. And if you implement it as a function, the minor benefits of B pale against the call overhead, so they will hardly be noticeable.
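As a small sketch of that algebra in Python, using the quadratic formula:

```python
import math

# Roots of 2n^2 - 9n - 5 = 0.
a, b, c = 2, -9, -5
disc = b * b - 4 * a * c  # discriminant: 81 + 40 = 121
roots = ((-b + math.sqrt(disc)) / (2 * a), (-b - math.sqrt(disc)) / (2 * a))
print(roots)  # (5.0, -0.5); only the positive root, n = 5, is meaningful here
```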
In summary: unless you are very sure of what you're up to, go with A. In general, you want the algorithm with the better (cheaper) asymptotic cost. Only when both have the same asymptotic order, or when you have reliable knowledge about the input data you will be getting, is deeper insight worth the effort, and even then the best approach is to benchmark both versions with realistic data rather than rely on theoretical analysis.
Upvotes: 1
Reputation: 2265
Simple graphing software, or simple algebra, will show that 9n+6 pulls ahead quite quickly: the two functions are equal at n=5, and for larger inputs 9n+6 is faster.
Upvotes: 1
Reputation: 6450
9n + 6 IS the best.
Take this example: if your n is 10, then
9n + 6 = 96
2(n^2) + 1 = 201
Now take n = 100:
9n + 6 = 906
2(n^2) + 1 = 20001
and it goes on and on...
If n = 4, then
9n + 6 = 42
2(n^2) + 1 = 33
Conclusion: the second one is better if n <= 4, they are equal at n = 5, and the first is better from n = 6 onward.
BTW, when calculating the complexity of an algorithm, we usually end up dropping factors and constants because they do not affect the speed difference by much, so the two simplify to a(n) = n and b(n) = n^2, which gives you a clear answer.
Upvotes: 1
Reputation: 370407
By looking at the constants, it's easy to see that a will be larger than b at the beginning. By looking at the occurrences of n (n in a, n^2 in b), you can see that b is larger asymptotically. So we only need to figure out from which point on b is larger than a. To do that, we just need to solve the equation a(n) = b(n) for n.
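For instance, a sketch of solving it symbolically with sympy (assuming the library is available):

```python
from sympy import Eq, solve, symbols

n = symbols("n")
# Solve a(n) = b(n), i.e. 9n + 6 = 2n^2 + 1, for n.
print(solve(Eq(9 * n + 6, 2 * n**2 + 1), n))  # [-1/2, 5]
```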
Upvotes: 1
Reputation: 14479
You should probably start by familiarizing yourself with asymptotics, big O notation, and the like. Asymptotically, a will be better. Why? Because it can be proven that there exists an N such that a(n) < b(n) for all n > N.
Proof left as an exercise for the reader.
Upvotes: 1