Cratylus

Reputation: 54094

Seeking the best algorithm (asymptotically) and ignoring other details

There are cases where a brute-force approach to a problem has a complexity that is not good enough performance-wise.

Let's take, for example, Theta(n^2).

Using a recursive approach, it can be improved to Theta(n log n).

Obviously, asymptotically one would opt for the recursive approach, since for larger and larger input N the order of growth is lower, i.e. better performance.

My question is the following:

If asymptotically, as N becomes larger and larger, the recursive (i.e. divide and conquer) approach performs better, isn't it unrealistic to ignore the fact that as we recurse on huge inputs of N we could eventually run out of stack?
As a result, for huge input we would actually never get a result.

So how can we be sure that we choose the best algorithm for a specific problem if we ignore these details?

Upvotes: 2

Views: 136

Answers (2)

mcdowella

Reputation: 19621

Here is a useful trick that I first saw done in quicksort:

f(x)
{
  for (;;)
  {
    if (x is trivial)
    {
      return;
    }
    x1 = one half of problem x
    x2 = other half of problem x
    if (x1 is the little half)
    {
      x = x2;   // larger half: handled by the next loop iteration
      f(x1);    // smaller half: handled by the recursive call
    }
    else
    {
      x = x1;   // larger half: handled by the next loop iteration
      f(x2);    // smaller half: handled by the recursive call
    }
  }
}

The point of this is that each recursive call is made on the little half of the problem, which is at most half the size of the original problem, while the larger half of the problem is done via iteration. If the problem is of size N = 2^n, the stack need only be of depth n, even if, because of very uneven splits, you end up taking time close to N^2 instead of N log N.
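
To make this concrete, here is a minimal runnable sketch of the same trick applied to quicksort in Python. The Lomuto partition scheme and the function name are my own choices for illustration, not something the answer prescribes: the recursive call always goes to the smaller partition and the larger partition is handled by the loop, so the stack depth stays around log2(n) even when the splits are very uneven.

def quicksort(a, lo=0, hi=None):
    """Sort a[lo:hi+1] in place, recursing only on the smaller partition."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        # Lomuto partition around the last element as pivot.
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]

        # Recurse on the smaller side, loop on the larger side,
        # so the stack depth is bounded by about log2(n).
        if i - lo < hi - i:
            quicksort(a, lo, i - 1)
            lo = i + 1          # continue the loop on the right part
        else:
            quicksort(a, i + 1, hi)
            hi = i - 1          # continue the loop on the left part

data = [5, 2, 9, 1, 5, 6]
quicksort(data)
print(data)   # [1, 2, 5, 5, 6, 9]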

As far as the "real cost" of algorithms that use n log n time but a lot of stack is concerned, note that an algorithm that runs in n log n time does not have the time to use more than n log n memory, so a deep stack cannot actually amount to a huge quantity of memory, although it might be enough to make e.g. a Java runtime abort your program under the impression that you have runaway recursion.

Upvotes: 0

Szabolcs

Reputation: 25703

It is possible to implement recursive algorithms without using the hardware stack. If the possibility of a stack overflow is a serious concern, you can implement your own stack or simply limit the depth of the recursion.
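
As a minimal sketch of the "implement your own stack" option (the tree layout and function names here are my own illustration, not part of the answer), the same depth-first traversal is shown twice below: once using the call stack and once using an explicit list kept on the heap, whose depth is limited only by available memory rather than by the call stack limit.

def visit_recursive(node):
    """Recursive traversal: depth is limited by the interpreter's call stack."""
    if node is None:
        return
    print(node["value"])
    visit_recursive(node["left"])
    visit_recursive(node["right"])

def visit_iterative(root):
    """Same traversal with an explicit stack kept on the heap."""
    stack = [root]
    while stack:
        node = stack.pop()
        if node is None:
            continue
        print(node["value"])
        # Push the right child first so the left subtree is processed first,
        # matching the order of the recursive version.
        stack.append(node["right"])
        stack.append(node["left"])

tree = {"value": 1,
        "left":  {"value": 2, "left": None, "right": None},
        "right": {"value": 3, "left": None, "right": None}}
visit_iterative(tree)   # prints 1, 2, 3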

However, if the algorithm divides a set of data (of size n) recursively, then the number of recursive calls will be proportional to log n. This means that to exhaust a stack of depth s you need data of size on the order of 2^s, which grows extremely quickly with s. People tend not to appreciate how quickly the exponential function grows. My point is that with this type of algorithm (one that splits the data set in each recursive step) it is quite unlikely that the input data can be large enough to require so many recursions as to exhaust the stack.
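
As a rough back-of-the-envelope illustration of this (my own numbers, assuming the data is halved at every step), even astronomically large inputs need only a few dozen levels of recursion:

import math

# Recursion depth needed when an input of size n is halved at every step.
for n in (10**6, 10**12, 10**18):
    print(n, math.ceil(math.log2(n)))
# 1000000 20
# 1000000000000 40
# 1000000000000000000 60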

EDIT: But then I'm not a professional programmer, so I'm lacking on the practical experience front.

Upvotes: 4
