Reputation: 4072
I've seen a few interesting discussions recently debating whether a given ("hard") problem has, at best, a known 2^n or n! solution.
My question is: aside from actually walking through the algorithm and observing its growth rate, is there a heuristic for quickly spotting one versus the other? I.e., are there certain quickly observable properties of an algorithm that make it obviously one or the other?
Related discussions:
Upvotes: 4
Views: 302
Reputation: 7817
As amit mentioned, it is theoretically not possible to check whether an algorithm is O(2^n) or O(n!). However, you can use the following heuristics:
Upvotes: 1
Reputation: 178491
There is no algorithm that can determine the complexity of a program [at all]. It follows from the Halting Problem - you cannot determine whether a certain algorithm will stop or not. [You cannot even estimate whether it is Theta(infinity) or anything less than that.]
As a rule of thumb: O(n!) algorithms usually invoke a recursive call in a loop with a decreasing range, while O(2^n) algorithms invoke a recursive call twice in each call.
Note: Not all algorithms that invoke a recursive call twice are O(2^n) - quicksort is a good example of an O(nlogn) algorithm that also invokes a recursive call twice.
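To make the rule of thumb concrete, here is a small Python sketch (my own illustration, not from the original answer) that simply counts how many calls each structural pattern makes. The "two calls per invocation" pattern yields 2^(n+1) - 1 calls, while the "call in a loop over a shrinking range" pattern grows like n!.

```python
def two_calls(n):
    """Pattern: two recursive calls per invocation -> Theta(2^n) calls."""
    if n == 0:
        return 1
    # 1 for this call, plus the two recursive subtrees
    return 1 + two_calls(n - 1) + two_calls(n - 1)

def call_in_loop(n):
    """Pattern: a recursive call in a loop over a shrinking range -> Theta(n!) calls."""
    if n == 0:
        return 1
    # 1 for this call, plus n recursive subtrees of size n-1
    return 1 + sum(call_in_loop(n - 1) for _ in range(n))

print(two_calls(5))      # 63  (= 2^6 - 1)
print(call_in_loop(5))   # 326 (roughly e * 5! = e * 120)
```

The recurrences behind the counts are T(n) = 2T(n-1) + 1 for the first pattern and T(n) = n*T(n-1) + 1 for the second, which solve to Theta(2^n) and Theta(n!) respectively.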
EDIT: For example, a brute-force SAT solver, O(2^n):
SAT(formula, vars, i):
    if i == vars.length:
        return formula.isSatisfied(vars)
    vars[i] = true
    temp = SAT(formula, vars, i+1) // first recursive call
    if temp == true:
        return true
    vars[i] = false
    return SAT(formula, vars, i+1) // second recursive call
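A runnable Python sketch of the same brute-force search (the formula representation here is my own assumption: a list of clauses, each clause a list of (variable_index, polarity) literals, where polarity True means the variable appears unnegated):

```python
def is_satisfied(formula, assignment):
    # A CNF formula holds iff every clause has at least one satisfied literal.
    return all(any(assignment[v] == pol for v, pol in clause)
               for clause in formula)

def sat(formula, assignment, i):
    if i == len(assignment):
        return is_satisfied(formula, assignment)
    assignment[i] = True
    if sat(formula, assignment, i + 1):    # first recursive call
        return True
    assignment[i] = False
    return sat(formula, assignment, i + 1)  # second recursive call

# (x0 OR NOT x1) AND (x1 OR x2)
formula = [[(0, True), (1, False)], [(1, True), (2, True)]]
print(sat(formula, [False] * 3, 0))  # prints True (the formula is satisfiable)
```

Two recursive calls per variable, n variables: 2^n leaf assignments in the worst case.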
Find all permutations: O(n!)
permutations(source, sol):
    if source.length == 0:
        print sol
        return
    for each e in source:
        sol.append(e)
        source.remove(e)
        permutations(source, sol) // recursive call in a loop
        source.add(e)
        sol.removeLast()
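A runnable Python version of the permutations pseudocode (one adjustment for real code: iterate over a snapshot of `source`, since removing from a list while iterating over it is a bug in Python; I also collect results in a list instead of printing):

```python
def permutations(source, sol, out):
    if not source:
        out.append(sol[:])          # record a complete permutation
        return
    for e in list(source):          # iterate over a snapshot of source
        sol.append(e)
        source.remove(e)
        permutations(source, sol, out)  # recursive call in a loop
        source.append(e)            # undo the removal
        sol.pop()                   # undo the append

result = []
permutations([1, 2, 3], [], result)
print(len(result))  # prints 6 (= 3!)
```

The loop runs over n elements at the top level, n-1 at the next, and so on: n * (n-1) * ... * 1 = n! leaves.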
Upvotes: 6