Floern

Reputation: 33904

Are there any real O(n^n) algorithms?

Is there any real algorithm with a time complexity of O(n^n) that isn't just a gimmick?

I can create such an algorithm, like computing n^n in O(n^n) / Θ(n^n):

// computes n^m by summing n copies of n^(m-1), i.e. Θ(n^m) additions in total
long n_to_the_power_of_m(int n, int m) {
    if(m == 0) return 1;
    long sum = 0;
    for(int i = 0; i < n; ++i)
        sum += n_to_the_power_of_m(n, m-1);
    return sum;
}

(needs more than 4 minutes to compute 10^10)

Or the other way around: are there any problems that cannot be solved better than in O(n^n)?

Upvotes: 39

Views: 44793

Answers (6)

Quack E. Duck

Reputation: 678

Maybe not the most practical application, but still interesting: if you draw n^n dots (which of course is O(n^n)) with the following criteria:

  • Form a polygon with n vertices
  • Starting at each of these vertices, form another (smaller) polygon with n vertices
  • Repeat step 2 for n iterations
  • Only draw dots on the nth iteration

you will end up with a really neat fractal pattern, dependent on the scale factor used to decrease polygon size each iteration.

If n = 6, this simple dot-drawing algorithm produces a pattern that draws many instances of the Koch snowflake fractal in the negative space between the dots (you can see this in the example Processing.js program below).

Try experimenting with different values of n and scale factors (although you'll probably be limited to an upper bound of around n = 8 or 9 by browser performance constraints), and you can get some of the other fractals described here: https://en.wikipedia.org/wiki/N-flake#Hexaflake

(The one in the example is called a Sierpinski hexagon.)

<!DOCTYPE html>

<html>
<head>
    <title>Sierpinski Hexagon</title>
    <style>
        body {
            background-color: darkblue;
        }
        #canvasDiv {
            margin-left: 1%;
            margin-right: 1%;
            text-align: center;
        }
    </style>
</head>

<body>
    <div id="canvasDiv">
        <canvas id="_canvas"></canvas>
    </div>
</body>
 
<script src="https://cdn.jsdelivr.net/processing.js/1.4.8/processing.min.js"></script>
 
<script>
    var canvasWidth = 900;
    var canvasHeight = 800;
    
    var X = canvasWidth/2;
    var Y = canvasHeight/2;
    
    var N = 6;
    var R = 250;
    
    // For N = 6, Koch snowflake fractal pattern is most clearly visible with scaleDown = 0.42
    // 0.5 is just a plain hexagon
    // 0.45, 0.48, and 0.7 are interesting
    // 0.33 is very crisp, but many dots overlap
    var scaleDown = 0.42;
    
    
    var toRadians = function(degrees) {
        return degrees * Math.PI / 180;
    };
    
    var polygonVertices = function(p, n, centerX, centerY, r, clr) {
        p.strokeWeight(1);
        p.stroke(clr[0], clr[1], clr[2]);
        var theta = 360/n;
        for (var v = 0; v < n; v++) {
            p.point(centerX + r * Math.cos(toRadians(theta * v)), centerY + r * Math.sin(toRadians(theta * v)));
        }
    };
    
    var hyperfactorial = function(p, n, centerX, centerY, r, scaleDownFactor, depth) {
        if (depth == n) {
            polygonVertices(p, n, centerX, centerY, r, [Math.abs(X - centerX) * 500 / canvasWidth, Math.abs(Y - centerY) * 500 / canvasWidth, 255]);
            return;
        }
        else {
            var theta = 360/n;
            for (var i = 0; i < n; i++) {
                hyperfactorial(p, n, centerX + r * Math.cos(toRadians(theta * i)), centerY + r * Math.sin(toRadians(theta * i)), r*scaleDownFactor, scaleDownFactor, depth + 1);
            }
        }
    };
    
    var applyProcessing = function(p) {
        p.setup = function() {
            p.size(canvasWidth, canvasHeight);
            p.background(0, 0, 40);
            hyperfactorial(p, N, X, Y, R, scaleDown, 1);
        };
    };
    
    var canvas = document.getElementById("_canvas");
    var pInstance = new Processing(canvas, applyProcessing);
    
</script>

</html>

Upvotes: 0

Paul Hankin

Reputation: 58339

The program that takes a description of a (terminating) Turing machine and returns the number of steps it takes to terminate. This is a relatively simple program to write -- it can simply emulate the Turing machine and count the steps.

The complexity of this program has no computable upper bound (and in particular grows faster than any computable function), so it certainly grows faster than O(n^n).

The worst-case run-time on an input of size n is BB(n), the Busy Beaver sequence, which starts 0, 1, 4, 6, 13, is unknown after this (although lower bounds exist -- for example the next two values are at least 47176870 and 7.412×10^36534 respectively), and is uncomputable for n large enough.
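
For concreteness, a step-counting emulator could look something like this sketch (my own illustration, not code from the answer; the transition-table encoding and the 2-state busy beaver used to exercise it are assumptions made for the example):

import java.util.HashMap;
import java.util.Map;

class StepCounter {
    static final int HALT = -1;

    // transition[state][symbol] = {writeSymbol, move (-1 = left, +1 = right), nextState}
    static long stepsToHalt(int[][][] transition) {
        Map<Long, Integer> tape = new HashMap<>(); // sparse tape, blank symbol is 0
        long head = 0;
        int state = 0;
        long steps = 0;
        while (state != HALT) {
            int symbol = tape.getOrDefault(head, 0);
            int[] rule = transition[state][symbol];
            tape.put(head, rule[0]);
            head += rule[1];
            state = rule[2];
            steps++; // count every emulated step
        }
        return steps;
    }

    public static void main(String[] args) {
        // 2-state, 2-symbol busy beaver champion; it halts after 6 steps
        int[][][] bb2 = {
            { {1, +1, 1}, {1, -1, 1} },   // state A: on 0 write 1, move right, go to B; on 1 write 1, move left, go to B
            { {1, -1, 0}, {1, +1, HALT} } // state B: on 0 write 1, move left, go to A; on 1 write 1, move right, halt
        };
        System.out.println(stepsToHalt(bb2)); // prints 6
    }
}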

Upvotes: 2

Himadri Choudhury

Reputation: 10353

What you have coded in your example is very similar to a depth-first search. So that's one answer.

A depth-first search algorithm without any special characteristics (like re-convergent paths that can be optimized out) should be n^n.

This is actually not a contrived example. Chess programs operate on the same algorithm: each move there are n moves to consider (i.e. branches), and you search d moves deep, so that becomes O(n^d).
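
As a rough sketch (not taken from any real chess engine; the move count and the evaluation are placeholders), the exhaustive search looks like this, and with depth d = n it visits exactly n^n leaves:

class GameTreeSearch {
    static long leavesVisited = 0;

    // Exhaustive negamax-style search with a fixed branching factor n.
    static int search(int n, int depth) {
        if (depth == 0) {
            leavesVisited++; // one of the n^d leaf positions
            return 0;        // stand-in for a static evaluation of the position
        }
        int best = Integer.MIN_VALUE;
        for (int move = 0; move < n; move++) {            // n candidate moves per ply
            best = Math.max(best, -search(n, depth - 1)); // look d plies ahead
        }
        return best;
    }

    public static void main(String[] args) {
        int n = 6;
        search(n, n);                      // depth d = n gives n^n leaves
        System.out.println(leavesVisited); // prints 46656 (= 6^6)
    }
}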

Upvotes: 26

Ted Hopp

Reputation: 234847

There are computations (for instance, tetration) where the output size is O(n^n). It's kind of hard to compute them with time complexity less than O(n^n).
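
For example (a small sketch of my own, not something the answer prescribes), computing the height-3 tetration n^(n^n) with BigInteger yields a result with roughly n^n * log10(n) digits, so merely writing the result down already takes at least n^n steps:

import java.math.BigInteger;

class Tetration {
    // n tetrated to height 3, i.e. n^(n^n); the result alone has about n^n * log10(n) digits
    static BigInteger tetrate3(int n) {
        BigInteger base = BigInteger.valueOf(n);
        int inner = base.pow(n).intValueExact(); // n^n (only fits in an int for small n)
        return base.pow(inner);                  // n^(n^n)
    }

    public static void main(String[] args) {
        System.out.println(tetrate3(3).toString().length()); // prints 13: 3^27 has 13 digits
    }
}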

Upvotes: 11

kennytm

Reputation: 523614

According to Wikipedia, there are some double exponential time problems, O(2^(2^poly(n))), which are more complex than O(n^n), e.g. "Decision procedures for Presburger arithmetic" (O(2^(2^(cn)))) and "Computing a Gröbner basis" (in the worst case O(2^(2^(n/10)))).

Upvotes: 6

x4u

Reputation: 14077

There are many optimization problems that are essentially O(n!), e.g. in data compression. The common algorithms for these all need to cheat one way or another (many rely on heuristics) and therefore can't guarantee that they have found the optimal result. Choosing the optimal line filters during the compression of a PNG image is such a problem, and one that is comparatively easy to understand.

Another example is algorithms for breaking encryption, which can potentially be even worse than O(n!).
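
To make the factorial blow-up concrete, here is a rough sketch (my own illustration, not code from any real compressor; cost() is a hypothetical placeholder objective) of the exhaustive search such a problem would require:

import java.util.ArrayList;
import java.util.List;

class ExhaustiveOrdering {
    static double bestCost = Double.MAX_VALUE;

    // Placeholder objective; a real compressor would score the encoded size of a choice.
    static double cost(List<Integer> ordering) {
        return ordering.get(ordering.size() - 1);
    }

    // Recursively tries all n! orderings of the items in 'remaining'.
    static void permute(List<Integer> remaining, List<Integer> chosen) {
        if (remaining.isEmpty()) {
            bestCost = Math.min(bestCost, cost(chosen)); // evaluate one complete ordering
            return;
        }
        for (int i = 0; i < remaining.size(); i++) {
            List<Integer> rest = new ArrayList<>(remaining);
            int item = rest.remove(i);
            List<Integer> next = new ArrayList<>(chosen);
            next.add(item);
            permute(rest, next);
        }
    }

    public static void main(String[] args) {
        permute(new ArrayList<>(List.of(0, 1, 2, 3)), new ArrayList<>());
        System.out.println(bestCost); // 0.0, the best of the 4! = 24 orderings
    }
}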

Upvotes: 3
