Malice

Reputation: 348

Example of an algorithm whose runtime is O(n^2 log n)?

I have to construct an algorithm whose upper bound is O(n^2 log n). Can anyone provide examples of what an O(n^2 log n) algorithm would look like? I cannot seem to wrap my mind around it.

My mental image of it is two nested for loops, with a log n operation performed inside the inner loop. Is this correct?

Upvotes: 0

Views: 2465

Answers (3)

templatetypedef

Reputation: 373052

There are many ways to get a runtime of O(n^2 log n) in an algorithm. Here's a sampler.

  • Sorting a list of n^2 items efficiently, as in the first sketch below. For example, if you take n items, form all n^2 pairs of those items, and then sort them using something like heapsort, the runtime will be O(n^2 log n^2) = O(n^2 log n). This follows from properties of logarithms: log n^2 = 2 log n = O(log n). More generally, running an O(n log n)-time algorithm on an input of size n^2 will give you an O(n^2 log n) runtime.
  • Running Dijkstra's algorithm on a dense graph using a binary heap (see the second sketch below). The runtime of Dijkstra's algorithm on a graph with n nodes and m edges, using a binary heap, is O(m log n). A dense graph is one where m = Θ(n^2), so Dijkstra's algorithm would take time O(n^2 log n) in this case. This is also the time bound for running some other graph algorithms on dense graphs, such as Prim's algorithm when using a binary heap.
  • Certain divide-and-conquer algorithms. A divide-and-conquer algorithm whose recurrence is T(n) = 2T(n / √2) + O(n^2) has a runtime of O(n^2 log n): each level of the recursion does Θ(n^2) total work, since 2(n / √2)^2 = n^2, and there are Θ(log n) levels. This comes up, for example, as a subroutine in the Karger-Stein minimum cut algorithm.
  • Performing n^2 searches on a binary tree of n items. The cost of each search is O(log n), so this would work out to O(n^2 log n) total work. More generally, doing any O(log n)-time operation a total of O(n^2) times will give you this bound.
  • Naive construction of a suffix array (see the third sketch below). A suffix array is an array of all the suffixes of a string in sorted order. Naively sorting the suffixes requires O(n log n) comparisons, but since comparing two suffixes can take time O(n), the total cost is O(n^2 log n).
  • Constructing a 2D range tree. The range tree data structure allows for fast querying of all points in k-D space within an axis-aligned box. In two dimensions, the construction time is O(n^2 log n), though this can be improved to O(n log n) using some more clever techniques.

This is, of course, not a comprehensive list, but it gives a sampler of where O(n^2 log n) runtimes pop up in practice.
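
Here's a minimal Python sketch of the pair-sorting example. (Python's built-in sorted uses Timsort rather than heapsort, but it likewise makes O(m log m) comparisons on m elements, so the bound is the same; the function name is just for illustration.)

import itertools

def sort_all_pairs(items):
    # Form all n^2 ordered pairs of the n items, then sort them.
    # sorted() costs O(m log m) on m = n^2 elements, which is
    # O(n^2 log n^2) = O(n^2 log n).
    pairs = list(itertools.product(items, repeat=2))
    return sorted(pairs)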
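The Dijkstra bound shows up in a standard binary-heap implementation like this sketch, assuming adj maps each node to a list of (neighbor, weight) pairs:

import heapq

def dijkstra(adj, source):
    # Binary-heap Dijkstra with lazy deletion: O(m log n) overall,
    # which is O(n^2 log n) on a dense graph where m = Θ(n^2).
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # stale entry; a shorter path to u was already found
        for v, w in adj[u]:
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist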
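And the naive suffix array construction is only a couple of lines; the sort makes O(n log n) comparisons, each of which can scan up to O(n) characters:

def naive_suffix_array(s):
    # Sort the starting indexes of all n suffixes by the suffix text.
    # Comparing two suffixes takes O(n), so the total is O(n^2 log n).
    return sorted(range(len(s)), key=lambda i: s[i:])

# naive_suffix_array("banana") returns [5, 3, 1, 0, 4, 2]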

Upvotes: 6

Nolan

Reputation: 1074

Technically, any algorithm whose runtime grows asymptotically no faster than n^2 log n is O(n^2 log n), since big-O only gives an upper bound. Examples include the "do nothing" algorithm (Θ(1)), binary search (Θ(log n)), linear search (Θ(n)), and bubble sort (Θ(n^2)).

The algorithm you describe would be O(n^2 log n) too, while also being Ω(n^2 log n) and thus Θ(n^2 log n):

from bisect import bisect_left

n, a = 1000, list(range(1000))  # a is a sorted array of size n
for i in range(n):
    for j in range(n):
        bisect_left(a, j)       # binary search: Θ(log n) per lookup

Upvotes: 3

Bill the Lizard

Reputation: 405985

One approach to constructing an O(n^2 log n) algorithm is to start with an O(n^3) algorithm and optimize it so that one of the loops runs in log n steps instead of n.

That could be non-trivial, though, so a Google search turns up the question Why is the Big-O of this algorithm N^2*log N? The problem there is:

Fill array a from a[0] to a[n-1]: generate random numbers until you get one that is not already in the previous indexes.

Even though there are faster algorithms to solve this problem, the one presented is O(n^2 log n).
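
A minimal Python sketch of that algorithm, assuming the values are drawn from range(n) so the array can eventually be filled with distinct entries:

import random

def fill_distinct(n):
    # Filling slot i takes an expected n/(n-i) random draws, and each
    # draw is checked against the i values already placed with an O(i)
    # linear scan, so the expected total work is O(n^2 log n).
    a = []
    for i in range(n):
        x = random.randrange(n)
        while x in a:  # O(i) membership test over the previous indexes
            x = random.randrange(n)
        a.append(x)
    return a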

Upvotes: 1
