Ryan Delucchi

Reputation: 7748

Regressive n log(n) sorting

I have a sorting algorithm that requires performing an n log n sort n times, with n decrementing by 1 in each step. In other words, I would expect a running time of O(n log n + (n-1) log(n-1) + (n-2) log(n-2) + ...). This problem doesn't strike me as something that hasn't been encountered before, so I must ask:

What is the order-of-magnitude performance of an n log n sort n times, with n decrementing by 1 in each step?

Upvotes: 3

Views: 740

Answers (4)

Khaled.K

Reputation: 5960

Time = SUM { k log k } for k: 1..n = log(H(n)) ~ Θ(log(H(n)))

H(n): the hyperfactorial of n, H(n) = 1^1 · 2^2 · ... · n^n, so log(H(n)) = SUM { k log k }

Asymptotic Approximation:

I'll derive f(n) as an upper bound, by replacing each log k with log n:

f(n) = log n * SUM { k } for k: 1..n

f(n) = log n * 1/2 n (n+1)

f(n) = 1/2 n (n+1) log n

O(f(n)) = O(1/2 n (n+1) log n)

~ O(n^2 log n)

I'll derive g(n) as a lower bound, by dropping the factor k (since k log k >= log k for k >= 1):

g(n) = SUM { log k } for k: 1..n

g(n) = log(n!)

By Stirling's approximation, log(n!) = n log n - n + O(log n), so:

Ω(g(n)) = Ω(n log n - n) ~ Ω(n log n)

So, we have:

Ω(n log n) < Θ(log(H(n))) < O(n^2 log n)

Example (natural logs):

n = 100; Ω(460.5) < Θ(20,756.7) < O(46,051.7)

n = 1000; Ω(6.9 × 10^3) < Θ(3.2 × 10^6) < O(6.9 × 10^6)

Note: f(n) and g(n) are only asymptotic approximations of the bounds; they are not exact.

Upvotes: 1

Ryan Culpepper

Reputation: 10663

It's Theta(N^2 log N).

It's obviously O(N^2 log N).

To show that it's Omega(N^2 log N), consider only the big half of the sequence, where each value of k is at least N/2. For simplicity, assume that N is even.

Sum[k=1..N](k log k) >= Sum[k=N/2..N](k log k)           ; drop the small half
                     >= Sum[k=N/2..N]((N/2) log (N/2))   ; since each k >= N/2
                     >= N/2 * N/2 * log (N/2)
                      = N^2/4 * (log N - log 2)
                      = N^2/4 * log N - (log 2 / 4) * N^2   ; the N^2 term is lower order
                      ∈ Omega(N^2 log N)

Upvotes: 0

Alexey Frunze

Reputation: 62106

It's about 0.5 * n^2 * log2(n):

#include <stdio.h>
#include <math.h>

#define MAX 100
double sum[1 + MAX];

int main(void)
{
  int i;

  sum[0] = 0;

  for (i = 1; i <= MAX; i++)
  {
    /* exact running total of i * ceil(log2(i)) vs. the 0.5 * i^2 * log2(i) estimate */
    sum[i] = sum[i - 1] + i * ceil(log2(i));
    printf("%i %.0f %.0f\n", i, sum[i], ceil(log2(i) * i * i / 2));
  }

  return 0;
}

Output (ideone):

1 0 0
2 2 2
3 8 8
4 16 16
5 31 30
6 49 47
7 70 69
8 94 96
9 130 129
10 170 167
...
90 25871 26293
91 26508 26946
92 27152 27608
93 27803 28279
94 28461 28959
95 29126 29647
96 29798 30344
97 30477 31050
98 31163 31764
99 31856 32488
100 32556 33220

Upvotes: 1

Daniel Williams

Reputation: 8895

N^2 log(N)

You are doing an N log N operation N times, so N times N log N gives N^2 log N.

Oh, after your edit it's quite different. It's hard to evaluate that summation exactly, but its upper bound (which is all big O gives you) is still N^2 log(N). You may be able to derive a tighter bound, but I think that is a viable answer.

See https://math.stackexchange.com/questions/135787/asymptotic-formula-for-the-logarithm-of-the-hyperfactorial for a much more exact solution.

The much more exact solution is still bounded above by N^2 log N (by quite a bit), so I think it remains a safe upper bound.

Upvotes: 2
