Reputation: 143
I have just started learning and practicing algorithms and I read in a book that log(n!) has a faster growth rate than log(n).
log(n!) = log(n*(n-1)*...*1)
log(n!) = log(n) + log(n-1) + ... + log(1) <--- (i)
Taking the largest term from (i), which is log(n), I concluded that log(n) should have the same growth rate as log(n!).
Then how is it that log(n!) grows faster than log(n)?
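(A quick numeric sketch of the claim, assuming Python: the sum in (i) has n terms, so keeping only the single largest term, log(n), throws away almost all of them.)

```python
import math

# Compare log(n) with log(n!) = log(1) + log(2) + ... + log(n).
# The sum has n terms, so it pulls far ahead of its largest term log(n).
for n in [10, 100, 1000]:
    log_n = math.log(n)
    log_fact = sum(math.log(k) for k in range(1, n + 1))
    print(n, round(log_n, 1), round(log_fact, 1))
```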
Upvotes: 0
Views: 1107
Reputation: 14861
Let's try a different approach.
1.
n! = n*(n-1)*(n-2)*...*1
   < n*n*n*...*n
n! < n^n
log(n!) < log(n^n)
log(n!) < n*log(n)
** log(n!) = O(n*log(n))
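A quick numeric sanity check of that upper bound (a sketch, assuming Python; `math.lgamma(n + 1)` computes log(n!) without overflowing):

```python
import math

# Verify log(n!) <= n*log(n) for a few sizes.
# math.lgamma(n + 1) returns log(n!) = log(Gamma(n + 1)).
for n in [10, 1000, 10**6]:
    assert math.lgamma(n + 1) <= n * math.log(n)
```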
2.
n! = n*(n-1)*(n-2)*...*(n/2)*(n/2-1)*...*1
   > (n/2)*(n/2)*(n/2)*...*(n/2)    <--- n/2 factors
   = (n/2)^(n/2)
This is true because:
n > n/2
n-1 > n/2
n-2 > n/2
.
.
n/2 == n/2
So now we know: log(n!) > log((n/2)^(n/2))
log(n!) > (n/2)*log(n/2)
Since (n/2)*log(n/2) = (1/2)*n*log(n) - (n/2)*log(2), this gives:
*** log(n!) = Ω(n*log(n))
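The lower bound can be checked the same way (a sketch, assuming Python, again using `math.lgamma(n + 1)` for log(n!)):

```python
import math

# Verify log(n!) >= (n/2)*log(n/2) for a few sizes.
for n in [10, 1000, 10**6]:
    assert math.lgamma(n + 1) >= (n / 2) * math.log(n / 2)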
From ** and *** we get:
log(n!) = Θ(n*log(n))
Hence log(n) = O(log(n!)); in fact log(n!) grows strictly faster than log(n).
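To see the Θ(n*log(n)) behavior numerically, the ratio log(n!) / (n*log(n)) stays bounded and creeps toward 1 as n grows (a sketch, assuming Python):

```python
import math

# The two bounds pin log(n!) between (n/2)*log(n/2) and n*log(n),
# so the ratio log(n!) / (n*log(n)) is bounded; it slowly approaches 1.
for n in [10, 100, 10**4, 10**6]:
    ratio = math.lgamma(n + 1) / (n * math.log(n))
    print(n, round(ratio, 3))
```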
Upvotes: 3