Reputation: 79
Which is faster: running an algorithm with a running time of O(log n) once on an input of size 10n, or running it ten times on inputs of size n each?
Upvotes: 0
Views: 1678
Reputation:
If you want to check, you can do a limit test (e.g. on Wolfram Alpha):

    lim (log(10n) / log(n)), n -> infinity

If log(10n) were 10 times as expensive as log(n), this ratio should converge to 10. But

    log(10n) / log(n) = log_n(10n) = log_n(10) + 1

and as n goes to infinity, log_n(10) converges to 0 (you can check it by inspecting some log graphs), so the ratio converges to 1. In other words, one run on an input of size 10n costs asymptotically the same as one run on size n, which is far cheaper than ten runs on inputs of size n.
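For a quick numerical check of the same limit, here is a minimal Python sketch (the sample values of n are chosen just for illustration) that prints log(10n)/log(n) for growing n and shows it approaching 1:

    import math

    # Evaluate log(10n) / log(n) for increasingly large n; the ratio tends to 1.
    for n in [10**2, 10**4, 10**8, 10**16]:
        ratio = math.log(10 * n) / math.log(n)
        print(f"n = {n:.0e}: log(10n)/log(n) = {ratio:.4f}")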
Upvotes: 2
Reputation: 4106
O(log 10n), because logarithmic time grows much more slowly than linear time. You can count it yourself, e.g.
for n = 1000 (base-2 logarithms, rounded): log n ≈ 10, 10 log n ≈ 100, log 10n ≈ 13.
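A tiny Python sketch of the same comparison (assuming base-2 logarithms, as in the example above):

    import math

    n = 1000
    once_on_10n = math.log2(10 * n)   # one run on an input of size 10n
    ten_on_n = 10 * math.log2(n)      # ten runs on inputs of size n
    print(once_on_10n, ten_on_n)      # ~13.3 vs ~99.7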
Upvotes: 2