Jenny

Reputation: 123

Big O Notation: differences between O(n^2) and O(n log(n))?

What is the difference between O(n^2) and O(n log(n))?

Upvotes: 3

Views: 5207

Answers (6)

Joe Koberg

Reputation: 26699

n^2 grows in complexity more quickly.

Upvotes: 15

gillonba

Reputation: 917

n log(n) grows significantly slower

Upvotes: 1

acrosman

Reputation: 12900

Algorithms that run in O(n log(n)) time are generally faster than those that run in O(n^2).

Big-O defines an upper bound on performance: as the size of the data set (n) grows, so does the length of time it takes to perform the task. You might be interested in the iTunes U algorithms course from MIT.
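To make the growth rates concrete, here is a small illustrative sketch (not from the answer) that prints abstract operation counts for both classes as n increases:

```python
import math

# Compare how the two growth rates scale as n increases.
# These are abstract operation counts, not measured times.
for n in (10, 100, 1000, 10000):
    n_squared = n ** 2
    n_log_n = n * math.log2(n)
    print(f"n={n:>6}: n^2={n_squared:>12,}  n*log2(n)={n_log_n:>12,.0f}")
```

By n = 10000 the quadratic count is already about three orders of magnitude larger, which is why the asymptotic class matters even before constants enter the picture.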

Upvotes: 1

user59634

Reputation:

"Big Oh" notation gives an estimated upper bound on the growth in the running time of an algorithm. If an algorithm is supposed to be O(n^2), in a naive way, it says that for n=1, it takes a max. time 1 units, for n=2 it takes max. time 4 units and so on. Similarly for O(n log(n)), it says the grown will be such that it obeys the upper bound of O(n log(n)). (If I am more than naive here, please correct me in a comment).

I hope that helps.

Upvotes: 0

Johannes Weiss

Reputation: 54041

Big O calculates an upper limit of running time relative to the size of a data set (n).

An O(n*log(n)) algorithm is not always faster than an O(n^2) algorithm, but when considering the worst case it probably is. An O(n^2) algorithm takes ~4 times longer when you double the working set (worst case); for an O(n*log(n)) algorithm it takes only a bit more than twice as long. The bigger your data set, the bigger the advantage of the O(n*log(n)) algorithm usually is.
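The doubling behaviour described above can be checked with a quick sketch (illustrative only; `ratio_when_doubled` is a helper made up for this example):

```python
import math

def ratio_when_doubled(f, n):
    """How much longer the cost function f takes when the input size doubles."""
    return f(2 * n) / f(n)

n = 1_000_000
# Quadratic: (2n)^2 / n^2 = 4, independent of n.
print(ratio_when_doubled(lambda n: n ** 2, n))           # 4.0 exactly
# Linearithmic: 2 * (log2(n) + 1) / log2(n), a bit over 2.
print(ratio_when_doubled(lambda n: n * math.log2(n), n))
```

For large n the linearithmic ratio approaches 2 from above, so each doubling costs the O(n*log(n)) algorithm roughly half as much extra work as the O(n^2) one.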

EDIT: Thanks to 'harms', I'll correct a wrong statement from my first answer: I said that, considering the worst case, O(n^2) would always be slower than O(n*log(n)). That's wrong, since both notations only hold up to a constant factor!

Sample: Say we have the worst case and our data set has size 100.

  • O(n^2) --> 100*100 = 10000
  • O(n*log(n)) --> 100*2 = 200 (using log_10)

The problem is that both can be multiplied by a constant factor; say we multiply the latter one by a constant c. The result will be:

  • O(n^2) --> 100*100 = 10000
  • O(n*log(n)) --> 100*2*c = 200*c (using log_10)

So for c > 50, the n*log(n) count exceeds the n^2 count at n=100.

I have to update my statement: for every problem, when considering the worst case, an O(n*log(n)) algorithm will be quicker than an O(n^2) algorithm for sufficiently large data sets.

The reason is that the choice of c is arbitrary but constant. If you make the data set large enough, the growth rate dominates the effect of any constant choice of c, and when comparing two algorithms the c for each is fixed!
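The crossover described above can be verified numerically. A sketch, using a hypothetical constant c = 60 (any value above the 50 from the example works the same way):

```python
import math

c = 60  # a constant > 50, so the linearithmic side loses at n = 100

def quadratic(n):
    return n ** 2

def linearithmic(n):
    return c * n * math.log10(n)

# Small n: the constant factor dominates.
print(quadratic(100) < linearithmic(100))       # True: 10_000 < 12_000

# Large n: the growth rate dominates, regardless of c.
print(quadratic(10**6) > linearithmic(10**6))   # True: 10^12 vs 3.6 * 10^8
```

This is exactly the point of the edit: the constant can pick the winner for any fixed n, but asymptotically the O(n*log(n)) algorithm always wins.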

Upvotes: 2

Marek Karbarz

Reputation: 29304

You'll need to be a bit more specific about what you are asking, but in this case O(n log(n)) is faster.

Upvotes: 1
