GBa

Reputation: 18447

What sort algorithm provides the best worst-case performance?

What is the fastest known sort algorithm for absolute worst case? I don't care about best case and am assuming a gigantic data set if that even matters.

Upvotes: 4

Views: 39280

Answers (16)

TStamper

Reputation: 30374

It depends on the size of the input; compare the algorithms by their Big-O complexities.

Here is a list of sorting algorithms with their best and worst cases for you to compare. My preference is the two-way merge sort.

Upvotes: 1

mkoryak

Reputation: 57988

Make sure you have seen this:

visualizing sort algorithms - it helped me decide which sorting algorithm to use.

Upvotes: 17

Captain Segfault

Reputation: 1726

If you have a gigantic data set (i.e. much larger than available memory), you likely have your data on disk/tape/something-with-expensive-random-access, so you need an external sort.

Merge sort works well in that case; unlike most other sorts it doesn't involve random reads/writes.
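To make that concrete, here is a minimal external merge sort sketch in Python. The function name external_sort, the one-integer-per-line file format, and the chunk size are illustrative assumptions, not a reference implementation:

    import heapq
    import os
    import tempfile

    def external_sort(in_path, out_path, chunk_size=100_000):
        # Pass 1: sort fixed-size chunks in memory, spill each sorted
        # run to a temporary file. All I/O is sequential.
        run_paths = []
        with open(in_path) as src:
            while True:
                chunk = [int(line) for _, line in zip(range(chunk_size), src)]
                if not chunk:
                    break
                chunk.sort()
                fd, path = tempfile.mkstemp(text=True)
                with os.fdopen(fd, "w") as run:
                    run.writelines(f"{x}\n" for x in chunk)
                run_paths.append(path)
        # Pass 2: k-way merge of the runs. heapq.merge streams them,
        # holding only one value per run in memory at a time.
        runs = [open(p) for p in run_paths]
        try:
            with open(out_path, "w") as out:
                merged = heapq.merge(*(map(int, r) for r in runs))
                out.writelines(f"{x}\n" for x in merged)
        finally:
            for r in runs:
                r.close()
            for p in run_paths:
                os.remove(p)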

Upvotes: 2

vartec

Reputation: 134701

It depends on the data. For example, for integers (or anything that can be expressed as an integer) the fastest is radix sort, which for fixed-length values has a worst-case complexity of O(n). The best general comparison-based sorting algorithms have a complexity of O(n log n).
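For illustration, a minimal LSD radix sort sketch in Python for non-negative integers (the function name and the choice of base are arbitrary):

    # Each of the k digit passes is a stable bucketing pass over all n
    # keys, so fixed-width keys sort in O(k*n), whatever the input order.
    def radix_sort(nums, base=10):
        if not nums:
            return nums
        largest = max(nums)
        place = 1
        while place <= largest:
            buckets = [[] for _ in range(base)]
            for x in nums:
                buckets[(x // place) % base].append(x)  # stable per digit
            nums = [x for bucket in buckets for x in bucket]
            place *= base
        return nums

    print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))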

Upvotes: 9

ShuggyCoUk

Reputation: 36476

For the man with a limitless budget

Facetious but correct: Sorting networks trade space (in real hardware terms) for better than O(n log n) sorting!

Without resorting to such hardware (which is unlikely to be available), you have a lower bound of O(n log n) for the best comparison sorts.
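As a toy illustration of the idea, here is a fixed 4-input sorting network in software form; the layout is the standard 5-comparator network for n = 4:

    # Run sequentially in software a network is nothing special, but in
    # hardware the comparators on the same layer operate in parallel: a
    # Batcher-style network sorts in O(log^2 n) parallel steps using
    # O(n log^2 n) comparators, trading space for time.
    COMPARATORS_4 = [(0, 1), (2, 3),   # layer 1: independent comparators
                     (0, 2), (1, 3),   # layer 2
                     (1, 2)]           # layer 3

    def network_sort4(items):
        a = list(items)
        for i, j in COMPARATORS_4:
            if a[i] > a[j]:            # compare-exchange
                a[i], a[j] = a[j], a[i]
        return a

    print(network_sort4([3, 1, 4, 2]))  # [1, 2, 3, 4]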

O(n log n) worst-case performance (no particular order)

Merge sort and heapsort are the standard examples here.

Beating the n log n

If your data is amenable to it, you can beat the n log n restriction, but the cost then depends on the number of bits in the input data as well.

Radix sort and bucket sort are probably the best-known examples of this. Without more information about your particular requirements, it is not fruitful to consider them in more depth.
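Still, for a flavor of the bucket approach, here is a minimal bucket sort sketch, under the assumption that the keys are floats uniformly distributed in [0, 1):

    # Uniform input gives O(n) expected time; a skewed distribution
    # pushes the cost onto the per-bucket sorts.
    def bucket_sort(nums):
        n = len(nums)
        if n == 0:
            return nums
        buckets = [[] for _ in range(n)]
        for x in nums:
            buckets[int(x * n)].append(x)  # assumes 0 <= x < 1
        for bucket in buckets:
            bucket.sort()                  # buckets stay small on average
        return [x for bucket in buckets for x in bucket]

    print(bucket_sort([0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21, 0.12]))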

Upvotes: 6

Mark Ransom

Reputation: 308520

On the importance of specifying your problem: radix sort might be the fastest, but it's only usable when your data has fixed-length keys that can be broken down into independent small pieces. That limits its usefulness in the general case, and explains why more people haven't heard of it.

http://en.wikipedia.org/wiki/Radix_sort

P.S. This is an O(k*n) algorithm, where k is the size of the key.

Upvotes: 0

Rick Copeland

Reputation: 11922

If you are using binary comparisons, the best possible sort algorithm takes O(N log N) comparisons to complete. If you're looking for something with good worst case performance, I'd look at MergeSort and HeapSort since they are O(N log N) algorithms in all cases.

HeapSort is nice if all your data fits in memory, while MergeSort allows you to do on-disk sorts better (but takes more space overall).
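For instance, a minimal heapsort sketch in Python built on the standard library's heap (not in-place, unlike a textbook heapsort, but O(N log N) in the worst case):

    import heapq

    def heap_sort(nums):
        heap = list(nums)
        heapq.heapify(heap)   # O(n) bottom-up heap construction
        # n pops at O(log n) each: O(n log n) on every input.
        return [heapq.heappop(heap) for _ in range(len(heap))]

    print(heap_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]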

There are other less-well-known algorithms mentioned on the Wikipedia sorting algorithm page that all have O(n log n) worst case performance. (based on comment from mmyers)

Upvotes: 7

On Turing machines, the lowest upper bound is achieved by merge sort, i.e. O(n log n), though quicksort might be better on some data sets.

You can't go below O(n log n) unless you're using special hardware (e.g. hardware-supported bead sort) or other non-comparison sorts.
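For reference, a minimal top-down merge sort sketch: O(n log n) comparisons on every input, at the cost of O(n) auxiliary space:

    def merge_sort(a):
        if len(a) <= 1:
            return list(a)
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        # Merge the two sorted halves into one sorted list.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:        # <= keeps equal keys stable
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    print(merge_sort([38, 27, 43, 3, 9, 82, 10]))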

Upvotes: 0

acrosman

Reputation: 12900

It depends both on the type of data and the type of resources. For example, there are parallel algorithms that beat Quicksort, but given how you asked the question it's unlikely you have access to them. There are times when the "worst case" for one algorithm is the "best case" for another (nearly sorted data is problematic with Quick and Merge, but fast with much simpler techniques).

Upvotes: 1

Vatine

Reputation: 21288

If you have a sufficiently huge data set, you're probably looking at sorting individual bins of data, then using merge-sort to merge those bins. But at this point, we're talking data sets huge enough to be VASTLY larger than main memory.

I guess the most correct answer would be "it depends".

Upvotes: 1

Zifre

Reputation: 27008

Quicksort is usually the fastest, but if you want a good worst case, try heapsort or merge sort. Both have O(n log n) worst-case time.

Upvotes: 3

Adam Robinson

Reputation: 185703

I've always preferred merge sort, as it's stable (meaning that if two elements are equal from a sorting perspective, then their relative order is explicitly preserved), but quicksort is good as well.

Upvotes: 0

Alex Fort

Reputation: 18819

It all depends on the data you're trying to sort. Different algorithms have different speeds for different data: an O(n) algorithm may be slower than an O(n^2) algorithm, depending on constant factors and the kind of data you're working with.

Upvotes: 0

Paul Tomblin

Reputation: 182870

See Quick Sort Vs Merge Sort for a comparison of Quicksort and Mergesort, which are two of the better algorithms in most cases.

Upvotes: 0

TheTXI

Reputation: 37905

It largely depends on the size of your data set and whether or not the set is already ordered (or what order it is currently in).

Entire books are written on search/sort algorithms. You aren't going to find an "absolute fastest" assuming a worst case scenario because different sorts have different worst-case situations.

Upvotes: 1

user82238

Reputation:

Assuming randomly ordered data: quicksort.

O(n log n) in the average case, O(n^2) in the worst case, but the worst case requires highly non-random data.
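For reference, a minimal randomized quicksort sketch; picking the pivot at random makes the O(n^2) case vanishingly unlikely on any fixed input (this list-building version trades the usual in-place partition for clarity):

    import random

    def quick_sort(a):
        if len(a) <= 1:
            return list(a)
        pivot = random.choice(a)   # random pivot defuses adversarial input
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        return quick_sort(less) + equal + quick_sort(greater)

    print(quick_sort([3, 6, 1, 8, 2, 9]))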

You might want to describe your data set characteristics.

Upvotes: 0
