Reputation: 29
We were learning in class that insertion sort has an Ω(n) (linear) best-case run time (if passed an already-sorted array) and O(n^2) in all other cases. Our prof then began going on about the ideal for an insertion sort having a "merge sort" approach, and that the ideal would be to use merge sort and get an O(n log n) run time? He's not a very clear person... at all. Can you explain what he meant, please!
Upvotes: 1
Views: 913
Reputation: 903
Let's see.
Insertion sort has O(n^2) run time
It's clear that insertion sort runs in O(n^2) in the worst case. You can read more about it here.
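That quadratic behaviour is easy to see in a direct implementation. A minimal sketch (Python is just an assumption here, since the question names no language):

```python
def insertion_sort(a):
    """Sort the list a in place.

    Best case O(n) on already-sorted input (the inner loop never runs),
    worst case O(n^2) on reverse-sorted input.
    """
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        # Shift larger elements one slot right until key's position is found.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([6, 5, 4, 3, 2, 1]))  # [1, 2, 3, 4, 5, 6]
```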
Our prof then began trailing on about the ideal for an insertion sort having a "merge sort" approach
I think you meant to say idea and not ideal. Let's look at this idea of insertion sort having a "merge sort" approach. The basics of merge sort come from the divide-and-conquer paradigm. So let's assume that I have this array
6 5 4 3 2 1
An insertion sort with a "merge sort" approach will divide that array into two (or more) regions, based on the divide-and-conquer paradigm.
6 5 4 | 3 2 1
After that, we can apply insertion sort to each region, and then merge (the "conquer" step) to join them.
4 5 6 | 1 2 3
1 2 3 4 5 6
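The two-region walk-through above can be sketched like this (Python is an assumption, and `insertion_sort`/`merge` are hypothetical helper names, not a standard API):

```python
def insertion_sort(a):
    # Plain in-place insertion sort.
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge(left, right):
    # "Conquer" step: join two sorted lists into one sorted list.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]

a = [6, 5, 4, 3, 2, 1]
mid = len(a) // 2
left, right = insertion_sort(a[:mid]), insertion_sort(a[mid:])
print(left, right)         # [4, 5, 6] [1, 2, 3]
print(merge(left, right))  # [1, 2, 3, 4, 5, 6]
```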
Well, now let's take a look at what happens IF we keep dividing all the way down to single elements.
6 | 5 | 4 | 3 | 2 | 1
WAIT A MINUTE, isn't that merge sort? Yep, that's why your professor said
that the ideal would be to use merge sort and have an O (n log n) run time
Actually, some implementations of merge sort switch to insertion sort once a threshold is reached (like subarrays of only 7 elements). You can read about it here.
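A sketch of such a hybrid (the threshold of 7 is just the example figure above; real libraries tune it, and Python is an assumption here):

```python
THRESHOLD = 7  # example cutoff; real implementations tune this empirically

def insertion_sort(a):
    # Low-overhead O(n^2) sort; fast in practice on tiny inputs.
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]

def hybrid_merge_sort(a):
    # Below the threshold, insertion sort's constant factor beats the
    # recursion overhead of merge sort.
    if len(a) <= THRESHOLD:
        return insertion_sort(list(a))
    mid = len(a) // 2
    return merge(hybrid_merge_sort(a[:mid]), hybrid_merge_sort(a[mid:]))
```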
Upvotes: 1
Reputation:
O(N) is considered fast, O(N log N) fair, and O(N²) slow.
For a small number of elements, you may not care. But think for a second about sorting a million elements: the times would be proportional to 1,000,000 (say 1 ms), 20,000,000 (20 ms) and... 1,000,000,000,000 (about 17 minutes).
This is why O(N²) sorting algorithms are usually avoided, knowing that O(N log N) is achievable in all cases, and O(N) for some favorable configurations.
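A quick back-of-the-envelope check of those figures, assuming (purely for illustration) that one basic operation costs 1 nanosecond:

```python
import math

n = 1_000_000
unit_ns = 1  # assumed cost of one basic operation, in nanoseconds

for label, ops in [("O(N)", n),
                   ("O(N log N)", n * math.log2(n)),
                   ("O(N^2)", n ** 2)]:
    seconds = ops * unit_ns / 1e9
    print(f"{label:>10}: ~{seconds:g} s")
```

At one operation per nanosecond, O(N²) on a million elements works out to about 1000 seconds, i.e. roughly 17 minutes versus 20 ms.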
Upvotes: 1
Reputation: 5403
I possibly have an inkling about what your professor could have meant. For arrays of very small size (or simply to check whether an array is already sorted), it can make sense to run insertion sort, since it consumes no auxiliary space and runs in linear time on sorted input. After that, you would run a merge sort, which averages O(n log n) regardless of the input (unless you are using a Natural Merge Sort, in which case the best case comes down to O(n), and it makes even less sense to have run that insertion sort first), and consumes O(n) auxiliary space.
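One possible reading of that combination, as a rough sketch (Python is an assumption, and the function names are mine, not anything standard):

```python
def is_sorted(a):
    # One linear pass with O(1) extra space -- insertion sort's best case in spirit.
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def merge_sort(a):
    # Plain top-down merge sort: O(n log n) time, O(n) auxiliary space.
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]

def sort(a):
    # Skip merge sort's O(n) auxiliary space when the input is already sorted.
    return list(a) if is_sorted(a) else merge_sort(a)
```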
As good as that is, it may be worth noting that languages like Java use a dual-pivot quicksort algorithm instead of a merge sort for primitive arrays (worst-case quicksort is O(n^2)). I am not entirely sure about this, but it could be attributed to the fact that quicksort uses less auxiliary space, and more so because it's not very common for an array to hit the worst case.
Upvotes: 0