Reputation: 93
I am working on a research project for a class in which I tested bubble sort, insertion sort, and quick sort on arrays of random numbers. The results show that insertion sort is quicker than bubble sort, and that quick sort is the slowest of the three.
So that is the ranking I get in terms of time.
Taking into account that insertion sort and bubble sort have a complexity of O(n^2) while quick sort is O(n log n), the O(n log n) algorithm should be faster!
Could anybody share an explanation with me?
Thanks
- (NSMutableArray *)quickSort:(NSMutableArray *)a
{
    // Log the contents of the incoming array
    NSLog(@"%@", a);
    // Create two temporary storage lists
    NSMutableArray *listOne = [[[NSMutableArray alloc]
                                initWithCapacity:[a count]] autorelease];
    NSMutableArray *listTwo = [[[NSMutableArray alloc]
                                initWithCapacity:[a count]] autorelease];
    int pivot = 4;
    // Divide the incoming array at the pivot
    for (int i = 0; i < [a count]; i++)
    {
        if ([[a objectAtIndex:i] intValue] < pivot)
        {
            [listOne addObject:[a objectAtIndex:i]];
        }
        else if ([[a objectAtIndex:i] intValue] > pivot)
        {
            [listTwo addObject:[a objectAtIndex:i]];
        }
    }
    // Sort each of the lesser and greater lists using a bubble sort
    listOne = [self bubbleSort:listOne];
    listTwo = [self bubbleSort:listTwo];
    // Merge pivot onto lesser list
    [listOne addObject:[NSNumber numberWithInt:pivot]];
    // Merge greater list onto lesser list
    for (int i = 0; i < [listTwo count]; i++)
    {
        [listOne addObject:[listTwo objectAtIndex:i]];
    }
    // Log the contents of the outgoing array
    NSLog(@"%@", listOne);
    // Return array
    return listOne;
}
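(To show what I mean by "the test", here is roughly the kind of timing harness I have in mind. This sketch is illustrative only, not my exact test code: the array size, the arc4random_uniform input generation, and the other sort method names are placeholders.)

#include <stdlib.h>   // for arc4random_uniform

- (void)timeSortWithCount:(int)count
{
    // Build an array of random NSNumbers as the test input
    NSMutableArray *input = [NSMutableArray arrayWithCapacity:count];
    for (int i = 0; i < count; i++)
    {
        [input addObject:[NSNumber numberWithInt:(int)arc4random_uniform(10000)]];
    }
    // Time one sort; swap in bubbleSort: or insertionSort: to compare
    CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
    [self quickSort:input];
    CFAbsoluteTime elapsed = CFAbsoluteTimeGetCurrent() - start;
    NSLog(@"quickSort of %d elements took %f seconds", count, elapsed);
}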
Upvotes: 1
Views: 9624
Reputation: 178431
O(n log n) is "faster" than O(n^2), but let's recall what the big O notation means. It means that if algorithm A has complexity O(n log n), there are some constants N_1 and c1 such that for each n > N_1 the algorithm is "faster" than c1 * n * log(n). If algorithm B is O(n^2), there are some constants N_2 and c2 such that the algorithm is "faster" than c2 * n^2 for n > N_2.

However - what happens before this N? And what are these constants c1 and c2? We do not know. Algorithm B could still be "faster" than algorithm A for small inputs, but for large inputs - indeed, A will be faster (the asymptotic bound is better).

For example, let's take the case where algorithm A runs in T_1(n) = 10 * n * log(n) operations, while algorithm B runs in T_2(n) = n^2. For n = 3 we get T_1(3) = 10 * 3 * 2 = 60 (we took the ceiling of log(n)), while T_2(3) = 9 - thus algorithm B, though O(n^2), is faster than A for this input.
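To make those made-up numbers concrete, here is a small sketch that just prints T_1(n) = 10 * n * log2(n) and T_2(n) = n^2 for growing n (the constant 10 is the arbitrary value from the example above, not a measurement, and the exact log2 is used rather than the ceiling from the hand calculation):

#import <Foundation/Foundation.h>
#import <math.h>

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    for (int n = 2; n <= 4096; n *= 2)
    {
        // Hypothetical operation counts for algorithms A and B
        double t1 = 10.0 * n * log2(n);
        double t2 = (double)n * n;
        NSLog(@"n=%4d  T_1=%10.0f  T_2=%10.0f  -> %@ is cheaper",
              n, t1, t2, t1 < t2 ? @"A" : @"B");
    }
    [pool drain];
    return 0;
}

For this particular pair of cost functions B is cheaper up to roughly n = 58, and A is cheaper from there on.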
Regarding quick sort and insertion sort:
Quick sort is usually extremely fast, and degrades to quadratic time only in rare cases (the probability of this happening is slim if we choose a random element as the pivot).
However, the constant behind the big O notation in quicksort is greater than in insertion sort. Thus, a possible optimization is: use quicksort until you get down to a certain threshold (say 30 elements), and then sort that subarray with insertion sort rather than quicksort, as sketched below. This post discusses this optimization.
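A rough sketch of that hybrid (the 30-element threshold is the arbitrary value mentioned above, and the method names insertionSort:from:to: and hybridQuickSort:from:to: are just illustrative):

static const int kThreshold = 30;

- (void)insertionSort:(NSMutableArray *)a from:(int)lo to:(int)hi
{
    // Standard insertion sort restricted to the range [lo, hi]
    for (int i = lo + 1; i <= hi; i++)
    {
        int j = i;
        while (j > lo &&
               [[a objectAtIndex:j - 1] intValue] > [[a objectAtIndex:j] intValue])
        {
            [a exchangeObjectAtIndex:j - 1 withObjectAtIndex:j];
            j--;
        }
    }
}

- (void)hybridQuickSort:(NSMutableArray *)a from:(int)lo to:(int)hi
{
    // Small ranges: insertion sort's lower constant factor wins
    if (hi - lo + 1 <= kThreshold)
    {
        [self insertionSort:a from:lo to:hi];
        return;
    }
    // Hoare-style partition around the middle element's value
    int pivot = [[a objectAtIndex:lo + (hi - lo) / 2] intValue];
    int i = lo, j = hi;
    while (i <= j)
    {
        while ([[a objectAtIndex:i] intValue] < pivot) i++;
        while ([[a objectAtIndex:j] intValue] > pivot) j--;
        if (i <= j)
        {
            [a exchangeObjectAtIndex:i withObjectAtIndex:j];
            i++;
            j--;
        }
    }
    // Recurse into the two (strictly smaller) halves
    [self hybridQuickSort:a from:lo to:j];
    [self hybridQuickSort:a from:i to:hi];
}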
Bubble sort is (empirically) terrible for random arrays, but can be good if the array is almost sorted and the "out of place" elements are near its beginning; a sketch of that adaptive variant follows.
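(For completeness, a minimal sketch of bubble sort with the usual "stop when no swaps happened" check, which is what makes it cheap on an almost-sorted input; the method name is illustrative.)

- (NSMutableArray *)bubbleSortWithEarlyExit:(NSMutableArray *)a
{
    int n = (int)[a count];
    BOOL swapped = YES;
    while (swapped)
    {
        swapped = NO;
        for (int i = 0; i + 1 < n; i++)
        {
            if ([[a objectAtIndex:i] intValue] > [[a objectAtIndex:i + 1] intValue])
            {
                [a exchangeObjectAtIndex:i withObjectAtIndex:i + 1];
                swapped = YES;
            }
        }
        // After each pass the largest remaining element is in its final place
        n--;
    }
    return a;
}

On an already sorted array this does a single pass and exits, which is why bubble sort can look good in that specific case.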
Upvotes: 2
Reputation: 690
It depends on the array size. On small arrays, simple algorithms (such as insertion sort) do pretty well; there is no need for anything more sophisticated.
However, when n is large (say n = 10000000), quicksort will generally do much, much better than insertion (or bubble) sort.
Upvotes: 2
Reputation: 117370
Well, quicksort depends heavily on the input. You need to SHUFFLE the input before running quicksort (see the sketch below). If your input is already sorted, quicksort can have a complexity of O(n^2).
Insertion sort can also be faster for small arrays.
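(A minimal sketch of that shuffle, using the standard Fisher-Yates algorithm; shuffle: is just an illustrative method name.)

#include <stdlib.h>   // for arc4random_uniform

- (void)shuffle:(NSMutableArray *)a
{
    // Fisher-Yates: walk from the end, swapping each slot with a random
    // earlier (or same) slot, so every permutation is equally likely
    for (NSUInteger i = [a count]; i > 1; i--)
    {
        NSUInteger j = arc4random_uniform((u_int32_t)i);
        [a exchangeObjectAtIndex:i - 1 withObjectAtIndex:j];
    }
}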
Upvotes: 3