Reputation: 52
Here I have two insertion sort algorithms, an iterative form and a recursive form, and I am having trouble working out the big O of each. Am I wrong to say that both the iterative form and the recursive form are O(n^2)? If I am wrong, what are they and why? How did you come to that answer?
// start, finish and count are (presumably) fields of the enclosing class,
// used only for timing and for counting outer-loop iterations.
public void iterativeSort(int[] list) {
    start = System.currentTimeMillis();
    for (int i = 1; i < list.length; i++) {
        count++;
        int temp = list[i];
        int j;
        // Shift larger elements one slot to the right until temp's position is found.
        for (j = i - 1; j >= 0 && temp < list[j]; j--) {
            list[j + 1] = list[j];
        }
        list[j + 1] = temp;
        finish += System.currentTimeMillis() - start;
    }
}
// Uses the same start/finish/count fields; this version updates the timers at
// the top of every recursive call.
public static void recursiveSort(int array[], int n, int j) {
    finish += System.currentTimeMillis() - start;
    start = System.currentTimeMillis();
    if (j < n) {
        int i;
        count++;
        int temp = array[j];
        // Same shifting step as the iterative version, for the element at index j.
        for (i = j; i > 0 && array[i - 1] > temp; i--) {
            array[i] = array[i - 1];
        }
        array[i] = temp;
        recursiveSort(array, n, j + 1);
    }
}
Upvotes: 1
Views: 2327
Reputation: 4031
Yes, you are right that both implementations take O(n^2) time. You cannot reduce the running time of an algorithm simply by switching from a recursive to an iterative implementation or vice versa. The same does not hold for space usage, though.
How can you determine that the running time is O(n^2)? It is easier and more obvious for the iterative solution. In general, when you have nested for-loops without early exits and each loop runs over a linear number of elements, the running time is quadratic. Let's analyze it further. How many times will the condition in for (int i = 1; i < list.length; i++) evaluate to true? The answer is n - 1, since you go from the second element to the end. For example, if n = 5 the condition is true for i = 1, 2, 3, 4 (due to 0-based indexing), which is exactly n - 1 = 4 times. Now, how many times will the inner loop condition evaluate to true? On the first outer iteration at most once (in the worst case), because i = 1, so j starts at 0 and after one step j becomes -1, which breaks the condition. On the second outer iteration at most twice, on the third at most three times, and so on, up to n - 1 times on the last. So in the worst case we have the sum 1 + 2 + 3 + ... + (n - 1), which you can easily prove equals n(n - 1)/2. Since you drop constants and lower-order terms in big-O, the running time is O(n^2).
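If you want to convince yourself empirically, here is a minimal sketch (my own test harness, not part of your code) that counts the shifts performed by the iterative version on a reverse-sorted array, the worst case, and compares the total to n(n - 1)/2:

public class InsertionSortCount {
    public static void main(String[] args) {
        int n = 5;
        int[] list = new int[n];
        for (int i = 0; i < n; i++) {
            list[i] = n - i;              // {5, 4, 3, 2, 1}: worst-case input
        }

        long shifts = 0;
        for (int i = 1; i < list.length; i++) {
            int temp = list[i];
            int j;
            for (j = i - 1; j >= 0 && temp < list[j]; j--) {
                list[j + 1] = list[j];
                shifts++;                 // one shift per true inner-loop check
            }
            list[j + 1] = temp;
        }

        // Prints "shifts = 10, n(n-1)/2 = 10" for n = 5
        System.out.println("shifts = " + shifts + ", n(n-1)/2 = " + (n * (n - 1) / 2));
    }
}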
Now, the analysis of the second implementation may appear a bit more intricate because of the recursion, but it is actually not very different. The logic of the inner loop, for (i = j; i > 0 && array[i - 1] > temp; i--), is essentially the same: it runs at most once when j = 1, at most twice when j = 2, and so on. How many times will the method call itself recursively? Again n - 1 times: the first call has j = 1, so j < n (assuming n > 1), which then calls recursiveSort(array, n, j + 1). Now j = 2 is again smaller than n, and we keep recursing until j == n, so the body runs exactly n - 1 times. Since the inner loop sits inside these n - 1 calls, we get the same total number of iterations, namely 1 + 2 + 3 + ... + (n - 1), resulting in O(n^2) again.
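If you prefer to see it as a recurrence (T(j) is just notation I am introducing here for the worst-case cost of recursiveSort(array, n, j); it does not appear in your code):

T(j) = (at most j shifts) + T(j + 1), with T(n) = 0,

which unrolls to T(1) = 1 + 2 + ... + (n - 1) = n(n - 1)/2, i.e. O(n^2) in the worst case.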
So we have informally shown that the two algorithms have the same asymptotic running time. Can we consider them equivalent, then? No. Each recursive call reserves additional space on the call stack, so the recursive solution uses O(n) extra space, whereas the iterative one uses O(1). In that sense the iterative solution is better, which is usually the case, although a recursive version can sometimes be more readable (which is not the case here).
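A quick way to see the O(n) space in practice: the sketch below is my own stripped-down copy of your recursive sort (without the timing fields) with a depth counter added. The maximum depth grows linearly with n, and for a large enough array the unmodified recursive version would throw a StackOverflowError.

public class RecursionDepthDemo {
    static int depth = 0;
    static int maxDepth = 0;

    static void recursiveSort(int[] array, int n, int j) {
        depth++;
        maxDepth = Math.max(maxDepth, depth);
        if (j < n) {
            int temp = array[j];
            int i;
            for (i = j; i > 0 && array[i - 1] > temp; i--) {
                array[i] = array[i - 1];
            }
            array[i] = temp;
            recursiveSort(array, n, j + 1);   // not a tail call the JVM will optimize away
        }
        depth--;
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] array = new int[n];
        for (int k = 0; k < n; k++) {
            array[k] = n - k;
        }
        recursiveSort(array, n, 1);
        // Prints 1000 here: the recursion goes one level deeper per element,
        // i.e. O(n) stack frames. With n in the hundreds of thousands you would
        // instead hit a StackOverflowError with the default stack size.
        System.out.println("max recursion depth = " + maxDepth);
    }
}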
Upvotes: 4