Reputation: 37
My textbook gives an algorithm that checks whether there is a duplicate in an array; as soon as a duplicate is found, the method ends (this is the naive approach).
I am asked to find the Big Ω complexity in the worst case.
In this kind of example, the worst case occurs when the only duplicates sit side by side at the end of the array, or when there is no duplicate at all.
What confuses me most is the run time of each line.
For example, in the code I posted, the first for loop runs (n-1) times, where we have n = 9 inputs.
How many times, in terms of n, would the second for loop run in the worst case? It checks 36 times in this case.
I know that the worst case in Big O complexity would be (n^2). I know Big O means that f(n) must at most reach (<=) g(n). I know that Big Ω means that f(n) must at least reach (>=) g(n).
public class test2 {
    public static void main(String[] args) {
        int[] arrayint = new int[]{5, 2, 3, 7, 1, 9, 6, 4, 4};
        // Compare every pair (i, j) with i < j; stop at the first duplicate found.
        for (int i = 0; i < arrayint.length - 1; i++) {
            for (int j = i + 1; j < arrayint.length; j++) {
                if (arrayint[i] == arrayint[j]) {
                    System.out.println("duplicate found at " + i + " and " + j);
                    System.exit(0);
                }
            }
        }
    }
}
Upvotes: 1
Views: 399
Reputation: 134035
When i=0, j varies from 1 to 8, so the inner loop executes 8 times.
When i=1, the inner loop executes 7 times (j = 2 to 8).
In general, when i=x, the inner loop executes n-x-1 times.
If you sum all the iterations, you get (n*(n-1))/2 iterations. So for 9 items, that's (9*(9-1))/2, or 36 comparisons.
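If you want to sanity-check that count, you can run the same pair of loops but count the comparisons instead of exiting early. This is just a sketch of the idea; the class name, the counter, and the hard-coded n = 9 are my own additions for illustration.

public class CountComparisons {
    public static void main(String[] args) {
        // Worst case: 9 distinct values, so the loops never exit early.
        int n = 9;
        int comparisons = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = i + 1; j < n; j++) {
                comparisons++; // one comparison per (i, j) pair
            }
        }
        // Prints "36 36": the count matches (n * (n - 1)) / 2.
        System.out.println(comparisons + " " + (n * (n - 1)) / 2);
    }
}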
Upvotes: 0
Reputation: 178471
I know that the worst case in Big O complexity would be (n^2). I know Big O means that f(n) must at most reach (<=) g(n). I know that Big Ω means that f(n) must at least reach (>=) g(n).
Actually, big O means f(n) (for large enough inputs) must not exceed c1 * g(n), for some constant c1.
Similarly, big Omega (Ω) means f(n) must be at least c2 * g(n), for some constant c2 (again, for large enough inputs).
This means it's fine to have the same g(n) for both big O and big Omega (Ω), because you can have:
c2 * g(n) <= f(n) <= c1 * g(n)
When this happens, we say f(n) is in Ө(g(n)), which means the asymptotic bound is tight. You can find more information in this thread about big Theta (Ө).
In your case, the worst case is when there are no duplicates in the array. The algorithm is then both O(n^2) and Ω(n^2), which gives a tight bound of Ө(n^2).
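To see why the bound is tight here, take the exact comparison count from the other answer, T(n) = (n*(n-1))/2, and pick c2 = 1/4 and c1 = 1/2 (these particular constants are just one convenient choice for the illustration):

(1/4) * n^2 <= (n*(n-1))/2 <= (1/2) * n^2, for every n >= 2

The left inequality holds because n^2 >= 2n once n >= 2, and the right one because n - 1 < n. So in the no-duplicate case the running time satisfies both the big O and the big Omega (Ω) definitions with g(n) = n^2, which is exactly what Ө(n^2) means.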
As a side note, this problem is called the element distinctness problem, which is interesting because the problem itself (not the algorithm) has several asymptotic bounds, based on the computation model we use.
Upvotes: 2