Reputation: 41
So far I have the O(n^2) solution below. Can anyone help me improve the efficiency to O(n log n) or better?
bool checkDuplicates(int array[], int n)
{
    for (int i = 0; i < n; i++)
    {
        for (int j = i + 1; j < n; j++)
        {
            if (array[i] == array[j])
                return true;
        }
    }
    return false;
}
Upvotes: 3
Views: 3252
Reputation: 5880
You can use sorting, a set, or an unordered map to achieve this. If you want O(n) average complexity, prefer the unordered map.
Using sorting
// Checking duplicates with sorting, complexity: O(n log n)
#include <algorithm>

bool checkDuplicatesWithSorting(int array[], int n)
{
    std::sort(array, array + n);        // note: reorders the input array
    for (int i = 1; i < n; i++)
    {
        if (array[i] == array[i - 1])   // equal neighbours after sorting
            return true;
    }
    return false;
}
Using Set
// Checking duplicates with a set, complexity: O(n log n)
#include <set>

bool checkDuplicatesWithSet(int array[], int n)
{
    std::set<int> s(array, array + n);  // duplicates collapse on insertion
    return (int)s.size() != n;
}
Using Unordered Map
// Checking duplicates with an unordered map, complexity: O(n) average case
#include <unordered_map>

bool checkDuplicatesWithHashMap(int array[], int n)
{
    std::unordered_map<int, bool> uo_map;
    for (int i = 0; i < n; i++)
    {
        if (uo_map.find(array[i]) == uo_map.end())
            uo_map[array[i]] = true;    // first time this value is seen
        else
            return true;                // value seen before -> duplicate
    }
    return false;
}
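For reference, a minimal usage sketch, assuming the three functions above are pasted into one file together with their includes (keep in mind that the sorting version reorders the input):

#include <iostream>

int main()
{
    int array[] = {3, 1, 4, 1, 5};
    int n = sizeof(array) / sizeof(array[0]);

    // Any of the three functions above can be called the same way.
    std::cout << std::boolalpha
              << checkDuplicatesWithHashMap(array, n) << '\n';  // prints true
    return 0;
}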
Upvotes: 0
Reputation: 754
If you use C++, then this is a way:
#include <set>
bool hasDuplicates(int array[], int n)
{
    std::set<int> no_duplicates(array, array + n);
    return no_duplicates.size() != n;
}
Upvotes: 0
Reputation: 2754
Use a sorting algorithm (e.g. mergesort, which is O(n log n)) and then check for consecutive elements that are equal. Or modify the merge step of mergesort to detect equal elements, as sketched below. Also, use size_t for indices, not int.
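This is not the answerer's code, just a minimal sketch of the second idea; the function names are illustrative. The merge step reports a duplicate as soon as the two run fronts compare equal; duplicates that fall inside the same half are caught by the deeper recursive calls.

#include <cstddef>
#include <vector>

static bool mergeDetect(std::vector<int>& a, std::vector<int>& buf,
                        std::size_t lo, std::size_t hi)
{
    if (hi - lo < 2)
        return false;
    std::size_t mid = lo + (hi - lo) / 2;
    if (mergeDetect(a, buf, lo, mid) || mergeDetect(a, buf, mid, hi))
        return true;

    std::size_t i = lo, j = mid, k = lo;
    while (i < mid && j < hi)
    {
        if (a[i] == a[j])                      // equal fronts -> duplicate
            return true;
        buf[k++] = (a[i] < a[j]) ? a[i++] : a[j++];
    }
    while (i < mid) buf[k++] = a[i++];
    while (j < hi)  buf[k++] = a[j++];
    for (std::size_t p = lo; p < hi; p++) a[p] = buf[p];
    return false;
}

bool hasDuplicatesMergeSort(const int array[], std::size_t n)
{
    std::vector<int> a(array, array + n), buf(n);
    return mergeDetect(a, buf, 0, n);
}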
Upvotes: 4
Reputation: 1977
You can quicksort the array (O(n log n)); finding duplicates then becomes linear, since you only need to check consecutive elements. A compact sketch follows.
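Not the poster's code, just a sketch of the same idea using the standard library: std::sort (typically introsort rather than plain quicksort, still O(n log n)) followed by std::adjacent_find for the linear pass over consecutive elements.

#include <algorithm>

// Sorts in place, then one linear scan looks for a pair of equal neighbours.
// Note that the input array is modified.
bool hasDuplicatesSorted(int array[], int n)
{
    std::sort(array, array + n);                              // O(n log n)
    return std::adjacent_find(array, array + n) != array + n; // O(n)
}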
Upvotes: 4
Reputation: 7838
Assuming you can alter array, sort it first (n log n, provided you're using a good algorithm). Next, walk array and compare sequential elements (n).
Upvotes: 0