Reputation: 3490
I just bumped into this question today and was trying to find a solution better than O(n), but could not come up with one.
Searched through SO but couldn't find this question.
Is there any solution better than O(n) or is it a problem that cannot be solved better than that?
My initial thought was binary search, but for that you need to sort the array first, which again costs more than n. I also thought of applying quicksort's partitioning to just the half of the array the search element might belong to, but we still make n comparisons for the initial partition and only discard the other half afterwards. Am I getting this right, or am I looking at the solution from the wrong direction?
I was looking for a solution in C++, not JavaScript's indexOf(), C#'s Array.Find(), or LINQ.
Upvotes: 43
Views: 123014
Reputation: 57
While looking for something faster than a plain linear search, I just stumbled upon the "front-back" linear search method (https://medium.com/@insomniocode/search-algorithm-front-and-back-unsorted-86d7a4bfc258), which I tested on a few instances; it turned out to be measurably faster, though not dramatically so. Worth a try in my opinion!
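A minimal sketch of the front-back idea from the article (the function name is mine): two indices scan from both ends and meet in the middle, so the loop runs about n/2 times, though it still performs up to n comparisons overall:

```cpp
#include <vector>

// Front-back linear search: scan from both ends toward the middle.
// Returns the index of `target`, or -1 if absent. Still O(n) comparisons,
// but only about n/2 loop iterations.
int frontBackSearch(const std::vector<int>& a, int target) {
    int front = 0;
    int back = static_cast<int>(a.size()) - 1;
    while (front <= back) {
        if (a[front] == target) return front;
        if (a[back] == target) return back;
        ++front;
        --back;
    }
    return -1;  // not found
}
```

The speedup in practice comes mostly from halving the loop overhead, not from doing fewer comparisons.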
Upvotes: 0
Reputation: 39
Usually, we check one element of the array per iteration, so it takes n iterations to loop through the whole array; the worst-case time complexity is therefore O(n).
for (int i = 0; i < s; i++) {  // s = array size
    if (arr[i] == n)           // n = element to be searched
        return i;
}
But what I experimented with is checking multiple elements in a single iteration, let's say 5 elements per iteration. In that case the for loop looks like this:
// s = array size
// n = element to be searched
for (int i = 0; i < s; i += 5) {  // notice the increment of i here
    if (arr[i] == n)
        return i;
    // the bounds tests come first so we never read past the end of the array
    else if (i + 1 < s && arr[i + 1] == n)
        return i + 1;
    else if (i + 2 < s && arr[i + 2] == n)
        return i + 2;
    else if (i + 3 < s && arr[i + 3] == n)
        return i + 3;
    else if (i + 4 < s && arr[i + 4] == n)
        return i + 4;
}
Here, in theory, the loop body runs only n/5 times, but the total number of comparisons is still up to n, so the complexity remains O(n); only the loop overhead shrinks.
But when I ran the program with an array of size 1000000 containing the elements 1 to 1000000 arranged randomly, and timed both loops over several test cases of the same array size, these were the results:
[timing screenshots omitted: "One element per iteration" vs. "5 elements per iteration"]
So, as I saw, it makes a significant difference in running time, even though the big-O complexity is unchanged!
Upvotes: 3
Reputation: 522
Given the following array, you can check several elements ahead in each pass (this is loop unrolling rather than true parallelism):
const array = [1, 2, 3, 4, 5, 6, 7, 3];
const search = 3;

// Scan the array in blocks of 8, checking each look-ahead index
// only while it is still in bounds.
outer: for (let i = 0; i < array.length; i += 8) {
  for (let j = i; j < i + 8 && j < array.length; j++) {
    if (array[j] === search) {
      console.log(j);
      break outer;
    }
  }
}
Upvotes: -2
Reputation: 75
This can be tweaked with a small trick. If we simply traverse an unsorted array, the worst case (element present at the last index) costs N comparisons, where N is the size of the array. So, here is the trick: first check the last index, so that the old worst case is handled in O(1), and then loop over the remaining elements. The worst case now takes N-1 loop iterations plus the one up-front check, which is of course still O(N).
int findElement(int N, int arr[], int element){
    if (arr[N - 1] == element) {   // check the last index first
        return N - 1;
    }
    for (int i = 0; i < N - 1; i++) {
        if (arr[i] == element)
            return i;
    }
    return -1;
}
Upvotes: 0
Reputation: 127
If you are not doing a parallel search, you can insert the key at the end of the array as a sentinel value and do the search with only n comparisons instead of 2n, because the bounds check disappears from the loop.
For more details, refer to the following question: What's the point of using linear search with sentinel?
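A minimal sketch of the sentinel idea (the function name is mine; taking the vector by value keeps the sketch side-effect-free, at the cost of a copy):

```cpp
#include <vector>

// Sentinel linear search: place the key at the end so the inner loop
// needs only one comparison (the element test), not two (element + bound).
// Returns the index of `key`, or -1 if absent.
int sentinelSearch(std::vector<int> a, int key) {
    const int n = static_cast<int>(a.size());
    a.push_back(key);          // the sentinel guarantees the loop terminates
    int i = 0;
    while (a[i] != key) ++i;   // no "i < n" test needed inside the loop
    return (i < n) ? i : -1;   // i == n means we only hit the sentinel
}
```

In-place variants overwrite the last element with the key and restore it afterwards, avoiding the extra slot.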
Upvotes: 2
Reputation: 173
What would be the efficiency of an algorithm that reuses the partition step of quicksort, as follows?
1. Randomly select some value (let us call it v) in the list.
2. Partition the entire list into 2 parts: the left part contains all elements that are less than v, the right part contains all elements that are greater than v.
3. Repeat steps 1 and 2 on the relevant part until you determine whether the element exists or not.
I am not sure about the complexity of the above algorithm, but it looks like it will definitely be less than the O(n log n) complexity of the full quick-sort algorithm.
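The steps above can be sketched as follows (a hypothetical helper, not from the answer). Note that the first partition pass alone already touches every element, and the expected total work is n + n/2 + n/4 + ... ≈ 2n, so this is still O(n), the same class as a plain scan:

```cpp
#include <algorithm>
#include <vector>

// Quickselect-style membership test: partition around a pivot, then recurse
// only into the side that could hold `key`. Expected cost is
// n + n/2 + n/4 + ... = O(n) comparisons, so it cannot beat a plain linear
// scan asymptotically, and it also shuffles the array.
bool partitionSearch(std::vector<int>& a, int lo, int hi, int key) {
    if (lo > hi) return false;
    int pivot = a[lo + (hi - lo) / 2];
    if (pivot == key) return true;
    // Three-way split: [less than pivot | equal to pivot | greater than pivot].
    // The middle band is nonempty, so each recursive range strictly shrinks.
    auto first = a.begin() + lo, last = a.begin() + hi + 1;
    auto midLo = std::partition(first, last, [pivot](int x) { return x < pivot; });
    auto midHi = std::partition(midLo, last, [pivot](int x) { return x == pivot; });
    if (key < pivot)
        return partitionSearch(a, lo, static_cast<int>(midLo - a.begin()) - 1, key);
    return partitionSearch(a, static_cast<int>(midHi - a.begin()), hi, key);
}
```

Because of the extra data movement, a plain linear scan is usually preferable in practice.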
Upvotes: -1
Reputation: 29
It is possible to make your program run faster than O(n). You start off by sorting the array using the merge sort algorithm, then you use binary search to find the element. Both algorithms have a running time of O(log_2(n)). Adding these two complexities together, you get 2*log_2(n), which is O(log_2(n)) with the witness C = 2.
Upvotes: -5
Reputation: 1
Yet, there is another idea (assuming even numbers are stored at even addresses):
First check whether the search element is odd or even.
If the search element is even, then perform the search only over the even addresses (make the loop increment skip the odd ones).
For example: you can divide the elements and search the parts in parallel, or use a pivot key to sort the remaining elements, or any other search method.
Upvotes: -1
Reputation: 45
You can search for an element in O(1) average time using this approach.
Just create a map: when you insert a value, set that key's value to 1, and to search, just check whether the key is present. (Note: building the map costs O(n), so this pays off only for repeated queries, and std::map lookups are actually O(log n); use std::unordered_map for O(1) average.)
Below is the code:-
#include <iostream>
#include <unordered_map>
using namespace std;

int main() {
    int n;
    cin >> n;
    unordered_map<int, int> seen;  // hash map: O(1) average lookup
    for (int i = 0; i < n; i++) {
        int k;
        cin >> k;
        seen[k] = 1;
    }
    int num;
    cin >> num;
    if (seen.count(num)) {
        cout << "FOUND" << endl;
    } else {
        cout << "NOT FOUND" << endl;
    }
    return 0;
}
Input:
5          // number of elements
6 4 7 3 2  // elements
3          // number to find
Output: FOUND
Upvotes: 0
Reputation: 35126
Make it parallel. Divide the array into chunks and search them in parallel. The complexity will still be O(n), but the running time will be much lower, roughly inversely proportional to the number of processors you have.
You can use the Parallel Patterns Library in C++.
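A portable sketch of the chunked approach using std::thread instead of PPL (the function name and the default thread count of 4 are my own choices):

```cpp
#include <algorithm>
#include <atomic>
#include <thread>
#include <vector>

// Chunked parallel search: each thread scans one slice of the array and
// records any hit in a shared atomic index. Total work is still O(n),
// but wall-clock time shrinks with the number of cores.
int parallelSearch(const std::vector<int>& a, int key, int numThreads = 4) {
    std::atomic<int> found(-1);
    std::vector<std::thread> workers;
    const size_t chunk = (a.size() + numThreads - 1) / numThreads;
    for (int t = 0; t < numThreads; ++t) {
        workers.emplace_back([&, t] {
            size_t begin = t * chunk;
            size_t end = std::min(a.size(), begin + chunk);
            for (size_t i = begin; i < end && found.load() < 0; ++i)
                if (a[i] == key) { found.store(static_cast<int>(i)); return; }
        });
    }
    for (auto& w : workers) w.join();
    return found.load();  // a matching index, or -1 if absent
}
```

If several copies of the key exist, any one of their indexes may be returned, since the threads race to store first.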
Upvotes: 35
Reputation:
If you're searching for one element once, just iterate through the array; there is no way to do it faster.
If you're searching multiple times, it is worth indexing the array (or sorting it, if you will) so that the subsequent searches are fast, O(log n).
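The search-multiple-times case can be sketched with the standard library (the SortedIndex name is mine): the O(n log n) sort is paid once, and each later lookup costs O(log n):

```cpp
#include <algorithm>
#include <vector>

// Pay O(n log n) once to sort, then answer each membership query in O(log n).
// Worthwhile only when enough queries follow to amortize the sort.
struct SortedIndex {
    std::vector<int> data;
    explicit SortedIndex(std::vector<int> v) : data(std::move(v)) {
        std::sort(data.begin(), data.end());
    }
    bool contains(int key) const {
        return std::binary_search(data.begin(), data.end(), key);
    }
};
```

A hash set gives O(1) average lookups instead, at the cost of extra memory and no ordering.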
Upvotes: 6
Reputation: 23265
You're right, the fastest way is to simply iterate through the array and look for it. Without further information, there is nothing better you can do.
Unless you have a quantum computer, that is (Grover's algorithm searches an unsorted list with O(√n) queries).
Upvotes: 11