Reputation: 31
I have a sorted array, and I want to efficiently find the longest contiguous subsequence, from index begin to index end, such that array[begin] >= array[end] div 2.

The obvious approach is O(n^2), but is there something better?
Upvotes: 3
Views: 77
Reputation: 22910
It can be done in linear time. First let's start with the quadratic version:

1. Set i = 0.
2. Set j at the position of index i+1.
3. While a[j]/2 <= a[i], increment j.
4. The score of i is j - i.
5. Increment i and go back to step 2.
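A minimal sketch of the quadratic version above (the function name is mine, and I use Python's integer division `//` to match the question's `div`; with real division the logic is identical):

```python
def longest_window_quadratic(a):
    """Brute force: for each start i, extend j while a[j] // 2 <= a[i].

    The window a[i..j-1] is valid as long as the last element, divided
    by 2, does not exceed the first.  O(n^2) because j restarts at i+1
    for every i.
    """
    n = len(a)
    best = 0
    for i in range(n):
        j = i + 1
        while j < n and a[j] // 2 <= a[i]:
            j += 1
        best = max(best, j - i)  # score of i, per steps 3-4
    return best
```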
The catch is to realize that if you fail on step 3 for a pair (i, j), then it means:

- for every i < k < j, a[k]/2 <= a[i]
- a[j]/2 > a[i]

Thus, at step 5, restarting j at any index k smaller than j is wasted work: since the array is sorted, a[k]/2 <= a[i] <= a[i+1], so every such k passes the test of step 3 again. Thus the next index to test with j is j itself, not i+2.
Now each index is visited at most once by the inner loop across all scores, which reduces this step from O(n^2) to O(n). Taking the index with the maximum score is then obviously O(n).
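The linear algorithm described above can be sketched as follows (again, the function name and the use of `//` for the question's `div` are my choices):

```python
def longest_window_linear(a):
    """Two-pointer version: j never moves backward.

    When the test fails for (i, j), every index below j already passes
    the test for i+1 (the array is sorted), so the scan resumes at j.
    Each index is advanced past at most once, giving O(n) overall.
    """
    n = len(a)
    best = 0
    j = 0
    for i in range(n):
        if j < i + 1:
            j = i + 1          # the window must at least contain a[i]
        while j < n and a[j] // 2 <= a[i]:
            j += 1             # extend; j is never reset for later i
        best = max(best, j - i)  # score of i
    return best
```

On [2, 4, 8, 9] this returns 3, for the window [4, 8, 9] (4 >= 9 div 2).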
Upvotes: 2