Andrew Tobey

Reputation: 923

Time complexity of non-in-place binary search

Assuming that binary search is called recursively on a subarray of approximately length n/2, and that there are at most three comparisons per level, I came up with T(n) = T(n/2) + 3 as a recurrence relation.

But when I call binarySearch recursively, isn't the cost of the slice proportional to n? In that case, isn't the solution T(n) = O(n log n) rather than O(log n)?

This is confusing to me. I checked Python's cost model, and (as expected) the cost of a slice is proportional to n.
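
For reference, here is a sketch of the kind of slice-based implementation I mean (simplified, and the names are mine, not actual library code):

def binary_search(arr, target):
    # Recursive binary search that slices the input, i.e. not in place.
    if not arr:
        return False
    mid = len(arr) // 2
    if arr[mid] == target:
        return True
    if arr[mid] < target:
        # The slice copies about half the array: O(n) extra work per call.
        return binary_search(arr[mid + 1:], target)
    # Same here: this slice copies the left half.
    return binary_search(arr[:mid], target)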

Upvotes: 0

Views: 215

Answers (2)

AbcAeffchen

Reputation: 14977

As always, there can be a difference between the complexity of an algorithm and the complexity of its implementation.

If you implement binary search as a recursive function that takes a copy of part of the input array, that copy has to be created in O(n) time, so you are right that this leads to an overall complexity of O(n log n) (maybe worse, depending on what exactly you are copying).

So you should not implement it this way. Instead, only hand over a reference to the array plus min and max index values that limit the search to the part of the array you need (or something along these lines).
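
For example, a sketch of what I mean (not the only way to do it):

def binary_search(arr, target, lo=0, hi=None):
    # Recursive binary search on arr[lo:hi] that never copies the array.
    if hi is None:
        hi = len(arr)
    if lo >= hi:
        return False
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return True
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid)

Each call now does only O(1) work besides the recursive call, so the recurrence stays T(n) = T(n/2) + O(1), which is O(log n).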

Upvotes: 3

M. Shaw

Reputation: 1742

Take a look at the Master Theorem.

T(n) = T(n/2) + 3
a = 1, b = 2, so log a / log b = 0
f(n) = 3 = O(n^0), so d = 0

Since d = log a / log b, the Master Theorem gives

T(n) = O(n^d log n) = O(log n)
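
For intuition, you can also just unroll the recurrence (assuming T(1) is a constant):

T(n) = T(n/2) + 3
     = T(n/4) + 3 + 3
     = ...
     = T(1) + 3 * log2(n)
     = O(log n)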

Upvotes: -1
