Reputation: 187
I'm writing a spline class in Python. The method that calculates the spline-interpolated value requires the index of the closest x data points. Currently a simplified version looks like this:
def evaluate(x):
    for ii in range(N):  # N = len(x_data)
        if x_data[ii] <= x <= x_data[ii+1]:
            return calc(x, ii)
So it iterates through the list of x_data points until it finds the lower index ii of the interval in which x lies, and then uses that index in the function calc, which performs the spline interpolation. While functional, this seems inefficient for large x_data arrays if x is close to the end of the data set. Is there a more efficient or elegant way to achieve the same result that does not require every interval to be checked iteratively?

Note: x_data may be assumed to be sorted so that x_data[ii] < x_data[ii+1], but it is not necessarily equally spaced.
Upvotes: 2
Views: 996
Reputation: 113998
This is exactly what bisect is for: https://docs.python.org/2/library/bisect.html
from bisect import bisect

index = bisect(x_data, x)
# I don't think you actually need the values of the two closest points, but if you do, here they are
point_less = x_data[index - 1]  # note: this will break if index is 0, so you probably want a special case for that
point_more = x_data[index]
closest_value = min([point_less, point_more], key=lambda y: abs(x - y))
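For the spline use case, the index that calc needs is the lower bound of the interval, which is one less than what bisect returns. A minimal sketch of how this could slot into your evaluate method (assuming x_data and calc as defined in the question; the clamping of the index is my addition to handle values at or beyond the ends of the data):

    from bisect import bisect

    def evaluate(x):
        # bisect returns the insertion point, so the lower interval index is one less;
        # clamp it so values at or beyond the ends still map to a valid interval
        ii = min(max(bisect(x_data, x) - 1, 0), len(x_data) - 2)
        return calc(x, ii)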
Alternatively, you could implement a binary search yourself (in fact, I'm pretty sure that's what bisect uses under the hood). It should be worst case O(log n), assuming your input array is already sorted.
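If you'd rather not depend on bisect, a hand-rolled binary search over the intervals looks roughly like this (a sketch, not the module's actual implementation; find_interval is a hypothetical helper name):

    def find_interval(x, x_data):
        # classic binary search for the largest ii with x_data[ii] <= x,
        # restricted to valid interval lower bounds: O(log n) worst case
        lo, hi = 0, len(x_data) - 2
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if x_data[mid] <= x:
                lo = mid       # x lies at or above this interval's lower bound
            else:
                hi = mid - 1   # x lies below it, search the left half
        return lo              # falls back to 0 if x is below x_data[0]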
Upvotes: 4