Reputation: 53
The output of my program looks like:
[[1000, 1500, 2000, 2500, 3000, 3500, 4000],
[437, 680, 917, 1115, 1476, 1668, 1912]]
It's a two-dimensional array created with the NumPy library. The first row holds the values of N that I pass to the function, and the second row holds the time measurements of that function in microseconds (e.g. N=1000 takes time=437, N=1500 takes time=680).
Is there any simple way to determine the complexity of this function? I know I could draw a plot and just look at it, but my app needs to give me the answer directly (your function is, probably of course, O(n) or O(n log n) or O(n^2)).
O(n) seems pretty obvious - I just need to divide t/N across the whole array and check whether the ratio is constant - but I have no idea how to check the other two.
Upvotes: 0
Views: 657
One can do all sorts of fancy curve fitting and model evaluation with sklearn. But for a simple approach, it is enough to measure the variance of the logarithm of the quantity that is expected to be constant - that is, np.var of np.log(t / model(n)). The model with the smallest variance wins. For your example:
import numpy as np

n = np.array([1000, 1500, 2000, 2500, 3000, 3500, 4000])
t = np.array([437, 680, 917, 1115, 1476, 1668, 1912])

# If t grows like model(n), then t / model(n) is roughly constant,
# so the variance of its logarithm should be small.
print(np.var(np.log(t/n)))               # O(n):       0.001545...
print(np.var(np.log(t/(n*np.log(n)))))   # O(n log n): 0.001348...
print(np.var(np.log(t/(n**2))))          # O(n^2):     0.18049...
So it's definitely not quadratic. N*log(N) is a slightly better fit than linear. (Trying a few other things, it seems N*sqrt(log(N)) is best.)
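Since you want your app to return just the answer, the comparison above can be wrapped in a small helper that picks the candidate with the smallest variance automatically. This is a minimal sketch, not part of the original answer; the model names and the dictionary of candidates are illustrative and you can add more shapes (e.g. n*sqrt(log(n))) the same way:

```python
import numpy as np

n = np.array([1000, 1500, 2000, 2500, 3000, 3500, 4000])
t = np.array([437, 680, 917, 1115, 1476, 1668, 1912])

# Candidate growth models: label -> predicted shape evaluated at n.
models = {
    "O(n)": n,
    "O(n log n)": n * np.log(n),
    "O(n^2)": n ** 2.0,
}

# For each candidate, t / model(n) should be roughly constant if the
# model fits; the variance of its logarithm scores the misfit.
scores = {label: np.var(np.log(t / shape)) for label, shape in models.items()}

# The label with the smallest score is the best-fitting complexity.
best = min(scores, key=scores.get)
print(best)  # -> O(n log n)
```

Using the logarithm makes the score scale-free: a model that is off by a constant factor still scores well, which is exactly what big-O comparison needs.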
Upvotes: 1