Reputation: 1
I am trying to get a smooth curve for my data points. Say (lin_space, rms) are the ordered pairs that I need to plot. For the following code:
from scipy.interpolate import UnivariateSpline
import numpy as np
import matplotlib.pyplot as plt
spl = UnivariateSpline(lin_space, rms)
x = np.arange(0, 1001, 0.5)
plt.plot(lin_space, rms, 'k.')
plt.plot(lin_space, spl(lin_space), 'b-')
plt.plot(x, np.sqrt(x), 'r-')
After smoothing with UnivariateSpline I am getting the blue line, whereas I need the plot to look like the red one shown (with no local extrema).
Upvotes: 0
Views: 285
Reputation: 5549
You'll want a more limited class of models.
One option, for the data that you have shown, is to do least squares with a square-root function. That should produce good results.
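A minimal sketch of that fit with scipy.optimize.curve_fit, on stand-in data since I don't have yours (the full example at the end of this answer does the same fit with scikit-learn):

import numpy as np
from scipy.optimize import curve_fit

def sqrt_model(x, a, b):
    # a model with no local extrema: a*sqrt(x) + b
    return a * np.sqrt(x) + b

# stand-in for your (lin_space, rms) pairs -- replace with your own arrays
lin_space = np.linspace(0, 1000, 200)
rms = np.sqrt(lin_space) + np.random.normal(0, 1, lin_space.shape)

params, _ = curve_fit(sqrt_model, lin_space, rms)
smooth = sqrt_model(lin_space, *params)   # monotone curve, no local extrema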
A running average will be smooth(er), depending on how you weight the terms.
A Gaussian Process regression with an RBF + WhiteNoise kernel might be worth checking into, with appropriate a priori bounds on the length scale of the RBF kernel. OTOH, your residuals aren't normally distributed, so this model may not work as well for values toward the edges.
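A minimal sketch of that GP approach with scikit-learn's GaussianProcessRegressor, assuming an RBF plus WhiteKernel (the length-scale bounds below are illustrative, not tuned for your data):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# stand-in data, same shape as the full example below
X = np.linspace(0, 1000, 1000)
Y = np.sqrt(X) + np.log(np.log(X + np.e)) * np.random.normal(0, 1, X.shape)

# an RBF with a large lower bound on the length scale keeps the fit smooth;
# the WhiteKernel absorbs the observation noise
kernel = RBF(length_scale=200.0, length_scale_bounds=(100.0, 1000.0)) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(X.reshape(-1, 1), Y)
Y_smooth = gpr.predict(X.reshape(-1, 1))   # smooth, but not guaranteed monotone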
Note: If you specifically want a function with no local extrema, you need to select a class of models that has that property, e.g. a square-root fit.
import numpy as np
import matplotlib.pyplot as plt
import matplotlib as mpl
import sklearn.linear_model

mpl.rcParams['figure.figsize'] = (18, 16)

WINDOW = 30

def ma(signal, window=30):
    # simple running average: average `window` shifted copies of the signal
    return sum([signal[i:-window + i] for i in range(window)]) / window

# synthetic data: sqrt trend plus slowly growing noise
X = np.linspace(0, 1000, 1000)
Y = np.sqrt(X) + np.log(np.log(X + np.e)) * np.random.normal(0, 1, X.shape)

# least squares on a sqrt feature, i.e. fit Y ~ a*sqrt(X) + b
sqrt_model_X = np.sqrt(X)
model = sklearn.linear_model.LinearRegression()
model.fit(sqrt_model_X.reshape((-1, 1)), Y.reshape((-1, 1)))

plt.scatter(X, Y, c='b', marker='.', s=5)                          # noisy data
plt.plot(X, np.sqrt(X), 'r-')                                      # true sqrt curve
plt.plot(X[WINDOW:], ma(Y, window=WINDOW), 'g-.')                  # running average
plt.plot(X, model.predict(sqrt_model_X.reshape((-1, 1))), 'k--')   # sqrt least-squares fit
plt.show()
Upvotes: 1