Reputation: 463
I need to curve fit a set of data using y = x / (a + x), where a is the parameter I am required to get from this exercise.
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
x = [1, 2, 7, 10, 20, 70, 200, 1000]
y = [0, 0, 15.3, 34.6, 49.3, 82.6, 100]
def fit(x, a):
    return x/(a+x)
par, con = curve_fit(fit, x, y)
plt.plot(x, fit(x, par[0]))
plt.show()
Using this I get some abomination of a fit; it doesn't even remotely fit the data.
If I try it like this:
def fit(x, a, b):
    return b*x/(a+x)
I get a fit, but it has no rounded corners; it's just straight line segments. What am I doing wrong?
Upvotes: 1
Views: 586
Reputation: 54330
Notice that your x is a list of int; in Python 2, dividing two integers performs integer division by default, which is not what you want here. A few changes will make it work. Use the second function as the model: your first function is never going to fit well, because x/(a+x) has a limit of 1 as x->inf, while your y values go up to 100:
def fit(x, a, b):
    return b*x*1./(a+x)
A, B = curve_fit(fit, x, y)[0]  # unpack the fitted parameters
plt.plot(x, fit(x, A, B))
plt.plot(x, y, 'r+')  # overlay the original data
plt.savefig('temp.png')
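To see the limit-of-1 problem concretely, here is a quick check (a minimal sketch; a = 5 is an arbitrary value chosen only for illustration) showing that x/(a+x) climbs toward 1 but never beyond it, so it can never reach y values like 100:
import numpy as np
xs = np.array([1., 10., 100., 1000., 1e6])
print(xs / (5. + xs))  # values approach 1 and never exceed it
This is why the extra scale parameter b in the second function is needed.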
It comes out as a set of straight lines because you only evaluate y at the given x values, and matplotlib connects those points linearly. To get a smooth curve, evaluate the fit on a dense grid: change the plotting call to plt.plot(np.linspace(0, 200, 100), fit(np.linspace(0, 200, 100), A, B)).
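Putting it all together, here is a minimal self-contained sketch of the whole procedure. Two details are my additions rather than part of the answer above: the lists are converted to float arrays with np.asarray, and x is trimmed to the length of y, because the lists in the question have unequal lengths (8 vs 7) and curve_fit requires them to match:
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
# data as given in the question; note x has one more entry than y
x = np.asarray([1, 2, 7, 10, 20, 70, 200, 1000], dtype=float)
y = np.asarray([0, 0, 15.3, 34.6, 49.3, 82.6, 100], dtype=float)
x = x[:len(y)]  # trim so the lengths match; one y value appears to be missing
def fit(x, a, b):
    # b is the saturation level, a the half-saturation point
    return b*x/(a+x)
A, B = curve_fit(fit, x, y)[0]
xs = np.linspace(0, 200, 100)  # dense grid so the curve looks smooth
plt.plot(xs, fit(xs, A, B))
plt.plot(x, y, 'r+')  # overlay the data points
plt.show()
With a dense grid the plot shows a smooth saturating curve through the data instead of straight segments between the measured points.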
Upvotes: 3