Reputation: 1080
I want to build a regression model for a data set. I already know that x_1 and y have a quadratic relationship and that x_2 and y have a linear one, but I am not sure whether x_2 also has a quadratic relationship with y, nor whether x_1 and x_2 have some sort of interaction.
x_1: [66.29, 40.96, 73.00, 45.01, 57.20, 26.85, 38.12, 35.84, 75.80, 37.41, 54.38, 46.19, 46.13, 30.37, 39.06, 79.38, 52.77, 55.92]
x_2: [7.00, 5.00, 10.00, 6.00, 4.00, 5.00, 4.00, 6.00, 9.00, 5.00, 2.00, 7.00, 4.00, 3.00, 5.00, 1.00, 8.00, 6.00]
y: [196.00, 63.00, 252.00, 84.00, 126.00, 14.00, 49.00, 49.00, 266.00, 49.00, 105.00, 98.00, 77.00, 14.00, 56.00, 245.00, 133.00, 133.00]
So I constructed that function, but I don't know how to fit it. I tried curve_fit in scipy, but it seems it does not work for multiple independent variables. Is there a way to do that in Python?
Upvotes: 1
Views: 649
Reputation: 517
The scikit-learn package in Python includes both linear and polynomial regression models. Have a look at the link: linear and polynomial regression models.
Basically, y = c1 + c2 * x1 + c3 * x2 + c4 * x1^2 + c5 * x2^2 + c6 * x1 * x2 can be transformed by defining the new variables z = [z1, z2, z3, z4, z5] = [x1, x2, x1^2, x2^2, x1*x2].
With this transformation, the equation can be rewritten as y = c1 + c2 * z1 + c3 * z2 + c4 * z3 + c5 * z4 + c6 * z5.
Thus, the problem of polynomial fitting has been reduced to a linear one, and a linear model trained on the polynomial features can exactly recover the polynomial coefficients.
You can find several examples of polynomial regression in the link above.
Upvotes: 1