Reputation: 11
I'm currently using Python's scikit-learn to build a support vector regression model, and I was wondering how one would go about finding the explicit regression equation for the target variable in terms of the predictors. It doesn't have to be simple or pretty, but is there a method in Python to output this (specifically for a polynomial kernel)? I am fairly new to SVR, and I am not certain what the regression equation used to predict a test observation should look like once the model is fit.
I've already fit an SVR model whose predictive performance I'm happy with, and I've used GridSearchCV to tune the hyperparameters. However, I need an explicit form of my target variable in terms of the predictors for an independent optimization, and I don't know how to find this equation.
from sklearn.svm import SVR

svr = SVR(kernel='poly', C=best_params['C'], epsilon=best_params['epsilon'],
          gamma=best_params['gamma'], coef0=0.1, shrinking=True, tol=0.001,
          cache_size=200, verbose=False, max_iter=-1)
svr.fit(x, y)
Here x is my matrix of observations, y is my vector of target values, and best_params holds the optimal hyperparameters found by GridSearchCV.
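For reference, best_params comes from a grid search along the lines of the sketch below (the grid values here are placeholders rather than my actual search space):

from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

# Placeholder grid -- the values actually searched are not reproduced here
param_grid = {'C': [0.1, 1, 10],
              'epsilon': [0.01, 0.1, 1],
              'gamma': [0.01, 0.1, 1]}

# x and y are the observation matrix and target vector described above
search = GridSearchCV(SVR(kernel='poly'), param_grid, cv=5)
search.fit(x, y)
best_params = search.best_params_  # dict of the optimal hyperparameters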
Does scikit-learn have any method for outputting the equation the fitted SVR model uses to predict target values for new observations? Or is there a straightforward way to build that equation myself from the values SVR finds, given that I specify a polynomial kernel?
Thank you!
Upvotes: 0
Views: 2491
Reputation: 338
If you use a linear kernel, you can read the coefficients directly from the fitted model's coef_ attribute.
For example:
from sklearn.svm import SVR
import numpy as np

n_samples, n_features = 1000, 5
rng = np.random.RandomState(0)

# True coefficients of the underlying linear relationship
coef = np.array([1, 2, 3, 4, 5])
X = rng.randn(n_samples, n_features)
y = X @ coef + rng.randn(n_samples)  # linear signal plus Gaussian noise

clf = SVR(kernel='linear', gamma='scale', C=1.0, epsilon=0.2)
clf.fit(X, y)
clf.coef_
array([[0.97626634, 2.00013793, 2.96205576, 4.00651352, 4.95923782]])
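For a polynomial kernel there is no coef_ to read off (accessing it raises an error), but the prediction can be reconstructed from the fitted attributes dual_coef_, support_vectors_ and intercept_, since scikit-learn's SVR predicts f(x) = sum_i dual_coef_[i] * K(sv_i, x) + intercept_, where K(u, v) = (gamma * <u, v> + coef0) ** degree for the poly kernel. Below is a minimal sketch; the data and the gamma, coef0 and degree values are made up purely for illustration:

from sklearn.svm import SVR
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(200, 3)
y = X[:, 0] ** 2 + 2 * X[:, 1] - X[:, 2] + 0.1 * rng.randn(200)

# Explicit numeric gamma so the kernel formula below needs no private attributes;
# these hyperparameter values are illustrative only
gamma, coef0, degree = 0.5, 0.1, 3
svr = SVR(kernel='poly', gamma=gamma, coef0=coef0, degree=degree, C=1.0, epsilon=0.1)
svr.fit(X, y)

def manual_predict(x_new):
    # Polynomial kernel between every support vector and the new point:
    # K(sv, x) = (gamma * <sv, x> + coef0) ** degree
    K = (gamma * svr.support_vectors_ @ x_new + coef0) ** degree
    # f(x) = sum_i dual_coef_[i] * K(sv_i, x) + intercept_
    return np.dot(svr.dual_coef_[0], K) + svr.intercept_[0]

x_new = rng.randn(3)
print(manual_predict(x_new))                  # manual reconstruction
print(svr.predict(x_new.reshape(1, -1))[0])   # sklearn's own prediction -- should match

The two printed values should agree, which gives you the explicit form of the fitted model; expanding the polynomial kernel would then express the prediction directly as a polynomial in the predictors.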
Upvotes: 1