Reputation: 4792
I am using LinearSVC from the scikit-learn library, and I wonder if it is possible to somehow pull out the support vectors which my model uses after training to make predictions. I tried to google it for some time but without any luck. Does anyone know?
Upvotes: 13
Views: 16063
Reputation: 163
There is actually a way: I found here how to obtain the support vectors from LinearSVC; I'm reporting the relevant portion of the code:
import numpy as np
from sklearn.svm import LinearSVC

clf = LinearSVC()  # whatever fits your specs
clf.fit(X, y)

# get the support vectors through the decision function
decision_function = clf.decision_function(X)
# we can also calculate the decision function manually as previously noted
# decision_function = np.dot(X, clf.coef_[0]) + clf.intercept_[0]

# The support vectors are the samples that lie within the margin
# boundaries, whose size is conventionally constrained to 1
support_vector_indices = np.where(np.abs(decision_function) <= 1 + 1e-15)[0]
support_vectors = X[support_vector_indices]
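To make the snippet above runnable end to end, here is a self-contained sketch on a toy dataset; `make_blobs` and its parameters are my own choices for illustration, not part of the original answer:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC

# toy two-class dataset (arbitrary, for demonstration only)
X, y = make_blobs(n_samples=40, centers=2, random_state=0)

clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(X, y)

# samples on or inside the margin boundaries (|f(x)| <= 1) act as
# the support vectors of the trained linear model
decision_function = clf.decision_function(X)
support_vector_indices = np.where(np.abs(decision_function) <= 1 + 1e-15)[0]
support_vectors = X[support_vector_indices]

print(support_vectors.shape)
```

Note that, unlike `SVC`, this recovers the support vectors after the fact from the decision function rather than reading a stored attribute.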
Upvotes: 1
Reputation: 126
I am not sure if it helps, but I was searching for something similar and the conclusion was that when:
clf = svm.LinearSVC()
Then this:
clf.decision_function(x)
Is equal to this:
clf.coef_.dot(x) + clf.intercept_
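A quick runnable check of this equivalence; the dataset and parameters below are arbitrary, chosen only for illustration:

```python
import numpy as np
from sklearn import svm
from sklearn.datasets import make_classification

# arbitrary binary classification data
X, y = make_classification(n_samples=50, n_features=4, random_state=0)

clf = svm.LinearSVC(max_iter=10000)
clf.fit(X, y)

# manual decision function: w.x + b
manual = X @ clf.coef_.T + clf.intercept_   # shape (50, 1) in the binary case
built_in = clf.decision_function(X)          # shape (50,)

print(np.allclose(manual.ravel(), built_in))  # True
```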
Upvotes: 8
Reputation: 1212
This could help you.
clf = svm.SVC(kernel='rbf', C=0.05)
clf.fit(traindata, y)
print(clf.support_vectors_)
This link can give you more information if needed: http://scikit-learn.org/stable/modules/svm.html
Upvotes: 3
Reputation: 40973
Unfortunately there seems to be no way to do that. LinearSVC calls liblinear (see relevant code) but doesn't retrieve the vectors, only the coefficients and the intercept.
One alternative would be to use SVC with the 'linear' kernel (libsvm-based instead of liblinear-based); the poly, rbf and sigmoid kernels also support this option:
from sklearn import svm
X = [[0, 0], [1, 1]]
y = [0, 1]
clf = svm.SVC(kernel='linear')
clf.fit(X, y)
print(clf.support_vectors_)
Output:
[[ 0. 0.]
[ 1. 1.]]
liblinear scales better to a large number of samples, but otherwise the two are mostly equivalent.
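Beyond support_vectors_, SVC also exposes the indices of the support vectors and their per-class counts; a slightly expanded version of the example above showing these attributes:

```python
from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]

clf = svm.SVC(kernel='linear')
clf.fit(X, y)

print(clf.support_vectors_)  # the support vectors themselves
print(clf.support_)          # their indices in X
print(clf.n_support_)        # number of support vectors per class
```

With this tiny dataset both samples end up as support vectors, one per class.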
Upvotes: 9