Jeanne

Reputation: 1331

How to get support vector number after cross validation

Here is my code for digit classification using a non-linear SVM. I apply a cross-validation scheme to select the hyperparameters C and gamma, but the model returned by GridSearchCV has no n_support_ attribute, so I cannot get the number of support vectors.

from sklearn import datasets
from sklearn.cross_validation import train_test_split
from sklearn.grid_search import GridSearchCV
from sklearn.metrics import classification_report
from sklearn.svm import SVC
from sklearn.cross_validation import ShuffleSplit


# Loading the Digits dataset
digits = datasets.load_digits()

# To apply an classifier on this data, we need to flatten the image, to
# turn the data in a (samples, feature) matrix:
n_samples = len(digits.images)
X = digits.images.reshape((n_samples, -1))
y = digits.target

# Split the dataset in two equal parts
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

#Initialize an SVM estimator
estimator = SVC(kernel='rbf', C=1, gamma=1)

#Choose cross validation iterator.
cv = ShuffleSplit(X_train.shape[0], n_iter=5, test_size=0.2, random_state=0)

# Set the parameters by cross-validation
tuned_parameters = [{'kernel': ['rbf'], 'gamma': [1e-3, 1e-4,1,2,10],
                     'C': [1, 10, 50, 100, 1000]},
                    {'kernel': ['linear'], 'C': [1, 10, 100, 1000]}]


clf=GridSearchCV(estimator=estimator, cv=cv, param_grid=tuned_parameters)

#Run the cross-validation search to find the best model with the best parameters.
#After this step, clf holds the best model with the best C and gamma.
clf.fit(X_train,y_train)

print()

print("Best parameters: ")

print(clf.best_params_)


print("Test set score with clf:", clf.score(X_test, y_test))

print("Training set score with clf:", clf.score(X_train, y_train))

#This does not work, so how can I retrieve the number of support vectors?
print("Number of support vectors by class", clf.n_support_)

**Here is my workaround: I train a new SVM object with the best parameters and observe that it gets the same test and training error as clf.**
clf2 = SVC(C=10, gamma=0.001)

clf2.fit(X_train,y_train)

print("Test set score with clf2:", clf2.score(X_test, y_test))

print("Training set score with clf2:", clf2.score(X_train, y_train))

print(clf2.n_support_)

Any comments on whether my proposed method is right?

Upvotes: 1

Views: 1547

Answers (1)

Dimosthenis

Reputation: 981

GridSearchCV fits a number of models and, by default (refit=True), retrains the best one on the whole training set. You can access it with clf.best_estimator_. Its support_ attribute holds the indices of the support vectors in your training set, n_support_ holds the number of support vectors per class, and n_support_.sum() (equivalently, len(clf.best_estimator_.support_)) gives the total number of support vectors.

You can also get the parameters and the score of the best model with clf.best_params_ and clf.best_score_ respectively.
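Concretely, a minimal sketch (using the current sklearn.model_selection module paths rather than the deprecated sklearn.grid_search/sklearn.cross_validation ones, and a reduced parameter grid for speed):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

digits = load_digits()
X = digits.images.reshape((len(digits.images), -1))
X_train, X_test, y_train, y_test = train_test_split(
    X, digits.target, test_size=0.5, random_state=0)

clf = GridSearchCV(SVC(kernel='rbf'),
                   param_grid={'C': [1, 10], 'gamma': [1e-3, 1e-4]})
clf.fit(X_train, y_train)

best = clf.best_estimator_          # refitted best model
print(best.n_support_)              # support vectors per class (one entry per digit)
print(best.n_support_.sum())        # total number of support vectors
print(len(best.support_))           # same total, via the indices array
```

Because the best model is refitted on the full training set, these counts match what you would get by training a fresh SVC with clf.best_params_, which is why your clf2 workaround gives the same scores.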

Upvotes: 6
