sAHDAhdjasjs

Reputation: 1

One Class SVM fails on classifying training set

I'm currently working on a project to classify windows of time series as outliers/inliers with OCSVM. While doing some testing I stumbled upon the following problem/question:

>>> from sklearn import svm
>>> train = [(0, 0, 0), (0, 0, 1), (1, 1, 1)]
>>> clf = svm.OneClassSVM()
>>> clf.fit(train)
OneClassSVM(cache_size=200, coef0=0.0, degree=3, gamma='auto', kernel='rbf',
      max_iter=-1, nu=0.5, random_state=None, shrinking=True, tol=0.001,
      verbose=False)
>>> clf.predict(train)
array([-1, -1,  1])

Why does the classifier fail here? -1 means outlier, but I'm predicting on the training set itself, so everything should be 1 (inlier).

What am I missing? Any ideas?

Best regards

Upvotes: 0

Views: 847

Answers (1)

doodhwala

Reputation: 358

Looking at the docs, the `nu` parameter (0.5 by default) is an upper bound on the fraction of training errors and a lower bound on the fraction of support vectors. Since 0.5 * 3 (training examples) is 1.5, up to one or two of your training examples may end up classified as outliers.

Lowering the `nu` value should help you classify more training points as inliers, but you risk overfitting.
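As a quick sketch (not from the original post), here is the questioner's example rerun with the default `nu=0.5` and with a much smaller `nu`, keeping `gamma='auto'` as in the question; the tighter bound on training errors leaves fewer (typically zero) training points flagged as -1:

```python
from sklearn import svm

train = [(0, 0, 0), (0, 0, 1), (1, 1, 1)]

# Default nu=0.5: up to half the training set may be treated as outliers.
clf_default = svm.OneClassSVM(kernel='rbf', gamma='auto', nu=0.5)
pred_default = clf_default.fit(train).predict(train)

# Small nu: training errors are heavily penalized, so almost every
# training point must fall inside the learned boundary.
clf_tight = svm.OneClassSVM(kernel='rbf', gamma='auto', nu=0.01)
pred_tight = clf_tight.fit(train).predict(train)

print(pred_default)  # some -1 entries are allowed by nu=0.5
print(pred_tight)    # far fewer (usually no) -1 entries
```

Whether `nu=0.01` is sensible for real data is another matter; with only three training points, a boundary that encloses everything tells you little about generalization.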

Upvotes: 1
