Pramod Patil

Reputation: 827

Sensitivity derived from scikit-learn's confusion_matrix doesn't match recall_score

import sklearn as sk
import sklearn.metrics  # ensure the metrics submodule is loaded

true = [1, 0, 0, 1]
predict = [1, 1, 1, 1]

cf = sk.metrics.confusion_matrix(true, predict)
print(cf)

array([[0, 2],
       [0, 2]])

tp = cf[0][0]  # wrong: this entry is actually the true negatives
fn = cf[0][1]
fp = cf[1][0]
tn = cf[1][1]
sensitivity = tp / (tp + fn)
print(sensitivity)

0.0

print(sk.metrics.recall_score(true, predict))

1.0

As per the scikit-learn documentation, recall_score computes exactly this definition of sensitivity, tp / (tp + fn), so the two results should match. Can somebody explain a bit more about this?

Upvotes: 0

Views: 1137

Answers (1)

Sayali Sonawane

Reputation: 12599

The confusion matrix labels must be updated. scikit-learn's confusion_matrix puts true labels on the rows and predicted labels on the columns, ordered by sorted label value, so for binary 0/1 labels the top-left entry cf[0][0] is the true negatives (true 0, predicted 0), not the true positives:

tn = cf[0][0]
fp = cf[0][1]
fn = cf[1][0]
tp = cf[1][1]
sensitivity= tp/(tp+fn)
print(sensitivity)

1.0
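For binary 0/1 labels there is also a less error-prone way to unpack the matrix: numpy's ravel() flattens the 2x2 matrix row by row, yielding the entries in tn, fp, fn, tp order. A minimal sketch reproducing the question's data:

```python
from sklearn.metrics import confusion_matrix, recall_score

true = [1, 0, 0, 1]
predict = [1, 1, 1, 1]

# ravel() flattens [[tn, fp], [fn, tp]] row by row
tn, fp, fn, tp = confusion_matrix(true, predict).ravel()

sensitivity = tp / (tp + fn)
print(sensitivity)                   # 1.0
print(recall_score(true, predict))   # 1.0, identical by definition
```

Unpacking by name this way avoids having to remember which index holds which cell.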

Upvotes: 1
