Antonny Muiko

Reputation: 11

How are precision and recall calculated in the classification report?

Confusion Matrix:

[[4 2]
 [1 3]]

Accuracy Score: 0.7

Report:

              precision    recall  f1-score   support

           0       0.80      0.67      0.73         6
           1       0.60      0.75      0.67         4

avg / total       0.72      0.70      0.70        10

From the formula, precision = true positives / (true positives + false positives):

4/(4+2) = 0.667

But in the report this value appears under recall.

The formula for recall is true positives / (true positives + false negatives):

4/(4+1) = 0.80

But this appears under precision. I don't see why the two values are swapped.

Upvotes: 0

Views: 183

Answers (1)

Jlanday

Reputation: 112

Hard to say for sure without seeing code, but my guess is that you are using sklearn and did not pass labels into your confusion matrix. In sklearn's convention the rows of the confusion matrix are the true labels and the columns are the predicted labels, so if you read the matrix the other way around, the false positives and false negatives get swapped, and precision and recall appear to trade places.
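
As a minimal sketch (the y_true and y_pred arrays below are hypothetical, chosen only to reproduce the matrix from your question), passing labels explicitly and reading the matrix row-by-column gives exactly the numbers in your report:

    from sklearn.metrics import confusion_matrix, classification_report

    # Hypothetical data reproducing the matrix from the question:
    # 6 actual 0s (4 predicted 0, 2 predicted 1),
    # 4 actual 1s (1 predicted 0, 3 predicted 1).
    y_true = [0] * 6 + [1] * 4
    y_pred = [0] * 4 + [1] * 2 + [0] * 1 + [1] * 3

    cm = confusion_matrix(y_true, y_pred, labels=[0, 1])
    print(cm)
    # [[4 2]
    #  [1 3]]

    # Rows are true labels, columns are predicted labels, so for class 0:
    tp = cm[0, 0]  # true 0, predicted 0 -> 4
    fn = cm[0, 1]  # true 0, predicted 1 -> 2 (read along the row)
    fp = cm[1, 0]  # true 1, predicted 0 -> 1 (read down the column)
    print(tp / (tp + fp))  # precision for class 0: 4/5 = 0.80
    print(tp / (tp + fn))  # recall for class 0:    4/6 = 0.67

    print(classification_report(y_true, y_pred, labels=[0, 1]))

So your 4/(4+2) = 0.67 is TP/(TP + FN), the recall for class 0: the 2 in the first row is a false negative (a true 0 predicted as 1), not a false positive.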

Upvotes: 1
