Minglee Cheong

Reputation: 13

Is it possible to retrieve false positive and false negative from confusion matrix in R?

I have generated a confusion matrix using R as follows.

Is it possible to retrieve the false negative value of 61 from this matrix and assign it to a variable in R? $byClass doesn't seem to work for this case. Thanks.

Confusion Matrix and Statistics

              Reference
    Prediction   no  yes
           no  9889   61
           yes    6   44

               Accuracy : 0.9933          
                 95% CI : (0.9915, 0.9948)
    No Information Rate : 0.9895          
    P-Value [Acc > NIR] : 4.444e-05       

                  Kappa : 0.5648          
 Mcnemar's Test P-Value : 4.191e-11       

            Sensitivity : 0.9994          
            Specificity : 0.4190          
         Pos Pred Value : 0.9939          
         Neg Pred Value : 0.8800          
             Prevalence : 0.9895          
         Detection Rate : 0.9889          
   Detection Prevalence : 0.9950          
      Balanced Accuracy : 0.7092          

       'Positive' Class : no     

Upvotes: 1

Views: 5081

Answers (1)

eipi10

Reputation: 93871

You haven't provided a reproducible example or loaded any packages in your code, but it looks like you're using confusionMatrix from the caret package. Here's a generic example:

library(caret)

# Fake data (as factors -- recent versions of caret require factor inputs)
dat = data.frame(measured=factor(rep(0:1, c(40,60))), modeled=factor(rep(c(0:1,0:1), c(30,10,20,40))))

# Generate confusion matrix
cm = confusionMatrix(dat$modeled, dat$measured, positive="1")

cm
Confusion Matrix and Statistics

          Reference
Prediction  0  1
         0 30 20
         1 10 40

               Accuracy : 0.7             
                 95% CI : (0.6002, 0.7876)
    No Information Rate : 0.6             
    P-Value [Acc > NIR] : 0.02478         

                  Kappa : 0.4             
 Mcnemar's Test P-Value : 0.10035         

            Sensitivity : 0.6667          
            Specificity : 0.7500          
         Pos Pred Value : 0.8000          
         Neg Pred Value : 0.6000          
             Prevalence : 0.6000          
         Detection Rate : 0.4000          
   Detection Prevalence : 0.5000          
      Balanced Accuracy : 0.7083          

       'Positive' Class : 1

cm is actually a list, so let's see what it contains:

str(cm)

List of 6
 $ positive: chr "1"
 $ table   : 'table' int [1:2, 1:2] 30 10 20 40
  ..- attr(*, "dimnames")=List of 2
  .. ..$ Prediction: chr [1:2] "0" "1"
  .. ..$ Reference : chr [1:2] "0" "1"
 $ overall : Named num [1:7] 0.7 0.4 0.6 0.788 0.6 ...
  ..- attr(*, "names")= chr [1:7] "Accuracy" "Kappa" "AccuracyLower" "AccuracyUpper" ...
 $ byClass : Named num [1:11] 0.667 0.75 0.8 0.6 0.8 ...
  ..- attr(*, "names")= chr [1:11] "Sensitivity" "Specificity" "Pos Pred Value" "Neg Pred Value" ...
 $ mode    : chr "sens_spec"
 $ dots    : list()
 - attr(*, "class")= chr "confusionMatrix"

It looks like cm$table has the actual confusion matrix:

cm$table
          Reference
Prediction  0  1
         0 30 20
         1 10 40

So the count of false positives is:

cm$table[2,1]
[1] 10
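For the matrix in the question, where the positive class is "no", the false negatives of interest sit in the cell with Prediction "no" and Reference "yes", and indexing by dimnames is more robust than positional indices. A sketch reconstructing the question's counts from hypothetical vectors (the original data weren't posted):

```r
library(caret)

# Reconstruct factors that reproduce the question's confusion matrix
measured <- factor(rep(c("no", "yes"), c(9895, 105)), levels = c("no", "yes"))
modeled  <- factor(rep(c("no", "yes", "no", "yes"), c(9889, 6, 61, 44)),
                   levels = c("no", "yes"))
cm <- confusionMatrix(modeled, measured, positive = "no")

# Index the table by dimnames and assign to a variable
fn <- cm$table["no", "yes"]
fn
# [1] 61
```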

Upvotes: 8
