amankedia

Reputation: 377

Calculating precision, recall and FScore from the results of a confusion matrix in R

I have got the following confusion matrix; now I need to calculate the precision, recall and F-score from it. How do I do that using the obtained values?

Confusion Matrix and Statistics

      Reference
Prediction One Zero
      One   37   43
      Zero  19  131

               Accuracy : 0.7304          
                 95% CI : (0.6682, 0.7866)
    No Information Rate : 0.7565          
    P-Value [Acc > NIR] : 0.841087        

                  Kappa : 0.3611          
 Mcnemar's Test P-Value : 0.003489        

            Sensitivity : 0.6607          
            Specificity : 0.7529          
         Pos Pred Value : 0.4625          
         Neg Pred Value : 0.8733          
             Prevalence : 0.2435          
         Detection Rate : 0.1609          
   Detection Prevalence : 0.3478          
      Balanced Accuracy : 0.7068          

       'Positive' Class : One

I've used the following edited code after suggestions from other users:

library(class)
library(e1071)
library(caret)
library(party)
library(nnet)
library(forecast)
pimad <- read.csv("C:/Users/USER/Desktop/AMAN/pimad.csv")
nrow(pimad)  
set.seed(9850)
gp<-runif(nrow(pimad))
pimad<-pimad[order(gp),]
idx <- createDataPartition(y = pimad$class, p = 0.7, list = FALSE)
train<-pimad[idx,]
test<-pimad[-idx,]
svmmodel<-svm(class~.,train,kernel="radial")
psvm<-predict(svmmodel,test)
table(psvm,test$class)
library(sos)
findFn("confusion matrix precision recall FScore")
df<-(confusionMatrix(test$class, psvm))
dim(df)
df[1,2]/sum(df[1,2:3])
df

Upvotes: 5

Views: 11755

Answers (3)

Tejas Desai

Reputation: 1

cm <- confusionMatrix(table(test_actual, test_predicted))
cm$byClass
cm$overall

NOTE 1: cm is the confusionMatrix object obtained from R's caret library.

NOTE 2: cm$byClass gives: Sensitivity, Specificity, Pos Pred Value, Neg Pred Value, Precision, Recall, F1, Prevalence, Detection Rate, Detection Prevalence, Balanced Accuracy.

NOTE 3: cm$overall gives: Accuracy, Kappa, AccuracyLower, AccuracyUpper, AccuracyNull, AccuracyPValue.

Upvotes: 0

r. ahmadi

Reputation: 61

There's nothing else you need to do; you've already got all the requested measures in df. Just type:

ls(df)
[1] "byClass"  "dots"  "mode"  "overall"  "positive"  "table"

df$byClass # This is another example I've worked on

Now all the parameters, including sensitivity, specificity, pos pred value, neg pred value, precision, recall, F1, prevalence, detection rate, detection prevalence and balanced accuracy, appear in a table.
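For example, individual metrics can be pulled out of that table by name (a minimal sketch, assuming a caret version recent enough that byClass includes Precision, Recall and F1 entries):

```r
# df is the object returned by caret::confusionMatrix()
df$byClass["Precision"]   # tp / (tp + fp) for the 'Positive' class
df$byClass["Recall"]      # tp / (tp + fn), identical to Sensitivity
df$byClass["F1"]          # harmonic mean of precision and recall
df$positive               # confirms which class is treated as positive
```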

Upvotes: 6

DatamineR

Reputation: 9618

Well, it's a simple calculation; just subset the matrix.

If your confusion matrix is called df, using the formulas here and here:

df
  Prediction One Zero
1        One  37   43
2       Zero  19  131

# Precision: tp/(tp+fp):
df[1,1]/sum(df[1,1:2])
[1] 0.4625

# Recall: tp/(tp + fn):
df[1,1]/sum(df[1:2,1])
[1] 0.6607143

# F-Score: 2 * precision * recall /(precision + recall):
2 * 0.4625 * 0.6607143 / (0.4625 + 0.6607143)
[1] 0.5441177
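The arithmetic above can be wrapped in a small helper so it works for any counts (a sketch; the function name prf and the argument names tp, fp, fn are mine, not from the answer):

```r
# Compute precision, recall and F-score from raw confusion-matrix counts.
prf <- function(tp, fp, fn) {
  precision <- tp / (tp + fp)
  recall    <- tp / (tp + fn)
  fscore    <- 2 * precision * recall / (precision + recall)
  c(precision = precision, recall = recall, fscore = fscore)
}

# Counts from the matrix above: tp = 37, fp = 43, fn = 19
prf(tp = 37, fp = 43, fn = 19)
```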

Upvotes: 1
