Felipe Moura Oliveira

Reputation: 75

confusionMatrix in R

I'm studying pattern recognition, so I made 2 classes of data and separated them using my model. My data can only assume two values, true and false.

To plot my results I used confusionMatrix, and while interpreting the result a doubt arose.

Can confusionMatrix give me a false accuracy? For example:

I have 10 items, 5 true and 5 false. My classifier predicts 8 correctly and 2 wrongly: one of the wrong items should be true but was classified as false, and the other should be false but was classified as true. In this case the totals are still 5 true and 5 false. In the Help of RStudio I cannot see whether confusionMatrix compares item by item or only the sums of the possible results.

Upvotes: 1

Views: 1755

Answers (2)

Felipe Moura Oliveira

Reputation: 75

I'm using confusionMatrix from the library "RSNNS".

I made a simple example to test and better understand how confusionMatrix from RSNNS works.

rm(list = ls())
library("RSNNS")

targetsDados <- c(1,0,1,0,1,0,1,0,1,0)        # actual classes
targetsPredictions <- c(0,1,1,0,1,0,1,0,1,0)  # predicted classes

confusionMatrix(targetsDados, targetsPredictions)

targetsPredictions differs from targetsDados in 2 positions, but has the same number of '0's and '1's.

The result of this script is:

       predictions
targets 0 1
      0 4 1
      1 1 4

So confusionMatrix tells me how many predictions are wrong, comparing item by item.
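You can check this item-by-item behaviour without RSNNS at all: base R's `table()` builds the same contingency table by cross-tabulating the two vectors element by element (a minimal sketch; variable names are illustrative):

```r
targets     <- c(1,0,1,0,1,0,1,0,1,0)  # actual classes
predictions <- c(0,1,1,0,1,0,1,0,1,0)  # predicted classes

# Cross-tabulate actual vs. predicted, pairing the vectors element by element
cm <- table(targets, predictions)
print(cm)

# Accuracy = items on the diagonal (correctly classified) / all items
accuracy <- sum(diag(cm)) / sum(cm)
print(accuracy)  # 0.8
```

If the function only compared the totals of '0's and '1's, this example would look like 100% agreement; the off-diagonal 1s show the comparison is per item.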

Upvotes: 2

Learner_seeker

Reputation: 544

What do you mean by false accuracy? Do you mean 'False Positive'? Given your case, the confusion matrix looks something like this (A represents Actual, P represents what the model Predicts):

     A.T  A.F
P.T  4    1 
P.F  1    4 

Now there are multiple things you can calculate here:

True positive rate (sensitivity/recall) = 4/5

True negative rate (specificity; I think this is what you are looking for) = 4/5

# where the model got it wrong

False positive rate = 1/5

False negative rate = 1/5

Accuracy (overall, what it got right) = 8/10

# to compute the above (not using confusionMatrix from `caret`)

a = 4 # correct positives   (true positives)
b = 1 # incorrect positives (false positives)
c = 4 # correct negatives   (true negatives)
d = 1 # incorrect negatives (false negatives)

TPR = a/(a+d)               # true positive rate  = 4/5
TNR = c/(c+b)               # true negative rate  = 4/5
FPR = b/(b+c)               # false positive rate = 1/5
FNR = d/(d+a)               # false negative rate = 1/5
Accuracy = (a+c)/(a+b+c+d)  # = 8/10
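For comparison, `caret`'s own `confusionMatrix()` computes these metrics directly from factor vectors (a sketch assuming the `caret` package is installed; it takes predictions first, then the reference):

```r
library(caret)

predicted <- factor(c(0,1,1,0,1,0,1,0,1,0), levels = c(0, 1))
actual    <- factor(c(1,0,1,0,1,0,1,0,1,0), levels = c(0, 1))

# positive = "1" tells caret which class counts as "positive"
cm <- confusionMatrix(predicted, actual, positive = "1")

cm$overall["Accuracy"]     # 0.8
cm$byClass["Sensitivity"]  # true positive rate = 0.8
cm$byClass["Specificity"]  # true negative rate = 0.8
```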

Upvotes: 0
