alongova

Reputation: 35

How to calculate Accuracy, F1-Score, Precision, Sensitivity, etc.

I'm trying to learn R, so I found some practice exercises on the internet; this is one of them. I want to calculate Accuracy, F1-Score, Precision, Sensitivity, etc. from this code, but I can't even calculate the confusion matrix. What should I do? Can anyone help?

net = neuralnet(formul,data=train_data,hidden=5,linear.output=FALSE)
plot(net)
predict_net_test <- compute(net,test_data[,1:9])
predict_result<-round(predict_net_test$net.result, digits = 0)
net.prediction = c("benign", "malignant")[apply(predict_result, 1, which.max)]
predict.table = table(cleanedData$Class[-index], net.prediction)
predict.table

CrossTable(x = cleanedData$Class[-index], y = net.prediction,
       prop.chisq=FALSE)

Upvotes: 1

Views: 1010

Answers (1)

StupidWolf

Reputation: 46888

I'm not sure why you are pulling the actual labels from a different data frame, cleanedData, or which packages you are using; please include them in future questions. You already have the confusion matrix, so just feed it into caret's confusionMatrix() to get the statistics, for example:

library(caret)
library(neuralnet)

# simulate a 100-row data frame with 10 numeric predictors and a binary class label
dat = data.frame(matrix(runif(1000),100))
dat$Class = sample(c("benign", "malignant"),100,replace=TRUE)
dat$Class = factor(dat$Class)

# 70/30 train/test split
train_data = dat[1:70,]
test_data = dat[71:100,]

# fit the network, then map the output column with the largest value back to a label
net = neuralnet(Class ~ .,data=train_data,hidden=5,linear.output=FALSE)
predict_net_test = c("benign", "malignant")[max.col(predict(net,test_data))]

You need to put the prediction first, because confusionMatrix() treats the table rows as the predicted classes and the columns as the reference:

predict.table = table(predict_net_test,test_data$Class)

Then:

confusionMatrix(predict.table,positive="malignant")

Confusion Matrix and Statistics

predict_net_test benign malignant
       benign         5         7
       malignant     10         8
                                          
               Accuracy : 0.4333          
                 95% CI : (0.2546, 0.6257)
    No Information Rate : 0.5             
    P-Value [Acc > NIR] : 0.8192          
                                          
                  Kappa : -0.1333         
                                          
 Mcnemar's Test P-Value : 0.6276          
                                          
            Sensitivity : 0.5333          
            Specificity : 0.3333          
         Pos Pred Value : 0.4444          
         Neg Pred Value : 0.4167          
             Prevalence : 0.5000          
         Detection Rate : 0.2667          
   Detection Prevalence : 0.6000          
      Balanced Accuracy : 0.4333          
                                          
       'Positive' Class : malignant 
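
These statistics come straight from the table. As a minimal sketch of the same arithmetic by hand (assuming predict.table from above, with predictions in the rows, true labels in the columns, and "malignant" as the positive class):

# pull the four cell counts out of the confusion matrix
tp <- predict.table["malignant", "malignant"]  # predicted malignant, actually malignant
fn <- predict.table["benign", "malignant"]     # predicted benign, actually malignant
fp <- predict.table["malignant", "benign"]     # predicted malignant, actually benign
tn <- predict.table["benign", "benign"]        # predicted benign, actually benign

accuracy    <- (tp + tn) / sum(predict.table)
sensitivity <- tp / (tp + fn)   # true positive rate
specificity <- tn / (tn + fp)   # true negative rate

These should reproduce the Accuracy, Sensitivity and Specificity values printed above for the same table.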

For precision, recall and F1, use mode = "prec_recall":

confusionMatrix(predict.table,positive="malignant",mode = "prec_recall")

Confusion Matrix and Statistics

                
predict_net_test benign malignant
       benign         3         8
       malignant     10         9
                                         
               Accuracy : 0.4            
                 95% CI : (0.2266, 0.594)
    No Information Rate : 0.5667         
    P-Value [Acc > NIR] : 0.9782         
                                         
                  Kappa : -0.2442        
                                         
 Mcnemar's Test P-Value : 0.8137         
                                         
              Precision : 0.4737         
                 Recall : 0.5294         
                     F1 : 0.5000         
             Prevalence : 0.5667         
         Detection Rate : 0.3000         
   Detection Prevalence : 0.6333         
      Balanced Accuracy : 0.3801         
                                         
       'Positive' Class : malignant   
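
Precision, recall and F1 can likewise be read off the table by hand (again a sketch, using the tp, fp and fn counts defined in the snippet above):

precision <- tp / (tp + fp)   # positive predictive value
recall    <- tp / (tp + fn)   # identical to sensitivity
f1        <- 2 * precision * recall / (precision + recall)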

Upvotes: 1
