Solomon

Reputation: 1

Why am I getting different results for Precision, Recall, F1-score and Accuracy using the metric functions and the confusion matrix

Test-1: I used the following functions to get Precision, Recall, F1-score and Accuracy:

from sklearn.metrics import (precision_score, recall_score, f1_score,
                             accuracy_score, hamming_loss)

precision = precision_score(y_test, prediction_t, average='micro')
recall = recall_score(y_test, prediction_t, average='micro')
f1 = f1_score(y_test, prediction_t, average='micro')
print(f"Precision: {precision:.2f}")
print(f"Recall: {recall:.2f}")
print(f"F1-score: {f1:.2f}")
print('Accuracy Score: ', accuracy_score(y_test, prediction_t))
print('Hamming Loss: ', hamming_loss(y_test, prediction_t))

Test-2: I also used the following to cross-check my Test-1 results:

from sklearn.metrics import confusion_matrix, accuracy_score, classification_report

# Rows are true labels, columns are predicted labels
cm = confusion_matrix(y_test, prediction_t)
print(cm)
print(accuracy_score(y_test, prediction_t))
print(classification_report(y_test, prediction_t))

But I got very different results. For example, on Test-1 I got an accuracy of 0.10, while on Test-2 I got an accuracy of 0.57. Shouldn't I get the same result? Is it because Test-2 involved division by zero?
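To illustrate what I expected, here is a minimal self-contained sketch (using made-up labels, not my actual y_test / prediction_t) of a single-label multiclass case, where accuracy computed from the confusion matrix as trace(cm) / cm.sum() matches accuracy_score, and the micro-averaged scores equal the accuracy:

import numpy as np
from sklearn.metrics import (confusion_matrix, accuracy_score,
                             precision_score, recall_score, f1_score)

# Made-up single-label multiclass example, purely for illustration
y_true = [0, 1, 2, 2, 1, 0, 2, 1]
y_pred = [0, 2, 2, 2, 1, 0, 1, 1]

cm = confusion_matrix(y_true, y_pred)

# Accuracy from the confusion matrix: correct predictions lie on the diagonal
acc_from_cm = np.trace(cm) / cm.sum()

print(acc_from_cm)                                       # 0.75
print(accuracy_score(y_true, y_pred))                    # 0.75
print(precision_score(y_true, y_pred, average='micro'))  # 0.75
print(recall_score(y_true, y_pred, average='micro'))     # 0.75
print(f1_score(y_true, y_pred, average='micro'))         # 0.75

In this toy case all five numbers agree, which is why the 0.10 vs 0.57 gap in my own runs confuses me.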

Upvotes: 0

Views: 19

Answers (0)
