Reputation: 157
Found this question by @CarstenWE but it had been closed with no answer: How to get classification report from the confusion matrix?
As the question is closed, I opened this question to provide an answer.
The questions linked from the original all have answers that compute precision, recall, and F1-score. However, none of them uses classification_report, which is what the original question asked for.
Upvotes: 0
Views: 670
Reputation: 158
I wrote a small function for generating a classification report from the confusion matrix, and it worked fine on the few confusion matrices I tested it with.
The running time is proportional to n² plus the total number of samples (n being the number of classes), so it might not be good for large data.
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix
def cal_act_pred(cnf: np.ndarray) -> None:
    act = []
    pre = []
    num = cnf.shape[0]
    for i in range(num):
        for j in range(num):
            for _ in range(cnf[i][j]):
                act.append(i)
                pre.append(j)
    print(classification_report(act, pre, digits=4))
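A quick sanity check of the idea, using a small hypothetical 3-class confusion matrix: the label vectors rebuilt this way reproduce the original matrix exactly when fed back through confusion_matrix.

```python
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

# Hypothetical 3-class confusion matrix (rows = actual, columns = predicted)
cnf = np.array([[5, 1, 0],
                [2, 7, 1],
                [0, 0, 4]])

# Rebuild label vectors exactly as the function above does
act, pre = [], []
for i in range(cnf.shape[0]):
    for j in range(cnf.shape[0]):
        act += [i] * cnf[i][j]
        pre += [j] * cnf[i][j]

# Round-trip check: the reconstructed labels yield the same matrix
assert (confusion_matrix(act, pre) == cnf).all()
print(classification_report(act, pre, digits=4))
```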
Upvotes: 1
Reputation: 81
Sklearn's classification_report function takes these parameters:
sklearn.metrics.classification_report(y_true, y_pred, *, labels=None, target_names=None, sample_weight=None, digits=2, output_dict=False, zero_division='warn')
It produces a classification report from true labels and predicted labels.
If you still want to get a classification report from a confusion matrix, you might need to compute the metrics from scratch without using Sklearn.
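A minimal sketch of that from-scratch computation, assuming the usual convention that rows are true labels and columns are predictions (the function name and dict layout here are my own; note that classes with no predicted or no true samples will produce division-by-zero NaNs, which would need handling in practice):

```python
import numpy as np

def report_from_cm(cm):
    """Per-class precision, recall, and F1 straight from a confusion
    matrix (rows = true labels, columns = predictions)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                  # true positives per class
    precision = tp / cm.sum(axis=0)   # column sums = predicted counts
    recall = tp / cm.sum(axis=1)      # row sums = actual counts
    f1 = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall, "f1": f1}
```

For cm = [[5, 1], [2, 7]] this gives precision 5/7 and recall 5/6 for class 0, matching what sklearn reports on the expanded label vectors.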
Upvotes: 0
Reputation: 157
I wrote a small function to do this using a confusion matrix as input: it creates a ground-truth vector and a predicted vector, since the order of the samples does not matter for these metrics:
import numpy as np
from sklearn import metrics

def classification_report_from_confusion_matrix(cm, **args):
    y_true = []
    y_pred = []
    for target in range(len(cm)):
        for pred in range(len(cm)):
            y_true += [target] * cm[target][pred]
            y_pred += [pred] * cm[target][pred]
    return metrics.classification_report(y_true, y_pred, **args)
This solution probably does not scale well for huge datasets, but it was enough for me.
Edit:
Here is a solution without using lists:
def classification_report_from_confusion_matrix(confusion_matrix, **args):
    y_true = np.zeros(np.sum(confusion_matrix), dtype=int)
    y_pred = np.copy(y_true)
    i = 0
    for target in range(len(confusion_matrix)):
        for pred in range(len(confusion_matrix)):
            n = confusion_matrix[target][pred]
            y_true[i:i + n] = target
            y_pred[i:i + n] = pred
            i += n
    return metrics.classification_report(y_true, y_pred, **args)
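An alternative that avoids expanding the matrix at all: classification_report accepts a sample_weight parameter (see the signature quoted in the answer above), so each cell of the matrix can be passed as a single weighted sample. A sketch, with a function name of my own choosing; memory use depends only on the number of classes, not the number of samples:

```python
import numpy as np
from sklearn.metrics import classification_report

def report_via_sample_weight(cm, **kwargs):
    cm = np.asarray(cm)
    n = cm.shape[0]
    # One "sample" per (true, pred) cell; row-major order matches cm.ravel()
    y_true, y_pred = np.divmod(np.arange(n * n), n)
    return classification_report(y_true, y_pred,
                                 sample_weight=cm.ravel(), **kwargs)
```

Passing output_dict=True makes it easy to verify that the per-class metrics match those computed from the fully expanded label vectors.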
Upvotes: 0