Reputation: 31
I'm trying to use DeepLab for semantic segmentation. I'd like to calculate the IoU per class (e.g. the IoU for the person class only) instead of the mean IoU.
At L142 of https://github.com/tensorflow/models/blob/master/research/deeplab/eval.py, I tried to get the confusion matrix instead of the mean IoU with
miou, cmat = tf.metrics.mean_iou(...)
metric_map['cmat'] = cmat
but it did not work. I'd appreciate it if someone could suggest how to get around this.
Upvotes: 2
Views: 3507
Reputation: 1
I implemented a class-specific IoU metric for this very purpose based on the MeanIoU class.
import tensorflow as tf


class ClassIoU(tf.keras.metrics.MeanIoU):
    """Computes the class-specific Intersection-Over-Union metric.

    IoU is defined as follows:
        IoU = true_positive / (true_positive + false_positive + false_negative).

    The predictions are accumulated in a confusion matrix, weighted by
    `sample_weight`, and the metric is then calculated from it.

    If `sample_weight` is `None`, weights default to 1.
    Use `sample_weight` of 0 to mask values.

    Args:
        class_idx: The index of the class of interest.
        one_hot: Indicates whether the inputs are one-hot vectors (as in
            CategoricalCrossentropy) or class indices (as in
            SparseCategoricalCrossentropy or MeanIoU).
        num_classes: The possible number of labels the prediction task can have.
            This value must be provided, since a confusion matrix of dimension
            [num_classes, num_classes] will be allocated.
        name: (Optional) string name of the metric instance.
        dtype: (Optional) data type of the metric result.
    """

    def __init__(self, class_idx, one_hot, num_classes, name=None, dtype=None):
        super().__init__(num_classes, name, dtype)
        self.one_hot = one_hot
        self.class_idx = class_idx

    def result(self):
        # The confusion matrix has ground-truth classes as rows and
        # predicted classes as columns.
        sum_over_row = tf.cast(
            tf.reduce_sum(self.total_cm, axis=0), dtype=self._dtype)
        sum_over_col = tf.cast(
            tf.reduce_sum(self.total_cm, axis=1), dtype=self._dtype)
        true_positives = tf.cast(
            tf.linalg.diag_part(self.total_cm), dtype=self._dtype)

        # sum_over_row + sum_over_col =
        #     2 * true_positives + false_positives + false_negatives.
        denominator = (sum_over_row[self.class_idx]
                       + sum_over_col[self.class_idx]
                       - true_positives[self.class_idx])

        # If the class appears in neither the labels nor the predictions,
        # the denominator is 0 and divide_no_nan returns 0.
        return tf.math.divide_no_nan(
            true_positives[self.class_idx], denominator)

    def update_state(self, y_true, y_pred, sample_weight=None):
        if self.one_hot:
            return super().update_state(
                tf.argmax(y_true, axis=-1), tf.argmax(y_pred, axis=-1),
                sample_weight)
        return super().update_state(y_true, y_pred, sample_weight)
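A minimal usage sketch with made-up toy data (three classes, tracking class index 1):

# Toy example: 3 classes, IoU for class index 1 only.
metric = ClassIoU(class_idx=1, one_hot=False, num_classes=3)
metric.update_state(y_true=[0, 1, 1, 2, 1], y_pred=[0, 1, 2, 2, 1])
print(metric.result().numpy())  # 2 TP, 0 FP, 1 FN -> ~0.6667

# It can also be passed to model.compile like any other Keras metric:
# model.compile(optimizer='adam',
#               loss='sparse_categorical_crossentropy',
#               metrics=[ClassIoU(1, one_hot=False, num_classes=3)])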
Upvotes: 0
Reputation: 143
You can use _streaming_confusion_matrix from tensorflow.python.ops.metrics_impl to get the confusion matrix.
Essentially, it works the same way as other running metrics such as mean_iou: calling it gives you two ops, the total confusion_matrix op and an update op that accumulates the confusion matrix over batches.
With the confusion matrix, you should be able to compute the class-wise IoU.
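Below is a minimal TF1-style sketch with toy data, not a drop-in patch for eval.py. Note that _streaming_confusion_matrix is a private helper, so treat the (total_cm, update_op) return value and its signature as an assumption that may vary across TensorFlow versions; the class index and the fed data are placeholders.

# Toy sketch: accumulate a confusion matrix, then read off one class's IoU.
import tensorflow.compat.v1 as tf
from tensorflow.python.ops import metrics_impl

tf.disable_eager_execution()

num_classes = 3
class_idx = 1  # placeholder: index of the class you care about (e.g. person)

labels = tf.placeholder(tf.int64, shape=[None])
predictions = tf.placeholder(tf.int64, shape=[None])

# Two ops, just like tf.metrics.mean_iou: the accumulated confusion matrix
# and an update op that adds the current batch to it.
total_cm, update_op = metrics_impl._streaming_confusion_matrix(
    labels, predictions, num_classes)

# Rows are ground truth, columns are predictions.
tp = total_cm[class_idx, class_idx]
fp = tf.reduce_sum(total_cm[:, class_idx]) - tp  # predicted as class, but wrong
fn = tf.reduce_sum(total_cm[class_idx, :]) - tp  # ground-truth pixels missed
class_iou = tf.math.divide_no_nan(tp, tp + fp + fn)

with tf.Session() as sess:
    sess.run(tf.local_variables_initializer())
    # Run update_op once per batch, then evaluate class_iou at the end.
    sess.run(update_op, feed_dict={labels: [0, 1, 1, 2],
                                   predictions: [0, 1, 2, 2]})
    print(sess.run(class_iou))  # 0.5 for this toy batch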
Upvotes: 2