The Guy with The Hat

Reputation: 11132

Unexpected 1.0000 top_k_categorical_accuracy

I'm training a classification model, and getting some weird metric values. The top1 accuracy is fairly low and has relatively normal behavior, but the top_k_categorical_accuracy (with the default k=5) is often exactly 1.0000. This seems highly implausible, given how low the top1 accuracy is. What could be going on here?

Upvotes: 1

Views: 226

Answers (1)

The Guy with The Hat

Reputation: 11132

This happens because the model predicts exactly 0.00000000 for most categories on a given example. Keras uses in_top_k to compute the top_k_categorical_accuracy metric, and from its documentation:

Note that the behavior of InTopK differs from the TopK op in its handling of ties; if multiple classes have the same prediction value and straddle the top-k boundary, all of those classes are considered to be in the top k.

So all predictions of exactly 0 are tied with each other, and they all count as part of the top "5" whenever 4 or fewer predictions are nonzero. In my case, with 200 classes, that means all 200 classes tie for a spot in the top 5, so the metric reports 1.0000 for that example.
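
Here is a minimal sketch of that tie behavior using tf.math.in_top_k directly; the shapes and values are made up purely for illustration:

    import tensorflow as tf

    # Hypothetical toy example: 1 sample, 6 classes, only 2 nonzero scores.
    predictions = tf.constant([[0.0, 0.0, 0.0, 0.0, 0.6, 0.4]])
    targets = tf.constant([0])  # the true class was predicted as exactly 0.0

    # The four 0.0 scores tie and straddle the top-5 boundary, so in_top_k
    # counts every one of them as "in the top 5" -- including the true class.
    print(tf.math.in_top_k(targets, predictions, k=5))  # tf.Tensor([ True], ...)

Even though the true class was assigned zero probability, the sample is scored as a top-5 hit.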

Relevant GitHub issue: #10767

Upvotes: 1
