redshellspy

Reputation: 36

Vertex-AI AutoML Average Precision Metric for Object Detection

I trained an object detection model (approx. 1400 training images, 180 validation and 180 test images, with 6 classes and bounding-box annotations) using the Vertex AI AutoML feature. I followed this link and everything went smoothly; I was able to train the model (using AutoML as the method). The evaluation results are shown below

evaluation results

I am also attaching the precision-recall curves. pr curves

Here, I have some queries on the metrics

  1. Average Precision is shown as 0.595. Is it calculated at only one IoU threshold (0.5), averaged over multiple thresholds like the usually reported COCO metric (AP@[.50:.05:.95]), or computed in some other way?
  2. If I change the IoU threshold (using the slider), the Average Precision does not change. Why is this the case?
  3. The precision and recall are reported at the given confidence and IoU thresholds, right? (I am pretty sure this is the case, but wanted to confirm one more time.)
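For context on question 1: the COCO convention averages AP over IoU thresholds from 0.50 to 0.95 in steps of 0.05. A minimal sketch of that averaging, assuming a hypothetical per-threshold evaluator `ap_at_iou` (not part of any real API):

```python
# COCO-style mean AP: average the per-threshold AP over the
# 10 IoU thresholds 0.50, 0.55, ..., 0.95.
THRESHOLDS = [0.50 + 0.05 * i for i in range(10)]

def coco_map(ap_at_iou):
    """ap_at_iou: callable returning AP at a single IoU threshold (assumed)."""
    return sum(ap_at_iou(t) for t in THRESHOLDS) / len(THRESHOLDS)
```

A single-threshold metric such as AP@0.5, by contrast, would just be `ap_at_iou(0.5)`, which is why the two conventions can give quite different numbers for the same model.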

Do let me know if you need more details from my side.

Thanks.

Upvotes: 1

Views: 436

Answers (1)

Sakshi Gatyan

Reputation: 2116

Consider the following answers:

  1. The IoU threshold is used to objectively judge whether the model predicted the box location correctly or not, i.e., it is how a prediction is scored as a correct detection.

    If the model predicts a box whose IoU with one of the ground-truth boxes is greater than or equal to the threshold, there is high overlap between the predicted box and that ground-truth box. This suggests that the model was able to detect the object successfully. So in your case, the model's predictions are being judged at an IoU threshold of 0.5.

  2. Metrics generally change when the thresholds change; I would advise asking about this on a Google Cloud support/Issuetracker thread.

  3. Yes, you're right.
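The IoU check described in point 1 can be sketched as follows (a minimal illustration, with boxes given as `(x_min, y_min, x_max, y_max)` tuples; the function name and box values are made up for the example):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x_min, y_min, x_max, y_max)."""
    ix_min = max(box_a[0], box_b[0])
    iy_min = max(box_a[1], box_b[1])
    ix_max = min(box_a[2], box_b[2])
    iy_max = min(box_a[3], box_b[3])
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A prediction counts as a true positive when its IoU with a
# ground-truth box meets the threshold (0.5 here).
pred = (10, 10, 50, 50)
gt = (15, 15, 50, 50)
print(iou(pred, gt) >= 0.5)  # → True (IoU ≈ 0.766)
```

At a higher threshold such as 0.75, the same prediction would no longer count as a correct detection, which is exactly the behavior the IoU slider is meant to let you explore.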

Upvotes: 3
