Reputation: 36
I trained an object detection model (approx. 1400 training images, 180 validation and 180 test images, with 6 classes and bounding-box annotations) using the Vertex AI AutoML feature. I followed this link and everything went smoothly; I was able to train the model (using AutoML as the training method). The evaluation results are shown below, and I am also attaching the precision-recall curves.
Here I have a query on the metrics: the reported average precision is 0.595. Is it calculated at only one IoU threshold (0.5), at multiple thresholds like the commonly reported COCO metric (AP@[0.5:0.95:0.05]), or in some other way? Do let me know if you need more details from my side.
Thanks.
Upvotes: 1
Views: 436
Reputation: 2116
Consider the following points:
The IoU threshold is used to objectively judge whether the model predicted a box's location correctly or not.
If the model predicts a box whose IoU with one of the ground-truth boxes is greater than or equal to the threshold, there is high overlap between the predicted box and that ground-truth box, which means the model detected the object successfully. So in your case, the evaluation is computed at an IoU threshold of 0.5 (see the sketch after these points).
Metrics generally change as the threshold changes; I would advise asking this on a Google Cloud support/Issue Tracker thread.
Yes, you're right.
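To make the thresholding concrete, here is a minimal Python sketch (not Vertex AI code; the box coordinates are made up for illustration) that computes the IoU between a predicted box and a ground-truth box and shows how the same prediction can flip between true positive and false positive as the threshold changes:

```python
# Minimal illustration of IoU-threshold matching (hypothetical boxes, not Vertex AI code).

def iou(box_a, box_b):
    """Intersection-over-Union of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

ground_truth = (10, 10, 110, 110)   # hypothetical ground-truth box
prediction = (20, 20, 120, 120)     # hypothetical predicted box

score = iou(prediction, ground_truth)
print(f"IoU = {score:.3f}")

# The same prediction counts as a true or false positive depending on the threshold,
# which is why the reported metric changes with the threshold used.
for threshold in (0.50, 0.75, 0.95):
    verdict = "true positive" if score >= threshold else "false positive"
    print(f"IoU threshold {threshold:.2f}: {verdict}")
```

For reference, the COCO-style metric averages the per-threshold AP over IoU thresholds from 0.50 to 0.95 in steps of 0.05, whereas a single-threshold metric such as AP@0.5 uses only the first of those thresholds.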
Upvotes: 3