Reputation: 11
I'm running a model on AWS SageMaker, using their example object detection Jupyter notebook (https://github.com/awslabs/amazon-sagemaker-examples/blob/master/introduction_to_amazon_algorithms/object_detection_pascalvoc_coco/object_detection_recordio_format.ipynb). In the results it gives the following:
validation mAP =(0.111078678154)
I was wondering what this mAP score refers to.
I've used TensorFlow, where it reports an averaged mAP (averaged over IoU thresholds from 0.5 to 0.95 in 0.05 increments), mAP@0.5 IoU, and mAP@0.75 IoU. I've checked the SageMaker documentation, but cannot find anything stating which definition of mAP is used.
Is it safe to assume that the mAP score SageMaker reports is the averaged mAP (averaged over IoU thresholds from 0.5 to 0.95 in 0.05 increments)?
Upvotes: 1
Views: 362
Reputation: 21
Heyo,
The mAP score is the mean average precision, a metric widely used for evaluating object detection models (https://docs.aws.amazon.com/sagemaker/latest/dg/object-detection-tuning.html).
Take a look at this article for more background on mAP: https://medium.com/@jonathan_hui/map-mean-average-precision-for-object-detection-45c121a31173
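To make the distinction in the question concrete, here is a minimal sketch (not SageMaker's actual implementation, which isn't documented) of how average precision is computed from a precision-recall curve, and how the two common mAP flavours differ: a single-threshold mAP (e.g. mAP@0.5 IoU, PASCAL VOC style) averages per-class AP at one IoU cutoff, while COCO-style mAP additionally averages over IoU thresholds 0.5 to 0.95:

```python
import numpy as np

def average_precision(recalls, precisions):
    """All-points AP (PASCAL VOC 2010+ style): area under the
    precision-recall curve after taking the precision envelope."""
    # Pad so the curve starts at recall 0 and ends at recall 1.
    r = np.concatenate(([0.0], recalls, [1.0]))
    p = np.concatenate(([0.0], precisions, [0.0]))
    # Make precision monotonically non-increasing from right to left.
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum rectangle areas wherever recall changes.
    idx = np.where(r[1:] != r[:-1])[0]
    return float(np.sum((r[idx + 1] - r[idx]) * p[idx + 1]))

# Toy precision-recall points for one class at one IoU threshold:
ap = average_precision(np.array([0.2, 0.4, 0.4, 0.8]),
                       np.array([1.0, 1.0, 0.67, 0.5]))
print(ap)  # → 0.6

# VOC-style mAP: mean of per-class AP at a single IoU cutoff (e.g. 0.5).
# COCO-style mAP: also average over these IoU thresholds:
coco_thresholds = np.arange(0.5, 1.0, 0.05)  # 0.50, 0.55, ..., 0.95
```

If SageMaker's reported number differs noticeably from a COCO-style evaluation of the same model, it is likely using a single IoU threshold instead of the 0.5-0.95 average.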
Upvotes: 2