Humans Disagree With the IoU for Measuring Object Detector Localization Error

Abstract

The localization quality of automatic object detectors is typically evaluated by the Intersection over Union (IoU) score. In this work, we show that humans have a different view on localization quality. To evaluate this, we conduct a survey with more than 70 participants. Results show that, for localization errors with the exact same IoU score, humans may not consider these errors equal and express a preference for one over the other. Our work is the first to evaluate IoU with humans and makes it clear that relying on IoU scores alone to evaluate localization errors might not be sufficient.
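
For reference, the IoU score mentioned in the abstract is the ratio between the intersection area and the union area of a predicted box and a ground-truth box. The minimal sketch below (not taken from the paper; the `iou` helper and the example box coordinates are illustrative) also shows how two geometrically different localization errors can receive the exact same IoU score, which is the situation the survey probes.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection rectangle (may be empty).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


if __name__ == "__main__":
    gt = (0, 0, 100, 100)              # hypothetical ground-truth box
    shifted_right = (20, 0, 120, 100)  # prediction shifted horizontally
    shifted_down = (0, 20, 100, 120)   # prediction shifted vertically
    # Both predictions score the same IoU (~0.667) despite being
    # different kinds of localization error.
    print(iou(gt, shifted_right), iou(gt, shifted_down))
```

Under this standard definition, the IoU collapses any pair of such errors to a single number, so it cannot by itself express the kind of preference between equally scored errors that the participants report.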

Files

Humans_Disagree_With_the_IoU_f... (pdf | 5.66 Mb)
- Embargo expired in 01-07-2023
Unknown license