SOTAVerified

A Theoretical and Practical Framework for Evaluating Uncertainty Calibration in Object Detection

2023-09-01 · Code Available

Pedro Conde, Rui L. Lopes, Cristiano Premebida


Abstract

The proliferation of Deep Neural Networks has made machine learning systems increasingly present in real-world applications. Consequently, there is a growing demand for highly reliable models across many domains, making uncertainty calibration a pivotal problem for the future of deep learning. This is especially true for object detection systems, which are commonly deployed in safety-critical applications such as autonomous driving, robotics and medical diagnosis. For this reason, this work presents a novel theoretical and practical framework for evaluating object detection systems in the context of uncertainty calibration. The framework comprises a comprehensive formulation of the concept through distinct formal definitions, together with three novel evaluation metrics derived from this theoretical foundation. The robustness of the proposed uncertainty calibration metrics is demonstrated through a series of representative experiments.
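The abstract does not specify the paper's three proposed metrics, so they are not reproduced here. As background, a minimal sketch of the standard Expected Calibration Error (ECE), the common baseline that calibration metrics of this kind are typically compared against, might look as follows (function name and binning scheme are illustrative, not taken from the paper):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Standard ECE: bin predictions by confidence, then average the
    gap between mean confidence and accuracy, weighted by bin size."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            acc = correct[mask].mean()    # empirical accuracy in bin
            conf = confidences[mask].mean()  # mean confidence in bin
            ece += mask.mean() * abs(acc - conf)
    return ece
```

A perfectly calibrated detector scores 0, while overconfident predictions (high confidence, low accuracy) inflate the score; the paper's contribution is a formal treatment adapted to the specifics of object detection, which this generic classification-style sketch does not capture.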
