Revisiting Likelihood-Based Out-of-Distribution Detection by Modeling Representations
Yifan Ding, Arturas Aleksandrauskas, Amirhossein Ahmadian, Jonas Unger, Fredrik Lindsten, Gabriel Eilertsen
Abstract
Out-of-distribution (OOD) detection is critical for ensuring the reliability of deep learning systems, particularly in safety-critical applications. Likelihood-based deep generative models have historically faced criticism for their unsatisfactory performance in OOD detection, often assigning higher likelihoods to OOD data than to in-distribution samples when applied to image data. In this work, we demonstrate that likelihood is not inherently flawed. Rather, several properties of the image space prevent likelihood from serving as a valid detection score. Given a sufficiently good likelihood estimator, specifically using the probability flow formulation of a diffusion model, we show that likelihood-based methods can still perform on par with state-of-the-art methods when applied in the representation space of pre-trained encoders. The code of our work can be found at https://github.com/limchaos/Likelihood-OOD.git.
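To make the core idea concrete, the sketch below illustrates likelihood-based OOD scoring in a feature space rather than pixel space. It is a minimal illustration, not the paper's method: the pre-trained encoder is stood in for by synthetic feature vectors, and the diffusion-model probability-flow likelihood estimator is replaced by a simple multivariate Gaussian density, which is an assumption made here purely to keep the example self-contained. The principle is the same: fit a density model on in-distribution features and flag low-likelihood samples as OOD.

```python
import numpy as np

def fit_gaussian(feats):
    """Fit a multivariate Gaussian to in-distribution feature vectors.

    A small ridge is added to the covariance for numerical stability.
    """
    mu = feats.mean(axis=0)
    cov = np.cov(feats, rowvar=False) + 1e-6 * np.eye(feats.shape[1])
    return mu, cov

def log_likelihood(x, mu, cov):
    """Log-density of x under N(mu, cov); higher means more in-distribution."""
    d = x - mu
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    k = mu.shape[0]
    return -0.5 * (d @ inv @ d + logdet + k * np.log(2.0 * np.pi))

rng = np.random.default_rng(0)
# Stand-in for encoder representations of in-distribution data.
id_feats = rng.normal(0.0, 1.0, size=(500, 8))
mu, cov = fit_gaussian(id_feats)

id_score = log_likelihood(id_feats[0], mu, cov)
ood_score = log_likelihood(np.full(8, 6.0), mu, cov)  # a far-away feature vector
print(id_score > ood_score)  # the OOD point receives a lower likelihood
```

In practice, the paper's point is that this kind of density scoring, which fails in raw image space, becomes competitive when the density model is a strong estimator (a diffusion probability-flow ODE) applied to representations from a pre-trained encoder.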