
Model-free Test Time Adaptation for Out-Of-Distribution Detection

2023-11-28

Yifan Zhang, Xue Wang, Tian Zhou, Kun Yuan, Zhang Zhang, Liang Wang, Rong Jin, Tieniu Tan


Abstract

Out-of-distribution (OOD) detection is essential for the reliability of ML models. Most existing methods for OOD detection learn a fixed decision criterion from a given in-distribution dataset and apply it universally to decide whether a data point is OOD. Recent work (Fang et al., 2022) shows that, given only in-distribution data, it is impossible to reliably detect OOD data without extra assumptions. Motivated by this theoretical result and by recent exploration of test-time adaptation methods, we propose a Non-Parametric Test Time Adaptation framework for Out-Of-Distribution Detection. Unlike conventional methods, the proposed framework utilizes online test samples for model adaptation during testing, enhancing adaptability to changing data distributions. The framework incorporates detected OOD instances into decision-making, reducing the false positive rate, particularly when the ID and OOD distributions overlap significantly. We demonstrate the effectiveness of the framework through comprehensive experiments on multiple OOD detection benchmarks: extensive empirical studies show that it significantly improves OOD detection performance over state-of-the-art methods. Specifically, it reduces the false positive rate (FPR95) by 23.23% on the CIFAR-10 benchmarks and 38% on the ImageNet-1k benchmarks compared to advanced methods. Lastly, we theoretically verify the effectiveness of the proposed framework.
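To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of a non-parametric, test-time-adapted OOD detector: a test feature is scored by its k-NN distance to an in-distribution memory bank, and confidently detected OOD features are stored online in a second bank that pushes later, similar samples toward the OOD decision. All class and parameter names (`margin`, `ood_weight`, etc.) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np


class NonParamTTAOODDetector:
    """Illustrative non-parametric test-time-adaptation OOD detector.

    Higher score = more OOD-like. The score combines distance to an
    in-distribution (ID) feature bank with proximity to previously
    detected OOD features, which are accumulated during testing.
    """

    def __init__(self, id_features, k=5, threshold=0.5, margin=0.2, ood_weight=1.0):
        self.id_bank = np.asarray(id_features, dtype=float)
        self.ood_bank = np.empty((0, self.id_bank.shape[1]))
        self.k = k
        self.threshold = threshold   # score above this => flagged as OOD
        self.margin = margin         # extra confidence required before adapting
        self.ood_weight = ood_weight # influence of the stored OOD features

    def _knn_dist(self, bank, z):
        # Mean distance to the k nearest features in the given bank.
        d = np.linalg.norm(bank - z, axis=1)
        k = min(self.k, len(d))
        return np.sort(d)[:k].mean()

    def score(self, z):
        # Far from the ID bank => OOD-like; close to stored OOD features
        # additionally raises the score.
        s = self._knn_dist(self.id_bank, z)
        if len(self.ood_bank):
            s += self.ood_weight / (1e-8 + self._knn_dist(self.ood_bank, z))
        return s

    def predict_and_adapt(self, z):
        z = np.asarray(z, dtype=float)
        s = self.score(z)
        is_ood = s > self.threshold
        # Online adaptation: only confident detections enter the OOD bank,
        # to limit contamination from borderline ID samples.
        if s > self.threshold + self.margin:
            self.ood_bank = np.vstack([self.ood_bank, z])
        return is_ood, s
```

In this sketch, a far-away test point is flagged as OOD and stored, so a subsequent nearby point receives an even higher score — the mechanism by which incorporating detected OOD instances can lower the false positive rate when ID and OOD distributions overlap.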
