Simple and Effective Out-of-Distribution Detection via Cosine-based Softmax Loss

ICCV 2023 · 2023-01-01

SoonCheol Noh, DongEon Jeong, Jee-Hyong Lee

Abstract

Deep learning models need to detect out-of-distribution (OOD) data at inference time because they are trained to estimate the training distribution and to make predictions only for data sampled from that distribution. Many detection methods have been proposed, but they have limitations such as requiring additional data, input preprocessing, or high computational cost. Moreover, most methods have hyperparameters that must be set by users and that significantly affect the detection rate. We propose a simple and effective OOD detection method that combines the feature norm and the Mahalanobis distance obtained from classification models trained with a cosine-based softmax loss. Our method is practical: it requires no additional data for training, runs about three times faster at inference than methods that rely on input preprocessing, and is easy to apply because it has no hyperparameters for OOD detection. Through experiments, we confirm that our method is superior or at least comparable to state-of-the-art OOD detection methods.
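The abstract describes the score only at a high level: a combination of the feature norm and the class-conditional Mahalanobis distance of a penultimate-layer feature. The exact combination rule is not given here, so the sketch below is a minimal illustration under assumed choices: a shared inverse covariance matrix, the minimum distance over class means, and a simple "norm minus distance" combination (the function name `ood_score` and the convention that higher means more in-distribution are assumptions, not the paper's specification).

```python
import numpy as np

def ood_score(feature, class_means, cov_inv):
    """Hypothetical OOD score combining feature norm and Mahalanobis
    distance, following the high-level description in the abstract.
    The exact combination used in the paper may differ; this sketch
    subtracts the minimum squared Mahalanobis distance from the norm."""
    # Squared Mahalanobis distance from the feature to each class mean,
    # using a shared inverse covariance matrix (an assumed modeling choice).
    diffs = class_means - feature                      # (num_classes, dim)
    m_dists = np.einsum('nd,dk,nk->n', diffs, cov_inv, diffs)
    min_dist = m_dists.min()
    # Under cosine-based softmax training, in-distribution features tend
    # to have larger norms, so the norm contributes positively.
    norm = np.linalg.norm(feature)
    # Higher score = more likely in-distribution (assumed convention).
    return norm - min_dist
```

With two class means and an identity covariance, a feature lying near a class mean scores higher than a small-norm feature far from both means, which is the qualitative behavior the method relies on.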
