
Unleashing Mask: Explore the Intrinsic Out-of-Distribution Detection Capability

2023-06-06 · Code Available

Jianing Zhu, Hengzhuang Li, Jiangchao Yao, Tongliang Liu, Jianliang Xu, Bo Han


Abstract

Out-of-distribution (OOD) detection is an indispensable aspect of secure AI when deploying machine learning models in real-world applications. Previous paradigms either explore better scoring functions or utilize the knowledge of outliers to equip models with OOD detection ability. However, few of them pay attention to the intrinsic OOD detection capability of the given model. In this work, we observe across different settings that an intermediate stage of a model trained on in-distribution (ID) data can achieve higher OOD detection performance than its final stage, and we further identify one critical data-level attribution: learning with atypical samples. Based on these insights, we propose a novel method, Unleashing Mask (UM), which aims to restore the OOD discriminative capability of a well-trained model using only ID data. Our method utilizes a mask to identify the memorized atypical samples, and then finetunes the model, or prunes it with the introduced mask, to forget them. Extensive experiments and analyses demonstrate the effectiveness of our method. The code is available at: https://github.com/tmlr-group/Unleashing-Mask.
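The abstract's core step is to flag memorized atypical ID samples and forget them. A minimal sketch of that idea, assuming (as a simplification, not from the paper) that atypicality is measured by per-sample training loss and that the fraction to drop is a free hyperparameter:

```python
# Hypothetical sketch of the masking step described in the abstract:
# treat the highest-loss ID samples as memorized atypical examples and
# mask them out before finetuning. The percentile-based criterion and
# the `drop_ratio` hyperparameter are illustrative assumptions, not the
# paper's exact procedure.

def atypical_mask(losses, drop_ratio=0.1):
    """Return a keep-mask: False for the `drop_ratio` highest-loss samples."""
    n_drop = int(len(losses) * drop_ratio)
    if n_drop == 0:
        return [True] * len(losses)
    # Indices of the highest-loss (assumed atypical) samples.
    by_loss = sorted(range(len(losses)), key=lambda i: losses[i])
    dropped = set(by_loss[-n_drop:])
    return [i not in dropped for i in range(len(losses))]

# Toy example: two samples with clearly outlying losses get masked out.
losses = [0.10, 0.20, 0.15, 2.50, 0.05, 3.10, 0.30, 0.12, 0.08, 0.20]
mask = atypical_mask(losses, drop_ratio=0.2)
# mask[3] and mask[5] are False; the model would then be finetuned
# (or pruned) using only the samples where mask is True.
```

In the actual method the mask feeds a finetuning or pruning objective; here it simply selects the training subset, which is the essential data-level effect.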


Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| ImageNet-1k vs Curated OODs (avg.) | ODIN+UMAP (ResNet-50) | FPR95 | 40.94 | | Unverified |
| ImageNet-1k vs iNaturalist | ODIN+UMAP (ResNet-50) | AUROC | 94.71 | | Unverified |
| ImageNet-1k vs Places | ODIN+UMAP (ResNet-50) | FPR95 | 50.06 | | Unverified |
| ImageNet-1k vs SUN | ODIN+UMAP (ResNet-50) | FPR95 | 49.69 | | Unverified |
| ImageNet-1k vs Textures | ODIN+UMAP (ResNet-50) | AUROC | 88.35 | | Unverified |
