
Out of Distribution (OOD) Detection

Out of Distribution (OOD) Detection is the task of identifying instances that do not belong to the distribution on which a classifier was trained. OOD data is often referred to as "unseen" data, since the model has not encountered it during training.

OOD detection is typically framed as distinguishing in-distribution (ID) data, which the model has seen during training, from OOD data, which it has not. Common approaches either score inputs post hoc using a trained classifier's outputs, train a separate OOD detector, or modify the model's architecture or loss function to make it more sensitive to OOD data.
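As a rough illustration of the post-hoc approach, the sketch below computes two widely used OOD scores from a classifier's logits: maximum softmax probability (MSP) and an energy score. The function names, toy logits, and the 0.9 threshold are illustrative assumptions, not taken from any paper listed below; in practice the threshold is tuned on held-out ID data.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class dimension
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability: higher means more ID-like
    return softmax(logits).max(axis=-1)

def energy_score(logits, T=1.0):
    # Negative free energy (log-sum-exp of logits): higher means more ID-like
    return T * np.log(np.exp(logits / T).sum(axis=-1))

# Toy logits for two inputs: one confident (ID-like), one diffuse (OOD-like)
logits = np.array([[8.0, 0.5, 0.2],
                   [1.1, 0.9, 1.0]])

threshold = 0.9  # assumed value; normally chosen on held-out ID data
print(msp_score(logits))               # roughly [0.999, 0.37]
print(msp_score(logits) >= threshold)  # [True, False] -> flag the second input as OOD
print(energy_score(logits))
```

Either score can be thresholded the same way; the energy score tends to be less saturated than MSP for highly confident classifiers, which is one reason both appear as baselines in the papers below.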

Papers

Showing 81-90 of 629 papers (page 9 of 63)

Title | Status | Hype
Self-Calibrated Tuning of Vision-Language Models for Out-of-Distribution Detection | Code | 1
Double Descent Meets Out-of-Distribution Detection: Theoretical Insights and Empirical Analysis on the role of model complexity | - | 0
MADOD: Generalizing OOD Detection to Unseen Domains via G-Invariance Meta-Learning | - | 0
'No' Matters: Out-of-Distribution Detection in Multimodality Long Dialogue | - | 0
Dimensionality-induced information loss of outliers in deep neural networks | - | 0
Long-Tailed Out-of-Distribution Detection via Normalized Outlier Distribution Adaptation | Code | 0
PViT: Prior-augmented Vision Transformer for Out-of-distribution Detection | Code | 0
What If the Input is Expanded in OOD Detection? | Code | 0
GDDA: Semantic OOD Detection on Graphs under Covariate Shift via Score-Based Diffusion Models | - | 0
LEGO-Learn: Label-Efficient Graph Open-Set Learning | Code | 0

No leaderboard results yet.