SOTAVerified

Out of Distribution (OOD) Detection

Out of Distribution (OOD) Detection is the task of identifying inputs that do not belong to the distribution a classifier was trained on. OOD data is often referred to as "unseen" data, because the model has not encountered it during training.

OOD detection is typically framed as distinguishing in-distribution (ID) data, which the model has seen during training, from OOD data, which it has not. This can be done with a variety of techniques, such as training a separate OOD detector, modifying the model's architecture or loss function to make it more sensitive to OOD inputs, or thresholding the classifier's own confidence scores.
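As a concrete illustration of the confidence-thresholding idea, the sketch below implements the maximum softmax probability (MSP) baseline: the classifier's highest softmax probability is used as an ID score, and inputs below a threshold are flagged as OOD. This is a minimal NumPy sketch of one common post-hoc approach, not the method of any particular paper listed here; the threshold value is an illustrative assumption.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability: high for confident (likely ID)
    # predictions, lower for inputs the classifier is unsure about.
    return softmax(logits).max(axis=-1)

def flag_ood(logits, threshold=0.5):
    # Inputs whose confidence falls below the (assumed) threshold
    # are flagged as out-of-distribution.
    return msp_score(logits) < threshold

# A peaked logit vector vs. a near-uniform one (toy examples).
id_logits = np.array([[6.0, 0.5, 0.2]])   # confident: treated as ID
ood_logits = np.array([[1.0, 0.9, 1.1]])  # flat: flagged as OOD
```

In practice the threshold is chosen on held-out ID data (e.g. to fix a target true-positive rate), and stronger scores such as energy-based or Mahalanobis-distance scores follow the same thresholding pattern.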

Papers

Showing 421–430 of 629 papers

| Title | Status | Hype |
| --- | --- | --- |
| Training OOD Detectors in their Natural Habitats | Code | 1 |
| VOS: Learning What You Don't Know by Virtual Outlier Synthesis | Code | 2 |
| UQGAN: A Unified Model for Uncertainty Quantification of Deep Classifiers trained via Conditional GANs | Code | 1 |
| Out of Distribution Detection on ImageNet-O | Code | 0 |
| Adversarial vulnerability of powerful near out-of-distribution detection | Code | 1 |
| Self-Supervised Anomaly Detection by Self-Distillation and Negative Sampling | Code | 0 |
| iDECODe: In-distribution Equivariance for Conformal Out-of-distribution Detection | — | 0 |
| Deep Hybrid Models for Out-of-Distribution Detection | — | 0 |
| Boundary Aware Learning for Out-of-distribution Detection | — | 0 |
| Energy-bounded Learning for Robust Models of Code | — | 0 |

Leaderboard

No leaderboard results yet.