
Out-of-Distribution (OOD) Detection

Out-of-Distribution (OOD) Detection is the task of identifying inputs that do not belong to the distribution a classifier was trained on. OOD data is often referred to as "unseen" data, since the model has not encountered it during training.

OOD detection is typically performed by training a model to distinguish between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not. A variety of techniques can be used, such as training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
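As a concrete illustration, one widely used baseline scores each input by the model's maximum softmax probability (MSP): confident, peaked predictions are treated as in-distribution, while flat, uncertain ones are flagged as OOD. The sketch below assumes raw classifier logits are available; the function names and the 0.5 threshold are illustrative choices, not part of any specific method described above.

```python
import math

def softmax(logits):
    """Convert raw logits to a probability distribution (numerically stable)."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def msp_score(logits):
    """Maximum softmax probability: higher means more in-distribution."""
    return max(softmax(logits))

def is_ood(logits, threshold=0.5):
    """Flag an input as OOD when top-class confidence falls below
    the threshold (the threshold is tuned on held-out data)."""
    return msp_score(logits) < threshold

# A peaked logit vector yields high confidence (in-distribution);
# a nearly flat one yields low confidence and is flagged as OOD.
print(is_ood([8.0, 0.5, 0.2]))   # confident -> False
print(is_ood([1.1, 1.0, 0.9]))   # uncertain -> True
```

In practice the threshold is chosen to hit a target false-positive rate on in-distribution validation data, and the raw MSP score is often replaced by temperature-scaled or energy-based variants.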

Papers

Showing 311–320 of 629 papers

Title | Status | Hype
Identity Curvature Laplace Approximation for Improved Out-of-Distribution Detection | Code | 0
VI-OOD: A Unified Representation Learning Framework for Textual Out-of-distribution Detection | Code | 0
WAIC, but Why? Generative Ensembles for Robust Anomaly Detection | Code | 0
Weak Distribution Detectors Lead to Stronger Generalizability of Vision-Language Prompt Tuning | Code | 0
What If the Input is Expanded in OOD Detection? | Code | 0
When and How Does In-Distribution Label Help Out-of-Distribution Detection? | Code | 0
XOOD: Extreme Value Based Out-Of-Distribution Detection For Image Classification | Code | 0
ZClassifier: Temperature Tuning and Manifold Approximation via KL Divergence on Logit Space | Code | 0
Zero-Shot Out-of-Distribution Detection with Feature Correlations | Code | 0
Locally Most Powerful Bayesian Test for Out-of-Distribution Detection using Deep Generative Models | — | 0

No leaderboard results yet.