
Out of Distribution (OOD) Detection

Out of Distribution (OOD) Detection is the task of detecting instances that do not belong to the distribution a classifier was trained on. OOD data is often referred to as "unseen" data, because the model has not encountered it during training.

OOD detection is typically framed as distinguishing between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not. Common approaches include thresholding the classifier's own confidence (e.g., the maximum softmax probability), training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
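As a minimal illustration of the confidence-thresholding approach, the sketch below scores an input by its maximum softmax probability (MSP) and flags it as OOD when that score falls below a threshold. The logits and the threshold value are hypothetical; in practice the threshold is tuned on held-out in-distribution data.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def msp_score(logits):
    # Maximum softmax probability: high when the classifier is
    # confident (likely ID), low when it is uncertain (possibly OOD).
    return max(softmax(logits))

def is_ood(logits, threshold=0.5):
    # Flag an input as OOD when confidence falls below the threshold.
    # The threshold here is an illustrative value, not a recommendation.
    return msp_score(logits) < threshold

# A confident in-distribution prediction vs. a flat, uncertain one
# (both logit vectors are made up for demonstration).
id_logits = [8.0, 0.5, -1.0]
ood_logits = [0.2, 0.1, 0.15]
```

With these inputs, `is_ood(id_logits)` is False (the peaked logits give a softmax maximum near 1), while `is_ood(ood_logits)` is True (the near-uniform logits give a maximum near 1/3). More sensitive scores, such as energy- or distance-based ones, follow the same score-then-threshold pattern.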

Papers

Showing 91–100 of 629 papers

Title | Status | Hype
Continual Learning Based on OOD Detection and Task Masking | Code | 1
How to Exploit Hyperspherical Embeddings for Out-of-Distribution Detection? | Code | 1
Unknown-Aware Object Detection: Learning What You Don't Know from Videos in the Wild | Code | 1
MUAD: Multiple Uncertainties for Autonomous Driving, a benchmark for multiple uncertainty types and tasks | Code | 1
Agree to Disagree: Diversity through Disagreement for Better Transferability | Code | 1
Training OOD Detectors in their Natural Habitats | Code | 1
UQGAN: A Unified Model for Uncertainty Quantification of Deep Classifiers trained via Conditional GANs | Code | 1
Adversarial vulnerability of powerful near out-of-distribution detection | Code | 1
WOOD: Wasserstein-based Out-of-Distribution Detection | Code | 1
Hyperdimensional Feature Fusion for Out-Of-Distribution Detection | Code | 1
Page 10 of 63

No leaderboard results yet.