SOTAVerified

Out of Distribution (OOD) Detection

Out of Distribution (OOD) Detection is the task of detecting instances that do not belong to the distribution the classifier has been trained on. OOD data is often referred to as "unseen" data, as the model has not encountered it during training.

OOD detection is typically framed as distinguishing in-distribution (ID) inputs, which come from the training distribution, from OOD inputs, which do not. Common approaches include thresholding a confidence score derived from the classifier itself, training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
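As an illustrative sketch of the confidence-thresholding idea, the snippet below implements the maximum softmax probability (MSP) baseline: an input is flagged as OOD when the classifier's top softmax probability falls below a threshold. The function names and the example threshold are our own; in practice the threshold is tuned on held-out ID data.

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def msp_score(logits):
    """Maximum softmax probability: higher suggests in-distribution."""
    return max(softmax(logits))

def is_ood(logits, threshold=0.5):
    """Flag an input as OOD when the classifier's confidence is low.

    The threshold of 0.5 is only an example; it is normally chosen on
    held-out in-distribution data, e.g. for a target false-positive rate.
    """
    return msp_score(logits) < threshold
```

A confidently classified input (one dominant logit) scores near 1.0 and is kept, while a flat logit vector scores near 1/num_classes and is flagged as OOD.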

Papers

Showing 601-610 of 629 papers

Title | Status | Hype
Joint Learning of Domain Classification and Out-of-Domain Detection with Dynamic Class Weighting for Satisficing False Acceptance Rates | | 0
Building Safe and Reliable AI systems for Safety Critical Tasks with Vision-Language Processing | | 0
Bridging In- and Out-of-distribution Samples for Their Better Discriminability | | 0
The Compact Support Neural Network | | 0
The Conditional Entropy Bottleneck | | 0
kFolden: k-Fold Ensemble for Out-Of-Distribution Detection | | 0
KNN-Contrastive Learning for Out-of-Domain Intent Classification | | 0
Enhancing Out-of-Distribution Detection with Multitesting-based Layer-wise Feature Fusion | | 0
Label Smoothed Embedding Hypothesis for Out-of-Distribution Detection | | 0
Language-Enhanced Latent Representations for Out-of-Distribution Detection in Autonomous Driving | | 0

No leaderboard results yet.