
Out of Distribution (OOD) Detection

Out-of-Distribution (OOD) detection is the task of identifying instances that do not belong to the distribution a classifier was trained on. OOD data is often referred to as "unseen" data, since the model has not encountered it during training.

OOD detection is typically performed by training a model to distinguish between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not. This can be done with a variety of techniques, such as thresholding the classifier's confidence scores, training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
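The simplest confidence-based approach scores each input by the classifier's maximum softmax probability (MSP) and flags low-confidence inputs as OOD. A minimal NumPy sketch of that idea follows; the threshold value here is an illustrative choice, not taken from any specific paper:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability: higher means "more in-distribution".
    return softmax(logits).max(axis=-1)

def detect_ood(logits, threshold=0.5):
    # Flag inputs whose top-class confidence falls below the threshold as OOD.
    return msp_score(logits) < threshold

# A confident prediction (peaked logits) vs. a maximally uncertain one (flat logits):
logits = np.array([[10.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
flags = detect_ood(logits)  # first input kept as ID, second flagged as OOD
```

In practice the threshold is tuned on held-out ID data, e.g. chosen so that 95% of ID inputs are accepted.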

Papers

Showing 221–230 of 629 papers

Title | Status | Hype
Confidence-based Out-of-Distribution Detection: A Comparative Study and Analysis | Code | 0
Out of Distribution Detection on ImageNet-O | Code | 0
Long-Tailed Out-of-Distribution Detection: Prioritizing Attention to Tail | Code | 0
Enhancing Few-Shot Out-of-Distribution Detection with Gradient Aligned Context Optimization | Code | 0
Out-Of-Distribution Detection for Audio-visual Generalized Zero-Shot Learning: A General Framework | Code | 0
Long-Tailed Out-of-Distribution Detection via Normalized Outlier Distribution Adaptation | Code | 0
Metric Learning and Adaptive Boundary for Out-of-Domain Detection | Code | 0
Multi-Label Out-of-Distribution Detection with Spectral Normalized Joint Energy | Code | 0
Likelihood Ratios and Generative Classifiers for Unsupervised Out-of-Domain Detection In Task Oriented Dialog | Code | 0
Confidence-Aware and Self-Supervised Image Anomaly Localisation | Code | 0