
Out of Distribution (OOD) Detection

Out-of-Distribution (OOD) detection is the task of identifying instances that do not belong to the distribution on which a classifier was trained. OOD data is often referred to as "unseen" data, since the model has not encountered it during training.

OOD detection is typically performed by training a model to distinguish between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not. This can be done with a variety of techniques, such as training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
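As a concrete illustration of the idea above, one widely used post-hoc baseline scores each input by its maximum softmax probability (MSP): confident predictions are treated as in-distribution, while near-uniform predictions are flagged as OOD. The sketch below is a minimal, hedged example; the threshold value and the toy logits are illustrative assumptions, not part of any particular paper's method.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def msp_score(logits):
    # Maximum softmax probability: higher means "more in-distribution".
    return max(softmax(logits))

def is_ood(logits, threshold=0.7):
    # Flag an input as OOD when its MSP falls below the threshold.
    # The threshold is a tunable assumption, typically set on held-out ID data.
    return msp_score(logits) < threshold

# A confident, ID-like prediction vs. a near-uniform, OOD-like one.
print(is_ood([8.0, 0.5, 0.2]))  # False (confident -> in-distribution)
print(is_ood([1.0, 1.1, 0.9]))  # True  (uncertain -> out-of-distribution)
```

In practice the threshold is chosen to meet a target false-positive rate on in-distribution validation data; many of the papers listed below replace the MSP score with alternatives such as energy-based or distance-based scores while keeping this same thresholding scheme.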

Papers

Showing 511–520 of 629 papers

Title | Status | Hype
Multi-Label Out-of-Distribution Detection with Spectral Normalized Joint Energy | Code | 0
A Functional Data Perspective and Baseline On Multi-Layer Out-of-Distribution Detection | Code | 0
Learning by Erasing: Conditional Entropy based Transferable Out-Of-Distribution Detection | Code | 0
Detecting semantic anomalies | Code | 0
SELFOOD: Self-Supervised Out-Of-Distribution Detection via Learning to Rank | Code | 0
Self-Supervised Anomaly Detection by Self-Distillation and Negative Sampling | Code | 0
Detecting Out-of-Distribution Through the Lens of Neural Collapse | Code | 0
Layer Adaptive Deep Neural Networks for Out-of-distribution Detection | Code | 0
NCDD: Nearest Centroid Distance Deficit for Out-Of-Distribution Detection in Gastrointestinal Vision | Code | 0
Large Class Separation is not what you need for Relational Reasoning-based OOD Detection | Code | 0

No leaderboard results yet.