
Out of Distribution (OOD) Detection

Out of Distribution (OOD) Detection is the task of identifying test instances that do not belong to the distribution a classifier was trained on. OOD data is often referred to as "unseen" data, since the model has not encountered anything like it during training.

OOD detection is typically framed as distinguishing in-distribution (ID) data, which the model has seen during training, from OOD data, which it has not. This can be done with a variety of techniques: scoring the classifier's own predictive confidence, training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
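As a concrete illustration of confidence-based scoring, below is a minimal sketch of the maximum softmax probability (MSP) baseline, assuming a trained classifier's logits are available. The function names and the example logits are illustrative, not from any specific paper's code.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_ood_score(logits):
    """MSP baseline: score = 1 - max class probability.
    Higher scores suggest the input is more likely OOD."""
    return 1.0 - softmax(logits).max(axis=-1)

# A peaked (confident) prediction vs. a flat (uncertain) one.
id_logits = np.array([[8.0, 0.5, 0.2]])    # confident: likely in-distribution
ood_logits = np.array([[1.1, 1.0, 0.9]])   # flat: candidate OOD input

id_score = msp_ood_score(id_logits)[0]
ood_score = msp_ood_score(ood_logits)[0]

# A threshold (here 0.5, tuned on held-out data in practice) flags OOD inputs.
is_ood = ood_score > 0.5
```

The flat logits produce a much higher score than the peaked ones, so a simple threshold on the score separates the two cases; in practice the threshold is chosen on a validation set to hit a target false-positive rate.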

Papers

Showing 621–629 of 629 papers

| Title | Status | Hype |
| --- | --- | --- |
| Unsupervised Out-of-Distribution Detection by Maximum Classifier Discrepancy | Code | 0 |
| Detecting semantic anomalies | Code | 0 |
| Out-of-Distribution Detection Using Neural Rendering Generative Models | | 0 |
| Outlier Exposure with Confidence Control for Out-of-Distribution Detection | Code | 0 |
| Contextual Out-of-Domain Utterance Handling With Counterfeit Data Augmentation | Code | 0 |
| Analysis of Confident-Classifiers for Out-of-distribution Detection | Code | 0 |
| WAIC, but Why? Generative Ensembles for Robust Anomaly Detection | Code | 0 |
| Out-of-domain Detection based on Generative Adversarial Network | | 0 |
| Joint Learning of Domain Classification and Out-of-Domain Detection with Dynamic Class Weighting for Satisficing False Acceptance Rates | | 0 |

No leaderboard results yet.