
Out of Distribution (OOD) Detection

Out of Distribution (OOD) Detection is the task of identifying instances that do not belong to the distribution the classifier was trained on. OOD data is often referred to as "unseen" data, as the model has not encountered it during training.

OOD detection is typically performed by training a model to distinguish between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not seen. This can be done using a variety of techniques, such as training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD data.
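As a concrete illustration of sensitivity to OOD inputs, a common and simple baseline (not tied to any specific paper listed below) is Maximum Softmax Probability (MSP): a trained classifier's low maximum class probability is taken as a sign that the input is out of distribution. The sketch below is a minimal, framework-free illustration of that idea; the function names and example logits are hypothetical.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def msp_ood_score(logits):
    """Maximum Softmax Probability baseline: a low maximum class
    probability (i.e. a high score here) suggests the input is OOD."""
    return 1.0 - max(softmax(logits))

# A confident (ID-like) prediction vs. a flat (OOD-like) one.
confident = [9.0, 1.0, 0.5]   # hypothetical logits for an ID input
flat = [1.1, 1.0, 0.9]        # hypothetical logits for an OOD input
print(msp_ood_score(confident) < msp_ood_score(flat))  # True
```

In practice, a threshold on this score (chosen on held-out ID data) separates inputs the model should accept from those it should flag as OOD; many of the papers below refine or replace this scoring function.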

Papers

Showing 481–490 of 629 papers

Title (Status / Hype)

- Revealing the Distributional Vulnerability of Discriminators by Implicit Generators (Code / 0)
- Revisiting Deep Ensemble for Out-of-Distribution Detection: A Loss Landscape Perspective (Code / 0)
- Revisiting Energy-Based Model for Out-of-Distribution Detection (Code / 0)
- TrustGAN: Training safe and trustworthy deep learning models through generative adversarial networks (Code / 0)
- Building One-class Detector for Anything: Open-vocabulary Zero-shot OOD Detection Using Text-image Models (Code / 0)
- Revisiting Likelihood-Based Out-of-Distribution Detection by Modeling Representations (Code / 0)
- Boosting Out-of-Distribution Detection with Multiple Pre-trained Models (Code / 0)
- A Novel Data Augmentation Technique for Out-of-Distribution Sample Detection using Compounded Corruptions (Code / 0)
- Zero-Shot Out-of-Distribution Detection with Feature Correlations (Code / 0)
- Revisit Overconfidence for OOD Detection: Reassigned Contrastive Learning with Adaptive Class-dependent Threshold (Code / 0)
Page 49 of 63

No leaderboard results yet.