SOTAVerified

Out-of-Distribution (OOD) Detection

Out-of-Distribution (OOD) detection is the task of identifying instances that do not belong to the distribution on which a classifier was trained. OOD data is often referred to as "unseen" data, since the model has not encountered it during training.

OOD detection is typically performed by training a model to distinguish between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not. This can be done with a variety of techniques, such as training a separate OOD detector, thresholding the model's predictive confidence, or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
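A common confidence-based baseline is the maximum softmax probability (MSP) score of Hendrycks & Gimpel: inputs whose top softmax probability falls below a threshold are flagged as OOD. The sketch below illustrates the idea with NumPy; the logits and the threshold value are illustrative assumptions, not taken from any specific paper on this page.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    """Maximum softmax probability: higher means more ID-like."""
    return softmax(logits).max(axis=-1)

def is_ood(logits, threshold=0.5):
    """Flag inputs whose confidence falls below the threshold as OOD.

    The threshold is a hypothetical choice; in practice it is tuned
    on held-out ID data (e.g. to a target false-positive rate).
    """
    return msp_score(logits) < threshold

# Illustrative logits: a confident ID-like prediction vs. a flat,
# OOD-like prediction where no class dominates.
id_logits = np.array([[6.0, 0.5, 0.2]])
ood_logits = np.array([[1.0, 0.9, 1.1]])
```

Here `msp_score(id_logits)` is close to 1 while `msp_score(ood_logits)` is near uniform, so only the latter is flagged. Many of the papers listed below can be read as replacements for this scoring function rather than changes to the classifier itself.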

Papers

Showing 1–10 of 629 papers

| Title | Status | Hype |
| --- | --- | --- |
| MOODv2: Masked Image Modeling for Out-of-Distribution Detection | Code | 2 |
| Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback | Code | 2 |
| GalLoP: Learning Global and Local Prompts for Vision-Language Models | Code | 2 |
| Logits-Based Finetuning | Code | 2 |
| Recent Advances in OOD Detection: Problems and Approaches | Code | 2 |
| Rethinking Test-time Likelihood: The Likelihood Path Principle and Its Application to OOD Detection | Code | 2 |
| DPU: Dynamic Prototype Updating for Multimodal Out-of-Distribution Detection | Code | 2 |
| Extremely Simple Multimodal Outlier Synthesis for Out-of-Distribution Detection and Segmentation | Code | 2 |
| Learning Transferable Negative Prompts for Out-of-Distribution Detection | Code | 2 |
| Unifying Unsupervised Graph-Level Anomaly Detection and Out-of-Distribution Detection: A Benchmark | Code | 2 |

No leaderboard results yet.