SOTAVerified

Classifier calibration

Confidence calibration – the problem of predicting probability estimates representative of the true correctness likelihood – is important for classification models in many applications. Two common calibration metrics are Expected Calibration Error (ECE) and Maximum Calibration Error (MCE).
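The two metrics can be sketched as follows: predictions are grouped into confidence bins, and the gap between average confidence and accuracy is averaged over bins (weighted by bin size) for ECE, or maximized over bins for MCE. This is a minimal illustrative implementation; the bin count and equal-width binning scheme are common choices, not specified by this page.

```python
def calibration_errors(confidences, correct, n_bins=10):
    """Return (ECE, MCE) given per-sample confidences in [0, 1]
    and 0/1 correctness indicators.

    ECE: bin-size-weighted average of |accuracy - avg confidence| per bin.
    MCE: maximum of that gap over all non-empty bins.
    """
    n = len(confidences)
    ece, mce = 0.0, 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        # Samples falling in the half-open bin (lo, hi]; 0.0 goes to bin 0.
        idx = [i for i, c in enumerate(confidences)
               if lo < c <= hi or (b == 0 and c == 0.0)]
        if not idx:
            continue
        avg_conf = sum(confidences[i] for i in idx) / len(idx)
        accuracy = sum(correct[i] for i in idx) / len(idx)
        gap = abs(accuracy - avg_conf)
        ece += (len(idx) / n) * gap  # weighted average over bins
        mce = max(mce, gap)          # worst-case bin
    return ece, mce
```

For example, a model that predicts 95% confidence but is right only 3 times out of 4 has a gap of 0.20 in that bin, so both ECE and MCE are 0.20.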

Papers

Showing 21–29 of 29 papers

| Title | Status | Hype |
| --- | --- | --- |
| Long-tailed Medical Diagnosis with Relation-aware Representation Learning and Iterative Classifier Calibration | Code | 0 |
| Hidden Heterogeneity: When to Choose Similarity-Based Calibration | Code | 0 |
| Expeditious Saliency-guided Mix-up through Random Gradient Thresholding | Code | 0 |
| Enhancing Generalized Few-Shot Semantic Segmentation via Effective Knowledge Transfer | Code | 0 |
| Class-wise and reduced calibration methods | Code | 0 |
| No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data | Code | 0 |
| Packed-Ensembles for Efficient Uncertainty Estimation | Code | 0 |
| Classifier Calibration: with application to threat scores in cybersecurity | Code | 0 |
| Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex | Code | 0 |

No leaderboard results yet.