SOTAVerified

Classifier calibration

Confidence calibration – the problem of producing probability estimates that reflect the true likelihood of correctness – is important for classification models in many applications. The two most common calibration metrics are Expected Calibration Error (ECE), the population-weighted average gap between confidence and accuracy across confidence bins, and Maximum Calibration Error (MCE), the largest such gap over any bin.
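The two metrics above are usually estimated by binning predictions by confidence. A minimal sketch, assuming equal-width bins; the function name and the default bin count of 15 are illustrative, not part of any particular library:

```python
import numpy as np

def calibration_errors(confidences, correct, n_bins=15):
    """Estimate ECE and MCE with equal-width confidence bins.

    confidences: predicted probability of the predicted class, shape (N,)
    correct: 1 if the prediction was right, else 0, shape (N,)
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece, mce = 0.0, 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if not in_bin.any():
            continue
        # gap between mean accuracy and mean confidence in this bin
        gap = abs(correct[in_bin].mean() - confidences[in_bin].mean())
        ece += in_bin.mean() * gap   # ECE: weighted by bin population
        mce = max(mce, gap)          # MCE: worst-case bin gap
    return ece, mce
```

For a perfectly calibrated batch (e.g. confidence 0.9 and 90% accuracy) both errors are zero; an overconfident model pushes ECE up on average and MCE in its worst bin.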

Papers

Showing 21–29 of 29 papers

Title | Status | Hype
Hidden Heterogeneity: When to Choose Similarity-Based Calibration | Code | 0
Classifier Calibration: A survey on how to assess and improve predicted class probabilities | - | 0
No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data | Code | 0
Classifier Calibration: with application to threat scores in cybersecurity | Code | 0
Better Classifier Calibration for Small Data Sets | - | 0
High Frequency Residual Learning for Multi-Scale Image Classification | - | 0
Binary Classifier Calibration using an Ensemble of Near Isotonic Regression Models | - | 0
Binary Classifier Calibration: Non-parametric approach | - | 0
Binary Classifier Calibration: Bayesian Non-Parametric Approach | - | 0
Page 3 of 3

No leaderboard results yet.