SOTAVerified

Classifier calibration

Confidence calibration – predicting probability estimates that reflect the true likelihood of correctness – is important for classification models in many applications. Two common calibration metrics are Expected Calibration Error (ECE) and Maximum Calibration Error (MCE).
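As an illustration, both metrics can be estimated with the standard equal-width binning scheme: predictions are grouped into confidence bins, and the gap between each bin's average confidence and its accuracy is aggregated. The sketch below assumes inputs of top-class confidences and 0/1 correctness labels; it is a generic implementation, not code from any of the papers listed here.

```python
import numpy as np

def calibration_errors(confidences, correct, n_bins=10):
    """Estimate ECE and MCE with equal-width confidence bins.

    ECE is the bin-size-weighted average of |accuracy - confidence|
    per bin; MCE is the worst single-bin gap.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece, mce = 0.0, 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue  # skip empty bins
        gap = abs(correct[mask].mean() - confidences[mask].mean())
        ece += (mask.sum() / n) * gap  # weighted average gap
        mce = max(mce, gap)            # worst-bin gap
    return ece, mce
```

For example, a model that predicts 0.95 confidence on samples it always gets right has a gap of 0.05 in its single occupied bin, so ECE = MCE = 0.05; a perfectly calibrated model scores zero on both.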

Papers

Showing 26–29 of 29 papers

Title | Status | Hype
No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data | Code | 0
Packed-Ensembles for Efficient Uncertainty Estimation | Code | 0
Classifier Calibration: with application to threat scores in cybersecurity | Code | 0
Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex | Code | 0
Page 2 of 2

No leaderboard results yet.