SOTAVerified

Classifier calibration

Confidence calibration – predicting probability estimates that reflect the true likelihood of correctness – is important for classification models in many applications. Two common calibration metrics are Expected Calibration Error (ECE) and Maximum Calibration Error (MCE): predictions are grouped into confidence bins, and the gap between each bin's accuracy and its mean confidence is averaged (weighted by bin size) for ECE or maximized over bins for MCE.
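As a concrete illustration, both metrics can be computed with a few lines of NumPy. This is a minimal sketch assuming equal-width confidence bins and binary correctness labels; the function name and signature are illustrative, not from any of the papers listed below.

```python
import numpy as np

def calibration_errors(confidences, correct, n_bins=10):
    """Return (ECE, MCE) over equal-width confidence bins.

    confidences: top predicted probability per sample
    correct:     1 if the prediction was right, 0 otherwise
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece, mce = 0.0, 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Samples whose confidence falls in the half-open bin (lo, hi].
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        # Gap between the bin's accuracy and its mean confidence.
        gap = abs(correct[mask].mean() - confidences[mask].mean())
        ece += mask.sum() / n * gap   # size-weighted average for ECE
        mce = max(mce, gap)           # worst bin for MCE
    return ece, mce
```

For example, ten predictions all made with confidence 0.95 of which nine are correct fall into one bin with accuracy 0.9, giving ECE = MCE = 0.05.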

Papers

Showing 21–29 of 29 papers

Title | Status | Hype
Masksembles for Uncertainty Estimation | Code | 1
How Well Do Self-Supervised Models Transfer? | Code | 1
Multivariate Confidence Calibration for Object Detection | Code | 1
Better Classifier Calibration for Small Data Sets | – | 0
High Frequency Residual Learning for Multi-Scale Image Classification | – | 0
Multi-class probabilistic classification using inductive and cross Venn-Abers predictors | Code | 1
Binary Classifier Calibration using an Ensemble of Near Isotonic Regression Models | – | 0
Binary Classifier Calibration: Non-parametric approach | – | 0
Binary Classifier Calibration: Bayesian Non-Parametric Approach | – | 0

No leaderboard results yet.