Classifier calibration
Confidence calibration – the problem of producing probability estimates that reflect the true likelihood of correctness – is important for classification models in many applications. The two most common calibration metrics are Expected Calibration Error (ECE), the confidence-binned, sample-weighted average gap between accuracy and confidence, and Maximum Calibration Error (MCE), the largest such gap over any bin.
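As a rough sketch of how these metrics are typically computed, the snippet below bins predictions by confidence and compares each bin's average confidence with its empirical accuracy; the function name and the choice of 10 equal-width bins are illustrative assumptions, not a fixed standard.

```python
import numpy as np

def calibration_errors(confidences, correct, n_bins=10):
    """Estimate ECE and MCE from binned predictions.

    confidences: predicted probability of the chosen class, shape (n,)
    correct: 1 if the prediction was right, 0 otherwise, shape (n,)
    n_bins: number of equal-width confidence bins (a common but arbitrary choice)
    """
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece, mce = 0.0, 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue  # empty bins contribute nothing
        acc = correct[mask].mean()        # empirical accuracy in this bin
        conf = confidences[mask].mean()   # average confidence in this bin
        gap = abs(acc - conf)
        ece += (mask.sum() / n) * gap     # ECE: sample-weighted average gap
        mce = max(mce, gap)               # MCE: worst-case gap over bins
    return ece, mce
```

A perfectly calibrated model yields ECE = MCE = 0: in every bin, predictions made with confidence p are correct a fraction p of the time.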