SOTAVerified

Classifier calibration

Confidence calibration – the problem of producing probability estimates that reflect the true likelihood of correctness – is important for classification models in many applications. The two most common calibration metrics are Expected Calibration Error (ECE) and Maximum Calibration Error (MCE): both bin predictions by confidence and compare each bin's average confidence with its empirical accuracy. ECE averages the per-bin gaps weighted by bin population, while MCE reports the single largest gap.
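The binned computation can be sketched as follows; this is a minimal illustration, not code from any paper listed below, and the function name `calibration_errors` and the equal-width 15-bin default are assumptions:

```python
import numpy as np

def calibration_errors(confidences, correct, n_bins=15):
    """Expected (ECE) and Maximum (MCE) Calibration Error over equal-width bins.

    confidences: predicted probability of the chosen class, shape (N,)
    correct:     1 if the prediction was right, 0 otherwise, shape (N,)
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece, mce = 0.0, 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Bin (lo, hi]; argmax confidences are always > 0, so the left edge is safe to exclude.
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        # Gap between empirical accuracy and average confidence in this bin.
        gap = abs(correct[mask].mean() - confidences[mask].mean())
        ece += mask.mean() * gap  # weighted by the fraction of samples in the bin
        mce = max(mce, gap)       # worst-case bin
    return ece, mce
```

For example, ten predictions all made with confidence 0.9 but only 50% accuracy give ECE = MCE = 0.4, while predictions whose confidence matches their accuracy give errors near zero.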

Papers

Showing 21–29 of 29 papers

Title | Status | Hype
Better Classifier Calibration for Small Data Sets | | 0
Binary Classifier Calibration: Bayesian Non-Parametric Approach | | 0
Binary Classifier Calibration: Non-parametric approach | | 0
Binary Classifier Calibration using an Ensemble of Near Isotonic Regression Models | | 0
Classifier Calibration: A survey on how to assess and improve predicted class probabilities | | 0
Classifier Calibration with ROC-Regularized Isotonic Regression | | 0
Decoupling Decision-Making in Fraud Prevention through Classifier Calibration for Business Logic Action | | 0
FedSA: A Unified Representation Learning via Semantic Anchors for Prototype-based Federated Learning | | 0
High Frequency Residual Learning for Multi-Scale Image Classification | | 0
Page 3 of 3

No leaderboard results yet.