SOTAVerified

Classifier calibration

Confidence calibration – the problem of predicting probability estimates that reflect the true likelihood of correctness – is important for classification models in many applications. Two common calibration metrics are Expected Calibration Error (ECE) and Maximum Calibration Error (MCE).
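Both metrics compare average confidence against empirical accuracy within confidence bins: ECE is the sample-weighted average gap, MCE the worst-bin gap. A minimal sketch of this computation, assuming top-label confidences, equal-width bins, and an illustrative function name (`calibration_errors` and the bin count are choices here, not from the page):

```python
# Sketch of Expected / Maximum Calibration Error with equal-width
# confidence bins. Bin count and binning scheme are free parameters;
# equal-width binning is one common convention.
from typing import Sequence, Tuple


def calibration_errors(confidences: Sequence[float],
                       correct: Sequence[bool],
                       n_bins: int = 10) -> Tuple[float, float]:
    """Return (ECE, MCE) from top-label confidences and correctness flags."""
    n = len(confidences)
    ece, mce = 0.0, 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        # Half-open bins (lo, hi]; the first bin also includes exactly 0.0.
        members = [i for i, c in enumerate(confidences)
                   if lo < c <= hi or (b == 0 and c == lo)]
        if not members:
            continue
        conf = sum(confidences[i] for i in members) / len(members)
        acc = sum(1.0 for i in members if correct[i]) / len(members)
        gap = abs(acc - conf)
        ece += len(members) / n * gap   # ECE: gap weighted by bin mass
        mce = max(mce, gap)             # MCE: largest gap over all bins
    return ece, mce
```

For example, four predictions all made with confidence 0.95 but only three of them correct land in one bin with accuracy 0.75, giving ECE = MCE = 0.2.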

Papers

Showing 1–10 of 29 papers

Title | Status | Hype
PrePrompt: Predictive prompting for class incremental learning | Code | 1
Long-tailed Medical Diagnosis with Relation-aware Representation Learning and Iterative Classifier Calibration | Code | 0
FedSA: A Unified Representation Learning via Semantic Anchors for Prototype-based Federated Learning | — | 0
Enhancing Generalized Few-Shot Semantic Segmentation via Effective Knowledge Transfer | Code | 0
Improved User Identification through Calibrated Monte-Carlo Dropout | Code | 0
Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex | Code | 0
Decoupling Decision-Making in Fraud Prevention through Classifier Calibration for Business Logic Action | — | 0
Classifier Calibration with ROC-Regularized Isotonic Regression | — | 0
No Fear of Classifier Biases: Neural Collapse Inspired Federated Learning with Synthetic and Fixed Classifier | Code | 1
Expeditious Saliency-guided Mix-up through Random Gradient Thresholding | Code | 0
Page 1 of 3

No leaderboard results yet.