SOTAVerified

Classifier calibration

Confidence calibration – producing probability estimates that reflect the true likelihood of correctness – is important for classification models in many applications. The two most common calibration metrics are Expected Calibration Error (ECE) and Maximum Calibration Error (MCE).
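Both metrics are computed by binning predictions on confidence and comparing each bin's average confidence to its empirical accuracy: ECE is the sample-weighted average of the per-bin gaps, while MCE is the largest gap. A minimal sketch of this standard binned computation (function name and inputs are illustrative, not from any particular library):

```python
import numpy as np

def calibration_errors(confidences, correct, n_bins=10):
    """Compute ECE and MCE via equal-width confidence binning.

    confidences: max predicted probability for each sample
    correct: 1 if the prediction was correct, else 0
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece, mce = 0.0, 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # half-open bins (lo, hi]; the top bin includes confidence 1.0
        in_bin = (confidences > lo) & (confidences <= hi)
        if not in_bin.any():
            continue
        acc = correct[in_bin].mean()       # empirical accuracy in this bin
        conf = confidences[in_bin].mean()  # average confidence in this bin
        gap = abs(acc - conf)
        ece += (in_bin.sum() / n) * gap    # weight by bin population
        mce = max(mce, gap)
    return ece, mce
```

With 10 bins this matches the common setup; a well-calibrated model yields small gaps in every bin, so both ECE and MCE approach zero.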

Papers

Showing 1–10 of 29 papers

Title | Status | Hype
PrePrompt: Predictive prompting for class incremental learning | Code | 1
No Fear of Classifier Biases: Neural Collapse Inspired Federated Learning with Synthetic and Fixed Classifier | Code | 1
FedFA: Federated Learning with Feature Anchors to Align Features and Classifiers for Heterogeneous Data | Code | 1
Generalized and Incremental Few-Shot Learning by Explicit Learning and Calibration without Forgetting | Code | 1
Danish Fungi 2020 -- Not Just Another Image Recognition Dataset | Code | 1
Masksembles for Uncertainty Estimation | Code | 1
How Well Do Self-Supervised Models Transfer? | Code | 1
Multivariate Confidence Calibration for Object Detection | Code | 1
Multi-class probabilistic classification using inductive and cross Venn-Abers predictors | Code | 1
Long-tailed Medical Diagnosis with Relation-aware Representation Learning and Iterative Classifier Calibration | Code | 0

No leaderboard results yet.