SOTAVerified

Classifier calibration

Confidence calibration – predicting probability estimates that are representative of the true correctness likelihood – is important for classification models in many applications. Two common calibration metrics are the Expected Calibration Error (ECE) and the Maximum Calibration Error (MCE).
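Both metrics partition predictions into confidence bins and compare each bin's average confidence with its accuracy: ECE is the population-weighted average of the gaps, MCE the largest gap. A minimal NumPy sketch (the equal-width binning and function name are illustrative; papers differ on bin boundaries and weighting):

```python
import numpy as np

def ece_mce(confidences, correct, n_bins=10):
    """Expected and Maximum Calibration Error over equal-width confidence bins.

    confidences: predicted probability of the predicted class, shape (N,)
    correct: 1 if the prediction was right, else 0, shape (N,)
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece, mce = 0.0, 0.0
    for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
        # half-open bins (lo, hi]; the first bin also includes 0.0
        mask = (confidences > lo) & (confidences <= hi)
        if i == 0:
            mask |= confidences == lo
        if not mask.any():
            continue
        gap = abs(correct[mask].mean() - confidences[mask].mean())
        ece += mask.mean() * gap   # weight gap by fraction of samples in bin
        mce = max(mce, gap)        # worst-case bin gap
    return ece, mce
```

For example, ten predictions at confidence 0.8 of which eight are correct give ECE = MCE = 0 (perfectly calibrated), while ten predictions at 0.95 with only five correct give a 0.45 gap in a single bin, so ECE = MCE = 0.45.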

Papers

Showing 1–10 of 29 papers

Title | Status | Hype
Multivariate Confidence Calibration for Object Detection | Code | 1
PrePrompt: Predictive prompting for class incremental learning | Code | 1
Masksembles for Uncertainty Estimation | Code | 1
Multi-class probabilistic classification using inductive and cross Venn-Abers predictors | Code | 1
Generalized and Incremental Few-Shot Learning by Explicit Learning and Calibration without Forgetting | Code | 1
No Fear of Classifier Biases: Neural Collapse Inspired Federated Learning with Synthetic and Fixed Classifier | Code | 1
FedFA: Federated Learning with Feature Anchors to Align Features and Classifiers for Heterogeneous Data | Code | 1
How Well Do Self-Supervised Models Transfer? | Code | 1
Danish Fungi 2020 -- Not Just Another Image Recognition Dataset | Code | 1
Classifier Calibration: with application to threat scores in cybersecurity | Code | 0

No leaderboard results yet.