
Classifier calibration

Confidence calibration – producing probability estimates that match the true likelihood of correctness – is important for classification models in many applications. The two most common calibration metrics are the Expected Calibration Error (ECE) and the Maximum Calibration Error (MCE).
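Both metrics bin predictions by confidence and compare each bin's average confidence to its empirical accuracy: ECE is the sample-weighted average of the per-bin gap, MCE the largest single gap. Below is a minimal NumPy sketch under those standard definitions (the function name ece_mce, the 15-bin equal-width default, and the toy inputs are illustrative assumptions, not taken from any paper on this page):

    import numpy as np

    def ece_mce(conf, correct, n_bins=15):
        """Expected and Maximum Calibration Error over equal-width confidence bins."""
        conf = np.asarray(conf, dtype=float)        # max-softmax confidence per sample
        correct = np.asarray(correct, dtype=float)  # 1.0 if the predicted class was right
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        # Map each confidence to a bin index; clip so conf == 1.0 lands in the last bin.
        idx = np.clip(np.digitize(conf, edges) - 1, 0, n_bins - 1)
        ece, mce = 0.0, 0.0
        for b in range(n_bins):
            mask = idx == b
            if not mask.any():
                continue
            gap = abs(correct[mask].mean() - conf[mask].mean())  # |accuracy - confidence|
            ece += mask.mean() * gap  # weight the gap by the bin's share of samples
            mce = max(mce, gap)
        return ece, mce

    # Example: ece, mce = ece_mce([0.92, 0.85, 0.70, 0.55], [1, 1, 0, 1])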

Papers

Showing 11–20 of 29 papers (page 2 of 3)

Title | Status | Hype
FedFA: Federated Learning with Feature Anchors to Align Features and Classifiers for Heterogeneous Data | Code | 1
Packed-Ensembles for Efficient Uncertainty Estimation | Code | 0
Class-wise and reduced calibration methods | Code | 0
What is Your Metric Telling You? Evaluating Classifier Calibration under Context-Specific Definitions of Reliability | - | 0
Hidden Heterogeneity: When to Choose Similarity-Based Calibration | Code | 0
Classifier Calibration: A survey on how to assess and improve predicted class probabilities | - | 0
Generalized and Incremental Few-Shot Learning by Explicit Learning and Calibration without Forgetting | Code | 1
No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data | Code | 0
Danish Fungi 2020 -- Not Just Another Image Recognition Dataset | Code | 1
Classifier Calibration: with application to threat scores in cybersecurity | Code | 0

No leaderboard results yet.