ZClassifier: Temperature Tuning and Manifold Approximation via KL Divergence on Logit Space
2025-07-14
Shim Soon Yong
- Code: github.com/ShimSoonYong/ZClassifier (official, PyTorch)
Abstract
We introduce a novel classification framework, ZClassifier, that replaces conventional deterministic logits with diagonal Gaussian-distributed logits. Our method simultaneously addresses temperature scaling and manifold approximation by minimizing the Kullback-Leibler (KL) divergence between the predicted Gaussian distributions and a unit isotropic Gaussian. This unifies uncertainty calibration and latent control in a principled probabilistic manner, enabling a natural interpretation of class confidence and geometric consistency. Experiments on CIFAR-10 show that ZClassifier improves over softmax classifiers in robustness, calibration, and latent separation.
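The abstract describes logits drawn from a predicted diagonal Gaussian, regularized by the KL divergence to a unit isotropic Gaussian. A minimal PyTorch sketch of that idea follows; the class name, layer layout, and KL weight are illustrative assumptions, not the paper's actual implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ZClassifier(nn.Module):
    """Sketch: predict a diagonal Gaussian over the logits instead of a point estimate."""
    def __init__(self, in_dim: int, num_classes: int):
        super().__init__()
        self.mu = nn.Linear(in_dim, num_classes)      # per-class logit means
        self.logvar = nn.Linear(in_dim, num_classes)  # per-class logit log-variances

    def forward(self, x):
        mu, logvar = self.mu(x), self.logvar(x)
        # Reparameterization: sample logits z ~ N(mu, diag(exp(logvar)))
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return z, mu, logvar

def kl_to_unit_gaussian(mu, logvar):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ),
    # summed over classes, averaged over the batch.
    return 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=1).mean()

# Hypothetical objective: cross-entropy on sampled logits + KL regularizer
# (the 0.1 weight is an assumed hyperparameter).
model = ZClassifier(in_dim=32, num_classes=10)
x = torch.randn(8, 32)
y = torch.randint(0, 10, (8,))
z, mu, logvar = model(x)
loss = F.cross_entropy(z, y) + 0.1 * kl_to_unit_gaussian(mu, logvar)
loss.backward()
```

Pulling the per-class variances toward 1 plays the role of temperature scaling (it bounds the logit spread), while pulling the means toward 0 shapes the latent logit manifold, which is how the two goals are unified in a single KL term.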