ZClassifier: Temperature Tuning and Manifold Approximation via KL Divergence on Logit Space
Main: 8 pages
Appendix: 6 pages
Bibliography: 3 pages
Figures: 8
Tables: 8
Abstract
We introduce ZClassifier, a classification framework that replaces conventional deterministic logits with logits drawn from a diagonal Gaussian distribution. The method addresses temperature scaling and manifold approximation jointly by minimizing the Kullback-Leibler (KL) divergence between the predicted Gaussian logit distributions and a unit isotropic Gaussian. This unifies uncertainty calibration and latent control in a single probabilistic objective, yielding a natural interpretation of class confidence and geometric consistency. Experiments on CIFAR-10 show that ZClassifier outperforms standard softmax classifiers in robustness, calibration, and latent-space class separation.
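The construction described in the abstract admits a compact implementation. Below is a minimal sketch, assuming a PyTorch-style model; the names (ZClassifierHead, zclassifier_loss, kl_weight) are our illustration, not the paper's released code. The KL term uses the standard closed form for the divergence between a diagonal Gaussian N(mu, diag(sigma^2)) and the unit isotropic Gaussian N(0, I).

import torch
import torch.nn as nn
import torch.nn.functional as F

class ZClassifierHead(nn.Module):
    """Predicts a diagonal Gaussian over the K class logits (illustrative sketch)."""
    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.mu = nn.Linear(feat_dim, num_classes)      # per-class logit means
        self.logvar = nn.Linear(feat_dim, num_classes)  # per-class log-variances

    def forward(self, h: torch.Tensor):
        mu, logvar = self.mu(h), self.logvar(h)
        std = torch.exp(0.5 * logvar)
        z = mu + std * torch.randn_like(std)  # reparameterized logit sample
        return z, mu, logvar

def zclassifier_loss(z, mu, logvar, target, kl_weight=1.0):
    # Cross-entropy on the sampled logits, plus the closed-form KL to N(0, I):
    #   KL = 0.5 * sum(sigma^2 + mu^2 - 1 - log sigma^2)
    ce = F.cross_entropy(z, target)
    kl = 0.5 * (logvar.exp() + mu.pow(2) - 1.0 - logvar).sum(dim=1).mean()
    return ce + kl_weight * kl

On this reading of the abstract, the learned per-class variance plays the role of an input-dependent temperature, while pulling the logit distributions toward N(0, I) regularizes the logit manifold; at test time one would classify from the mean logits mu.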
