FVR: Feature variance reduction for post-hoc network calibration
- Authors
- Noh, Jongyoun; Park, Hyekang; Ham, Bumsub
- Issue Date
- 2026-05
- Publisher
- Pergamon Press
- Citation
- Pattern Recognition, v.173
- Abstract
- Adjusting miscalibrated probabilities in deep neural networks is important for deploying them in real-world applications. Train-time calibration methods typically outperform post-hoc approaches, but at the cost of training the networks from scratch. We present in this paper a surprisingly simple yet highly effective post-hoc calibration approach using a pruning technique. We have found that neural networks calibrated with train-time methods exhibit significantly smaller feature variance and skewness. This suggests that poorly calibrated networks tend to have a few extremely high activations, which primarily contribute to overconfident predictions. To address this, we propose to eliminate the overly activated features, reducing the feature variance and skewness of the networks. We also introduce feature adjustment techniques for the pruned features to improve network accuracy after pruning. Extensive experimental results on standard benchmarks demonstrate that our post-hoc approach to network calibration outperforms current post-hoc methods by a large margin. We also show that it provides results competitive with train-time calibration methods, while retaining the efficiency of post-hoc approaches. (An illustrative sketch of the feature-pruning idea follows the record details below.)
- Keywords
- Post-hoc calibration; Network calibration; Feature pruning
- ISSN
- 0031-3203
- URI
- https://pubs.kist.re.kr/handle/201004/154163
- DOI
- 10.1016/j.patcog.2025.112913
- Appears in Collections:
- KIST Article > 2026
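Below is a minimal, illustrative sketch of the feature-pruning idea described in the abstract: it zeroes out the largest activations of each penultimate feature vector so that the per-sample variance and skewness drop. The function name `prune_top_activations`, the parameter `num_pruned`, and the NumPy-based setup are assumptions for illustration only, not the authors' implementation, which also includes feature adjustment steps not shown here.

```python
# Hypothetical sketch: suppress the largest activations of each feature vector
# to reduce its variance and skewness, per the idea described in the abstract.
import numpy as np

def prune_top_activations(features: np.ndarray, num_pruned: int = 8) -> np.ndarray:
    """Set the `num_pruned` highest activations of each feature vector to zero.

    `features` is assumed to have shape [batch, feature_dim].
    """
    pruned = features.copy()
    # Indices of the largest activations per sample (ascending sort, take the tail).
    top_idx = np.argsort(pruned, axis=1)[:, -num_pruned:]
    np.put_along_axis(pruned, top_idx, 0.0, axis=1)
    return pruned

# Toy usage: heavy-tailed "overly activated" features show reduced variance after pruning.
rng = np.random.default_rng(0)
feats = np.abs(rng.normal(size=(4, 512))) ** 3
print(feats.var(axis=1).mean(), prune_top_activations(feats).var(axis=1).mean())
```

In a real pipeline such a step would presumably be applied to the frozen penultimate features of a pretrained classifier before the final linear layer, leaving the trained weights untouched, which is what keeps the method post-hoc.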