
Asymmetric Duos: Sidekicks Improve Uncertainty

Main: 9 pages
14 figures
Bibliography: 4 pages
Appendix: 11 pages
Abstract

The go-to strategy for applying deep networks in settings where uncertainty informs decisions--ensembling multiple training runs with random initializations--is ill-suited to the extremely large-scale models and practical fine-tuning workflows of today. We introduce a new cost-effective strategy for improving the uncertainty quantification and downstream decisions of a large model (e.g. a fine-tuned ViT-B): coupling it with a less accurate but much smaller "sidekick" (e.g. a fine-tuned ResNet-34) at a fraction of the computational cost. We propose aggregating the predictions of this Asymmetric Duo by simple learned weighted averaging. Surprisingly, despite their inherent asymmetry, the sidekick model almost never harms the performance of the larger model. In fact, across five image classification benchmarks and a variety of model architectures and training schemes (including soups), Asymmetric Duos significantly improve accuracy, uncertainty quantification, and selective classification metrics with only ~10-20% more computation.
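
To make the aggregation idea concrete, the sketch below shows one way to combine a large model and a sidekick with a single learned mixing weight over their predictive distributions. This is a minimal illustration, not the paper's exact parameterization: the sigmoid-constrained scalar weight, its initialization, and the helper names (AsymmetricDuo, fit_weight) are assumptions for illustration.

import torch
import torch.nn as nn

class AsymmetricDuo(nn.Module):
    """Weighted average of a large model's and a smaller sidekick's predictions.

    Minimal sketch (assumed parameterization): a single scalar mixing weight is
    learned, e.g. on held-out data, while both backbone models stay frozen.
    """

    def __init__(self, large_model: nn.Module, sidekick: nn.Module):
        super().__init__()
        self.large_model = large_model
        self.sidekick = sidekick
        # Unconstrained scalar; a sigmoid keeps the mixing weight in (0, 1).
        # Initialized so the large model starts with most of the weight.
        self.raw_weight = nn.Parameter(torch.tensor(2.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.sigmoid(self.raw_weight)
        p_large = self.large_model(x).softmax(dim=-1)
        p_small = self.sidekick(x).softmax(dim=-1)
        # Convex combination of the two predictive distributions.
        return w * p_large + (1.0 - w) * p_small

def fit_weight(duo: AsymmetricDuo, val_loader, steps: int = 100, lr: float = 0.1):
    """Fit only the scalar mixing weight by minimizing NLL on held-out data."""
    opt = torch.optim.Adam([duo.raw_weight], lr=lr)
    for _ in range(steps):
        for x, y in val_loader:
            probs = duo(x)
            loss = nn.functional.nll_loss(torch.log(probs + 1e-12), y)
            opt.zero_grad()
            loss.backward()
            opt.step()

Because only the scalar weight is optimized, fitting is cheap and the frozen backbones are never updated; at test time the duo costs one forward pass through each model.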

@article{zhou2025_2505.18636,
  title={Asymmetric Duos: Sidekicks Improve Uncertainty},
  author={Tim G. Zhou and Evan Shelhamer and Geoff Pleiss},
  journal={arXiv preprint arXiv:2505.18636},
  year={2025}
}