BEEM: Boosting Performance of Early Exit DNNs using Multi-Exit Classifiers as Experts

Abstract

Early Exit (EE) techniques have emerged as a means to reduce inference latency in Deep Neural Networks (DNNs). The latency improvement and accuracy of these techniques crucially depend on the criteria used to make exit decisions. We propose a new decision criterion, BEEM, in which exit classifiers are treated as experts and their confidence scores are aggregated. The confidence scores are aggregated only if neighbouring experts are consistent in their predictions as a sample passes through them, thus capturing their ensemble effect. A sample exits when the aggregated confidence value exceeds a threshold. The threshold is set using the error rates of the intermediate exits, aiming to surpass the performance of conventional DNN inference. Experimental results on the COCO dataset for image captioning and the GLUE datasets for various language tasks demonstrate that our method enhances the performance of state-of-the-art EE methods, achieving speed-ups by factors of 1.5x to 2.1x. Compared to the final layer, its accuracy is comparable on the harder image-captioning task and improves on the easier language tasks. The source code for this work is publicly available at this https URL
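The exit rule described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the reset-on-disagreement behaviour, and the plain scalar threshold (which the paper instead derives from intermediate-exit error rates) are assumptions for exposition.

```python
def beem_exit_decision(confidences, predictions, threshold):
    """Decide at which exit a sample leaves the network.

    confidences: per-exit confidence scores for the predicted class
    predictions: per-exit predicted labels
    threshold:   exit threshold (the paper sets it from intermediate-exit
                 error rates; here it is a plain parameter, an assumption)

    Returns the index of the exit taken, or None if the sample falls
    through to the final layer.
    """
    aggregated = 0.0
    for i, (conf, pred) in enumerate(zip(confidences, predictions)):
        if i > 0 and pred != predictions[i - 1]:
            # Neighbouring experts disagree: discard the accumulated
            # evidence and restart from the current exit's confidence.
            aggregated = conf
        else:
            # Consistent neighbours: accumulate confidence, capturing
            # the ensemble effect of the exits seen so far.
            aggregated += conf
        if aggregated > threshold:
            return i  # early exit fires here
    return None  # no early exit; use the final layer
```

For example, with consistent predictions `[1, 1, 1]` and confidences `[0.4, 0.5, 0.6]` against a threshold of 0.8, the aggregate reaches 0.9 at the second exit, so the sample exits there without reaching the final layer.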

@article{bajpai2025_2502.00745,
  title={BEEM: Boosting Performance of Early Exit DNNs using Multi-Exit Classifiers as Experts},
  author={Divya Jyoti Bajpai and Manjesh Kumar Hanawal},
  journal={arXiv preprint arXiv:2502.00745},
  year={2025}
}