ResearchTrend.AI
DistrEE: Distributed Early Exit of Deep Neural Network Inference on Edge Devices

6 February 2025
Xian Peng
Xin Wu
Lianming Xu
Li Wang
Aiguo Fei
Abstract

Distributed DNN inference is becoming increasingly important as the demand for intelligent services at the network edge grows. By leveraging the power of distributed computing, edge devices can perform complicated, resource-hungry inference tasks previously possible only on powerful servers, enabling new applications in areas such as autonomous vehicles, industrial automation, and smart homes. However, accurate and efficient distributed edge inference is challenging because device resources fluctuate and input data varies in processing difficulty. In this work, we propose DistrEE, a distributed DNN inference framework that can exit model inference early to meet specific quality-of-service requirements. The framework first integrates model early exit with distributed inference for multi-node collaborative inference scenarios, and it further designs an early exit policy that controls when model inference terminates. Extensive simulation results demonstrate that DistrEE efficiently realizes collaborative inference, achieving an effective trade-off between inference latency and accuracy.
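To illustrate the early-exit mechanism the abstract describes, here is a minimal sketch of multi-stage inference with intermediate exit heads. It assumes an entropy-based confidence test, a common choice in the early-exit literature; the paper's actual exit policy, stage partitioning, and model details are not given in this abstract and may differ.

```python
import math

def entropy(probs):
    """Shannon entropy of a class-probability distribution (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def early_exit_inference(stages, x, threshold=0.5):
    """Run model stages in order across collaborating nodes.

    Each stage is a (backbone, exit_head) pair: the backbone transforms
    the intermediate features, and the exit head maps them to a class
    distribution. Inference terminates at the first stage whose
    prediction is confident enough (entropy below the threshold),
    saving the cost of the remaining stages.

    Returns (exit_stage_index, class_probabilities).
    """
    probs = None
    for i, (backbone, exit_head) in enumerate(stages):
        x = backbone(x)          # forward through this partition
        probs = exit_head(x)     # cheap intermediate classifier
        if entropy(probs) < threshold:
            return i, probs      # confident: exit early
    return len(stages) - 1, probs  # fell through to the final exit
```

For example, with two toy stages whose first exit head already outputs a peaked distribution like `[0.95, 0.05]`, inference exits at stage 0 and never pays for stage 1; raising the threshold trades accuracy for latency, which is the knob the paper's policy tunes.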

@article{peng2025_2502.15735,
  title={DistrEE: Distributed Early Exit of Deep Neural Network Inference on Edge Devices},
  author={Xian Peng and Xin Wu and Lianming Xu and Li Wang and Aiguo Fei},
  journal={arXiv preprint arXiv:2502.15735},
  year={2025}
}