MobiRNN: Efficient Recurrent Neural Network Execution on Mobile GPU

Papers citing "MobiRNN: Efficient Recurrent Neural Network Execution on Mobile GPU"

EdgeMoE: Empowering Sparse Large Language Models on Mobile Devices. IEEE Transactions on Mobile Computing (IEEE TMC), 2023. Published 28 Aug 2023.
Boosting DNN Cold Inference on Edge Devices. ACM SIGMOBILE International Conference on Mobile Systems, Applications, and Services (MobiSys), 2022. Published 15 Jun 2022.
Edge AI: On-Demand Accelerating Deep Neural Network Inference via Edge Computing. IEEE Transactions on Wireless Communications (TWC), 2019. Published 04 Oct 2019.
Deep Learning in Mobile and Wireless Networking: A Survey. IEEE Communications Surveys and Tutorials (COMST), 2018. Published 12 Mar 2018.