EdgeTran: Co-designing Transformers for Efficient Inference on Mobile Edge Platforms

24 March 2023 · arXiv:2303.13745
Shikhar Tuli, Niraj K. Jha

Papers citing "EdgeTran: Co-designing Transformers for Efficient Inference on Mobile Edge Platforms"

Showing 4 of 4 citing papers:
TransCODE: Co-design of Transformers and Accelerators for Efficient Training and Inference
Shikhar Tuli, Niraj K. Jha
4 citations · 27 Mar 2023
Energon: Towards Efficient Acceleration of Transformers Using Dynamic Sparse Attention
Zhe Zhou, Junlong Liu, Zhenyu Gu, Guangyu Sun
39 citations · 18 Oct 2021
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
6,927 citations · 20 Apr 2018
Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu Pham, Christopher D. Manning
7,687 citations · 17 Aug 2015