AttentionEngine: A Versatile Framework for Efficient Attention Mechanisms on Diverse Hardware Platforms
arXiv: 2502.15349

24 February 2025
Feiyang Chen, Yu Cheng, Lei Wang, Yuqing Xia, Ziming Miao, Lingxiao Ma, Fan Yang, Jinbao Xue, Zhi Yang, M. Yang, Huajun Chen

Papers citing "AttentionEngine: A Versatile Framework for Efficient Attention Mechanisms on Diverse Hardware Platforms"

2 / 2 papers shown
PLM: Efficient Peripheral Language Models Hardware-Co-Designed for Ubiquitous Computing
Cheng Deng, Luoyang Sun, Jiwen Jiang, Yongcheng Zeng, Xinjian Wu, ..., Haoyang Li, Lei Chen, Lionel M. Ni, Ning Yang, Jun Wang
15 Mar 2025
SeerAttention: Learning Intrinsic Sparse Attention in Your LLMs
Yizhao Gao, Zhichen Zeng, Dayou Du, Shijie Cao, Hayden Kwok-Hay So, ..., Junjie Lai, Mao Yang, Ting Cao, Fan Yang, M. Yang
17 Oct 2024