arXiv:2512.02403
ESACT: An End-to-End Sparse Accelerator for Compute-Intensive Transformers via Local Similarity
2 December 2025
Hongxiang Liu
Zhifang Deng
Tong Pu
Shengli Lu
Main: 11 pages, 22 figures; Bibliography: 2 pages
Abstract
Transformers, composed of QKV generation, attention computation, and FFNs,