arXiv: 2506.12355
QiMeng-Attention: SOTA Attention Operator is generated by SOTA Attention Algorithm
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
14 June 2025
Qirui Zhou, Shaohui Peng, Weiqiang Xiong, Haixin Chen, Yuanbo Wen, Haochen Li, Ling Li, Qi Guo, Yongwei Zhao, Ke Gao, Ruizhi Chen, Yanjun Wu, Chen Zhao, Yihao Chen
Papers citing "QiMeng-Attention: SOTA Attention Operator is generated by SOTA Attention Algorithm" (2 papers)
QiMeng-Kernel: Macro-Thinking Micro-Coding Paradigm for LLM-Based High-Performance GPU Kernel Generation
Xinguo Zhu, Shaohui Peng, Jiaming Guo, Yunji Chen, Qi Guo, ..., Qirui Zhou, Ke Gao, Yanjun Wu, Chen Zhao, Ling Li
25 Nov 2025
PRAGMA: A Profiling-Reasoned Multi-Agent Framework for Automatic Kernel Optimization
Kelun Lei, Hailong Yang, H. Zhang, Xin You, Kaige Zhang, Zhongzhi Luan, Yi Liu, Depei Qian
09 Nov 2025