QiMeng-Attention: SOTA Attention Operator is generated by SOTA Attention Algorithm

Annual Meeting of the Association for Computational Linguistics (ACL), 2025
14 June 2025
Qirui Zhou, Shaohui Peng, Weiqiang Xiong, Haixin Chen, Yuanbo Wen, Haochen Li, Ling Li, Qi Guo, Yongwei Zhao, Ke Gao, Ruizhi Chen, Yanjun Wu, Chen Zhao, Yihao Chen
Community: LRM
Links: arXiv (abs) · PDF · HTML

Papers citing "QiMeng-Attention: SOTA Attention Operator is generated by SOTA Attention Algorithm"

2 / 2 papers shown
QiMeng-Kernel: Macro-Thinking Micro-Coding Paradigm for LLM-Based High-Performance GPU Kernel Generation
Xinguo Zhu, Shaohui Peng, Jiaming Guo, Yunji Chen, Qi Guo, ..., Qirui Zhou, Ke Gao, Yanjun Wu, Chen Zhao, Ling Li
25 Nov 2025

PRAGMA: A Profiling-Reasoned Multi-Agent Framework for Automatic Kernel Optimization
Kelun Lei, Hailong Yang, H. Zhang, Xin You, Kaige Zhang, Zhongzhi Luan, Yi Liu, Depei Qian
09 Nov 2025