Multi Resolution Analysis (MRA) for Approximate Self-Attention
International Conference on Machine Learning (ICML), 2022
21 July 2022
Zhanpeng Zeng, Sourav Pal, Jeffery Kline, G. Fung, Vikas Singh
Links: arXiv (abs) · PDF · HTML · GitHub (8★)

Papers citing "Multi Resolution Analysis (MRA) for Approximate Self-Attention"

5 of 5 citing papers shown.
Log-Linear Attention
Han Guo, Songlin Yang, Tarushii Goel, Eric P. Xing, Tri Dao, Yoon Kim
05 Jun 2025

LookupFFN: Making Transformers Compute-lite for CPU inference
International Conference on Machine Learning (ICML), 2024
Zhanpeng Zeng, Michael Davies, Pranav Pulijala, Karthikeyan Sankaralingam, Vikas Singh
12 Mar 2024

Fast Multipole Attention: A Scalable Multilevel Attention Mechanism for Text and Images
Yanming Kang, Giang Tran, H. Sterck
18 Oct 2023

Vcc: Scaling Transformers to 128K Tokens or More by Prioritizing Important Tokens
Neural Information Processing Systems (NeurIPS), 2023
Zhanpeng Zeng, Cole Hawkins, Min-Fong Hong, Aston Zhang, Nikolaos Pappas, Vikas Singh, Shuai Zheng
07 May 2023

Efficient Attention via Control Variates
International Conference on Learning Representations (ICLR), 2023
Lin Zheng, Jianbo Yuan, Chong-Jun Wang, Lingpeng Kong
09 Feb 2023