
On the Power of Convolution Augmented Transformer
arXiv:2407.05591 · 8 July 2024
Mingchen Li, Xuechen Zhang, Yixiao Huang, Samet Oymak

Papers citing "On the Power of Convolution Augmented Transformer"

5 papers shown:

1. Revisiting associative recall in modern recurrent models
   Destiny Okpekpe, Antonio Orvieto
   26 Aug 2025

2. Understanding Input Selectivity in Mamba: Impact on Approximation Power, Memorization, and Associative Recall Capacity
   Ningyuan Huang, Miguel Sarabia, Abhinav Moudgil, P. Rodríguez, Luca Zappella, Federico Danieli
   13 Jun 2025

3. Extrapolation by Association: Length Generalization Transfer in Transformers
   Ziyang Cai, Nayoung Lee, Avi Schwarzschild, Samet Oymak, Dimitris Papailiopoulos
   10 Jun 2025

4. Mechanistic evaluation of Transformers and state space models
   Aryaman Arora, Neil Rathi, Nikil Roashan Selvam, Róbert Csordás, Dan Jurafsky, Christopher Potts
   21 May 2025

5. Samba: Simple Hybrid State Space Models for Efficient Unlimited Context Language Modeling
   Liliang Ren, Yang Liu, Yadong Lu, Haoran Pan, Chen Liang, Weizhu Chen
   11 Jun 2024