Blending Complementary Memory Systems in Hybrid Quadratic-Linear Transformers

31 May 2025
Kazuki Irie, Morris Yau, Samuel J. Gershman
arXiv:2506.00744 (abs / PDF / HTML) · GitHub (12★)

Papers citing "Blending Complementary Memory Systems in Hybrid Quadratic-Linear Transformers"

3 of 3 papers shown.

Artificial Hippocampus Networks for Efficient Long-Context Modeling
Yunhao Fang, Weihao Yu, Shu Zhong, Qinghao Ye, Xuehan Xiong, Lai Wei
08 Oct 2025

The End of Transformers? On Challenging Attention and the Rise of Sub-Quadratic Architectures
Alexander Fichtl, Jeremias Bohn, Josefin Kelber, Edoardo Mosca, Georg Groh
06 Oct 2025

Fast weight programming and linear transformers: from machine learning to neurobiology
Kazuki Irie, Samuel J. Gershman
11 Aug 2025