LongMamba: Enhancing Mamba's Long Context Capabilities via Training-Free Receptive Field Enlargement

22 April 2025 · arXiv:2504.16053
Zhifan Ye, Kejing Xia, Yonggan Fu, Xin Dong, Jihoon Hong, Xiangchi Yuan, Shizhe Diao, Jan Kautz, Pavlo Molchanov, Yingyan Lin
Topic: Mamba

Papers citing "LongMamba: Enhancing Mamba's Long Context Capabilities via Training-Free Receptive Field Enlargement"

2 papers shown

Recall with Reasoning: Chain-of-Thought Distillation for Mamba's Long-Context Memory and Extrapolation
Junyu Ma, Tianqing Fang, Z. Zhang, Hongming Zhang, Haitao Mi, Dong Yu
Topics: ReLM, RALM, LRM
06 May 2025
Shifting Long-Context LLMs Research from Input to Output
Yuhao Wu, Yushi Bai, Zhiqing Hu, Shangqing Tu, Ming Shan Hee, Juanzi Li, Roy Ka-Wei Lee
06 Mar 2025