
Cache Me If You Can: How Many KVs Do You Need for Effective Long-Context LMs?

20 June 2025
Adithya Bhaskar, Alexander Wettig, Tianyu Gao, Yihe Dong, Danqi Chen
ArXiv (abs) · PDF · HTML

Papers citing "Cache Me If You Can: How Many KVs Do You Need for Effective Long-Context LMs?"

3 of 3 papers shown
Which Heads Matter for Reasoning? RL-Guided KV Cache Compression
Wenjie Du, Li Jiang, Keda Tao, Xue Liu, Huan Wang
Tags: LRM
09 Oct 2025

EpiCache: Episodic KV Cache Management for Long Conversational Question Answering
Minsoo Kim, Arnav Kundu, Han-Byul Kim, Richa Dixit, Minsik Cho
22 Sep 2025

MachineLearningLM: Scaling Many-shot In-context Learning via Continued Pretraining
Haoyu Dong, Pengkun Zhang, Mingzhe Lu, Yanzhen Shen, Guolin Ke
Tags: ReLM, LRM
08 Sep 2025

Page 1 of 1