Attention Reveals More Than Tokens: Training-Free Long-Context Reasoning with Attention-guided Retrieval
arXiv:2503.09819 · 12 March 2025
Yuwei Zhang, Jayanth Srinivasa, Gaowen Liu, Jingbo Shang
Tags: LRM, LLMAG, RALM

Papers citing "Attention Reveals More Than Tokens: Training-Free Long-Context Reasoning with Attention-guided Retrieval" (5 papers shown)

StreamingThinker: Large Language Models Can Think While Reading
Junlong Tong, Yingqi Fan, Anhao Zhao, Yunpu Ma, Xiaoyu Shen
Tags: RALM, LRM · 20 Oct 2025

Context Length Alone Hurts LLM Performance Despite Perfect Retrieval
Yufeng Du, Minyang Tian, S. Ronanki, Subendhu Rongali, S. Bodapati, Aram Galstyan, Azton Wells, Roy Schwartz, Eliu A. Huerta, Hao Peng
Tags: RALM, LRM · 06 Oct 2025

PromptDistill: Query-based Selective Token Retention in Intermediate Layers for Efficient Large Language Model Inference
Weisheng Jin, Maojia Song, Tej Deep Pala, Yew Ken Chia, Amir Zadeh, Chuan Li, Soujanya Poria
Tags: VLM · 30 Mar 2025

LongReason: A Synthetic Long-Context Reasoning Benchmark via Context Expansion
Zhan Ling, Kang Liu, Kai Yan, Yue Yang, Weijian Lin, Ting-Han Fan, Lingfeng Shen, Zhengyin Du, Jiecao Chen
Tags: ReLM, LRM, ELM · 25 Jan 2025

HART: Efficient Visual Generation with Hybrid Autoregressive Transformer
International Conference on Learning Representations (ICLR), 2024
Haotian Tang, Yecheng Wu, Shang Yang, Enze Xie, Junsong Chen, Junyu Chen, Zhuoyang Zhang, Han Cai, Yaojie Lu, Song Han
14 Oct 2024