Back Attention: Understanding and Enhancing Multi-Hop Reasoning in Large Language Models
Zeping Yu, Yonatan Belinkov, Sophia Ananiadou
arXiv: 2502.10835 · 15 February 2025 · LRM
Papers citing "Back Attention: Understanding and Enhancing Multi-Hop Reasoning in Large Language Models" (5 papers):
Limit Analysis for Symbolic Multi-step Reasoning Tasks with Information Propagation Rules Based on Transformers
Tian Qin, Yuhan Chen, Zhiwei Wang, Zhi-Qin John Xu
LRM · 27 Sep 2025
AudioLens: A Closer Look at Auditory Attribute Perception of Large Audio-Language Models
Chih-Kai Yang, Neo Ho, Yi-Jyun Lee, Hung-yi Lee
AuLLM · 05 Jun 2025
Internal Chain-of-Thought: Empirical Evidence for Layer-wise Subtask Scheduling in LLMs
Zhipeng Yang, Junzhuo Li, Siyu Xia, Xuming Hu
AIFin, LRM · 20 May 2025
CaKE: Circuit-aware Editing Enables Generalizable Knowledge Learners
Yunzhi Yao, Jizhan Fang, Jia-Chen Gu, N. Zhang, Shumin Deng, Ningyu Zhang, Nanyun Peng
KELM · 20 Mar 2025
Stop Overthinking: A Survey on Efficient Reasoning for Large Language Models
Yang Sui, Yu-Neng Chuang, Guanchu Wang, Jiamu Zhang, Tianyi Zhang, ..., Andrew Wen, Shaochen Zhong, Hanjie Chen, Helen Zhou
OffRL, ReLM, LRM · 20 Mar 2025