Reasoning by Superposition: A Theoretical Perspective on Chain of Continuous Thought

18 May 2025
Hanlin Zhu, Shibo Hao, Zhiting Hu, Jiantao Jiao, Stuart Russell, Yuandong Tian
Topics: OffRL, LRM
ArXiv (abs) · PDF · HTML

Papers citing "Reasoning by Superposition: A Theoretical Perspective on Chain of Continuous Thought"
4 of 4 papers shown

A Little Depth Goes a Long Way: The Expressive Power of Log-Depth Transformers
William Merrill, Ashish Sabharwal
05 Mar 2025

Spectral Journey: How Transformers Predict the Shortest Path
Andrew Cohen, Andrey Gromov, Kaiyu Yang, Yuandong Tian
12 Feb 2025

GSM-Infinite: How Do Your LLMs Behave over Infinitely Increasing Context Length and Reasoning Complexity? (LRM)
Yang Zhou, Hongyi Liu, Zhuoming Chen, Yuandong Tian, Beidi Chen
07 Feb 2025

Token Assorted: Mixing Latent and Text Tokens for Improved Language Model Reasoning (LRM)
DiJia Su, Hanlin Zhu, Yingchen Xu, Jiantao Jiao, Yuandong Tian, Qinqing Zheng
05 Feb 2025