Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers

4 February 2025
Alireza Amiri
Xinting Huang
Mark Rofin
Michael Hahn
    LRM
Abstract

Chain-of-thought reasoning and scratchpads have emerged as critical tools for enhancing the computational capabilities of transformers. While theoretical results show that polynomial-length scratchpads can extend transformers' expressivity from TC^0 to PTIME, their required length remains poorly understood. Empirical evidence suggests that transformers need scratchpads even for many problems in TC^0, such as Parity or Multiplication, challenging optimistic bounds derived from circuit complexity. In this work, we initiate the study of systematic lower bounds on the number of CoT steps across different algorithmic problems, in the hard-attention regime. We study a variety of algorithmic problems and provide bounds that are tight up to logarithmic factors. Overall, these results contribute to the emerging understanding of the power and limitations of chain-of-thought reasoning.
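
As a loose illustration of the scratchpad idea for Parity (a hedged sketch, not taken from the paper or tied to its constructions), the snippet below emits the running parity one step at a time: each step depends only on the previous scratchpad token and the next input bit, so the trace length grows linearly with the input.

def parity_with_scratchpad(bits):
    """Return (answer, scratchpad) for the parity of a bit sequence.

    The scratchpad records the running parity after each input bit,
    mimicking a chain-of-thought trace with one step per position.
    This is an illustrative toy, not the paper's formal model.
    """
    scratchpad = []
    running = 0
    for b in bits:
        running ^= b                # flip the running parity on each 1-bit
        scratchpad.append(running)  # one CoT step per input position
    return running, scratchpad


if __name__ == "__main__":
    answer, trace = parity_with_scratchpad([1, 0, 1, 1, 0])
    print("parity:", answer)        # -> 1
    print("scratchpad:", trace)     # -> [1, 1, 0, 1, 1]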

View on arXiv
@article{amiri2025_2502.02393,
  title={Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers},
  author={Alireza Amiri and Xinting Huang and Mark Rofin and Michael Hahn},
  journal={arXiv preprint arXiv:2502.02393},
  year={2025}
}