
Pause Tokens Strictly Increase the Expressivity of Constant-Depth Transformers
arXiv:2505.21024 · 27 May 2025
Charles London, Varun Kanade

Papers citing "Pause Tokens Strictly Increase the Expressivity of Constant-Depth Transformers"

3 of 3 citing papers shown
On the Reasoning Abilities of Masked Diffusion Language Models
Anej Svete, Ashish Sabharwal
DiffM, LRM · 15 Oct 2025
Thoughtbubbles: an Unsupervised Method for Parallel Thinking in Latent Space
Houjun Liu, Shikhar Murty, Christopher D. Manning, Róbert Csordás
ReLM, LRM, AI4CE · 30 Sep 2025
Emergence of Superposition: Unveiling the Training Dynamics of Chain of Continuous Thought
Hanlin Zhu, Shibo Hao, Zhiting Hu, Jiantao Jiao, Stuart Russell, Yuandong Tian
LRM · 27 Sep 2025