
arXiv:2410.01405
On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding

2 October 2024
Kevin Xu, Issei Sato

Papers citing "On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding"

Approximation Bounds for Transformer Networks with Application to Regression
Yuling Jiao, Yanming Lai, Defeng Sun, Yang Wang, Bokai Yan
16 Apr 2025
Enhancing Auto-regressive Chain-of-Thought through Loop-Aligned Reasoning
Qifan Yu, Zhenyu He, Sijie Li, Xun Zhou, Jun Zhang, Jingjing Xu, Di He
12 Feb 2025
Advancing the Understanding of Fixed Point Iterations in Deep Neural Networks: A Detailed Analytical Study
Yekun Ke, Xiaoyu Li, Yingyu Liang, Zhenmei Shi, Zhao-quan Song
15 Oct 2024