LEDiT: Your Length-Extrapolatable Diffusion Transformer without Positional Encoding

6 March 2025
Shen Zhang
Yaning Tan
Zhaowei Chen
Linze Li
Ge Wu
Yuhao Chen
Shuheng Li
Zhenyu Zhao
Caihua Chen
Jiajun Liang
Yao Tang
ArXiv (abs) · PDF · HTML · GitHub (24342★)

Papers citing "LEDiT: Your Length-Extrapolatable Diffusion Transformer without Positional Encoding"

1 / 1 papers shown
Representation Entanglement for Generation: Training Diffusion Transformers Is Much Easier Than You Think
Ge Wu
Shen Zhang
Ruijing Shi
Shanghua Gao
Zhenyuan Chen
...
Hongcheng Gao
Yao Tang
Jian Yang
Ming-Ming Cheng
X. Li
02 Jul 2025