TRACT: Denoising Diffusion Models with Transitive Closure Time-Distillation

7 March 2023
David Berthelot, Arnaud Autef, Jierui Lin, Dian Ang Yap, Shuangfei Zhai, Siyuan Hu, Daniel Zheng, Walter Talbot, Eric Gu
DiffM

Papers citing "TRACT: Denoising Diffusion Models with Transitive Closure Time-Distillation"

14 / 14 papers shown
Integration Flow Models
Jingjing Wang, Dan Zhang, Joshua Luo, Yin Yang, Feng Luo
28 Apr 2025
Fast Autoregressive Models for Continuous Latent Generation
Tiankai Hang, Jianmin Bao, Fangyun Wei, Dong Chen
DiffM · 24 Apr 2025
Improved Training Technique for Latent Consistency Models
Quan Dao, Khanh Doan, Di Liu, Trung Le, Dimitris N. Metaxas
03 Feb 2025
Truncated Consistency Models
Sangyun Lee, Yilun Xu, Tomas Geffner, Giulia Fanti, Karsten Kreis, Arash Vahdat, Weili Nie
18 Oct 2024
Maximum Entropy Inverse Reinforcement Learning of Diffusion Models with Energy-Based Models
Sangwoong Yoon, Himchan Hwang, Dohyun Kwon, Yung-Kyun Noh, Frank C. Park
30 Jun 2024
Improving the Training of Rectified Flows
Sangyun Lee, Zinan Lin, Giulia Fanti
30 May 2024
PeRFlow: Piecewise Rectified Flow as Universal Plug-and-Play Accelerator
Hanshu Yan, Xingchao Liu, Jiachun Pan, Jun Hao Liew, Qiang Liu, Jiashi Feng
13 May 2024
Linear Combination of Saved Checkpoints Makes Consistency and Diffusion Models Better
En-hao Liu, Junyi Zhu, Zinan Lin, Xuefei Ning, Shuaiqi Wang, ..., Sergey Yekhanin, Guohao Dai, Huazhong Yang, Yu-Xiang Wang, Yu Wang
MoMe · 02 Apr 2024
Continual Learning of Diffusion Models with Generative Distillation
Sergi Masip, Pau Rodriguez, Tinne Tuytelaars, Gido M. van de Ven
VLM · DiffM · 23 Nov 2023
SDXL: Improving Latent Diffusion Models for High-Resolution Image Synthesis
Dustin Podell, Zion English, Kyle Lacey, A. Blattmann, Tim Dockhorn, Jonas Müller, Joe Penna, Robin Rombach
04 Jul 2023
f-DM: A Multi-stage Diffusion Model via Progressive Signal Transformation
Jiatao Gu, Shuangfei Zhai, Yizhe Zhang, Miguel Angel Bautista, J. Susskind
DiffM · 10 Oct 2022
Diffusion-LM Improves Controllable Text Generation
Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto
AI4CE · 27 May 2022
Argmax Flows and Multinomial Diffusion: Learning Categorical Distributions
Emiel Hoogeboom, Didrik Nielsen, P. Jaini, Patrick Forré, Max Welling
DiffM · 10 Feb 2021
Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed
Eric Luhman, Troy Luhman
DiffM · 07 Jan 2021