AutoDiffusion: Training-Free Optimization of Time Steps and Architectures for Automated Diffusion Model Acceleration
arXiv:2309.10438, 19 September 2023
Lijiang Li, Huixia Li, Xiawu Zheng, Jie Wu, Xuefeng Xiao, Rui Wang, Min Zheng, Xin Pan, Fei Chao, R. Ji
Papers citing "AutoDiffusion: Training-Free Optimization of Time Steps and Architectures for Automated Diffusion Model Acceleration" (5 of 5 papers shown)
ReDistill: Residual Encoded Distillation for Peak Memory Reduction of CNNs
Fang Chen, Gourav Datta, Mujahid Al Rafi, Hyeran Jeon, Meng Tang
06 Jun 2024
RePaint: Inpainting using Denoising Diffusion Probabilistic Models
Andreas Lugmayr, Martin Danelljan, Andrés Romero, Fisher Yu, Radu Timofte, Luc Van Gool
DiffM
24 Jan 2022
Palette: Image-to-Image Diffusion Models
Chitwan Saharia, William Chan, Huiwen Chang, Chris A. Lee, Jonathan Ho, Tim Salimans, David J. Fleet, Mohammad Norouzi
DiffM, VLM
10 Nov 2021
Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed
Eric Luhman, Troy Luhman
DiffM
07 Jan 2021
ImageNet Large Scale Visual Recognition Challenge
Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
VLM, ObjD
01 Sep 2014