Physics Informed Distillation for Diffusion Models (arXiv:2411.08378)
13 November 2024
Joshua Tian Jin Tee, Kang Zhang, Hee Suk Yoon, Dhananjaya N. Gowda, Chanwoo Kim, Chang-Dong Yoo
Tags: DiffM
Papers citing "Physics Informed Distillation for Diffusion Models" (4 of 4 papers shown):
Label-Efficient Semantic Segmentation with Diffusion Models (06 Dec 2021)
Dmitry Baranchuk, Ivan Rubachev, A. Voynov, Valentin Khrulkov, Artem Babenko
Tags: DiffM, VLM

Emerging Properties in Self-Supervised Vision Transformers (29 Apr 2021)
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin

Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed (07 Jan 2021)
Eric Luhman, Troy Luhman
Tags: DiffM

Improved Baselines with Momentum Contrastive Learning (09 Mar 2020)
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
Tags: SSL