Analysis of learning a flow-based generative model from limited sample complexity
Hugo Cui, Florent Krzakala, Eric Vanden-Eijnden, Lenka Zdeborová
5 October 2023 · arXiv:2310.03575
Topics: DRL
Papers citing "Analysis of learning a flow-based generative model from limited sample complexity" (7 of 7 papers shown)
1. Understanding Classifier-Free Guidance: High-Dimensional Theory and Non-Linear Generalizations
   Krunoslav Lehman Pavasovic, Jakob Verbeek, Giulio Biroli, Marc Mézard
   11 Feb 2025
2. A Sharp Convergence Theory for The Probability Flow ODEs of Diffusion Models
   Gen Li, Yuting Wei, Yuejie Chi, Yuxin Chen
   DiffM · 05 Aug 2024
3. U-Nets as Belief Propagation: Efficient Classification, Denoising, and Diffusion in Generative Hierarchical Models
   Song Mei
   3DV, AI4CE, DiffM · 29 Apr 2024
4. Stochastic Interpolants: A Unifying Framework for Flows and Diffusions
   M. S. Albergo, Nicholas M. Boffi, Eric Vanden-Eijnden
   DiffM · 15 Mar 2023
5. Diffusion Models are Minimax Optimal Distribution Estimators
   Kazusato Oko, Shunta Akiyama, Taiji Suzuki
   DiffM · 03 Mar 2023
6. Convergence of score-based generative modeling for general data distributions
   Holden Lee, Jianfeng Lu, Yixin Tan
   DiffM · 26 Sep 2022
7. Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
   Sitan Chen, Sinho Chewi, Jungshian Li, Yuanzhi Li, Adil Salim, Anru R. Zhang
   DiffM · 22 Sep 2022