Distribution learning via neural differential equations: a nonparametric statistical perspective

3 September 2023
Youssef Marzouk, Zhi Ren, Sven Wang, Jakob Zech

Papers citing "Distribution learning via neural differential equations: a nonparametric statistical perspective"

14 / 14 papers shown
  1. On the minimax optimality of Flow Matching through the connection to kernel density estimation. Lea Kunkel, Mathias Trabs. 17 Apr 2025.
  2. A friendly introduction to triangular transport. M. Ramgraber, Daniel Sharp, M. Le Provost, Youssef Marzouk. 27 Mar 2025.
  3. Numerical and statistical analysis of NeuralODE with Runge-Kutta time integration. Emily C. Ehrhardt, Hanno Gottschalk, Tobias Riedlinger. 13 Mar 2025.
  4. Local Flow Matching Generative Models. Chen Xu, Xiuyuan Cheng, Yao Xie. 03 Jan 2025.
  5. Kernel Approximation of Fisher-Rao Gradient Flows. Jia Jie Zhu, Alexander Mielke. 27 Oct 2024.
  6. Diffeomorphic Measure Matching with Kernels for Generative Modeling. Biraj Pandey, Bamdad Hosseini, Pau Batlle, H. Owhadi. 12 Feb 2024.
  7. Density estimation using the perceptron. P. R. Gerber, Tianze Jiang, Yury Polyanskiy, Rui Sun. 29 Dec 2023.
  8. Gaussian Interpolation Flows. Yuan Gao, Jianxia Huang, Yuling Jiao. 20 Nov 2023. (AI4CE)
  9. Convergence of flow-based generative models via proximal gradient descent in Wasserstein space. Xiuyuan Cheng, Jianfeng Lu, Yixin Tan, Yao Xie. 26 Oct 2023.
  10. Stochastic Interpolants: A Unifying Framework for Flows and Diffusions. M. S. Albergo, Nicholas M. Boffi, Eric Vanden-Eijnden. 15 Mar 2023. (DiffM)
  11. Diffusion Models are Minimax Optimal Distribution Estimators. Kazusato Oko, Shunta Akiyama, Taiji Suzuki. 03 Mar 2023. (DiffM)
  12. Diffusion Models: A Comprehensive Survey of Methods and Applications. Ling Yang, Zhilong Zhang, Yingxia Shao, Shenda Hong, Runsheng Xu, Yue Zhao, Wentao Zhang, Bin Cui, Ming-Hsuan Yang. 02 Sep 2022. (DiffM, MedIm)
  13. Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization. Chin-Wei Huang, Ricky T. Q. Chen, Christos Tsirigotis, Aaron Courville. 10 Dec 2020. (OT)
  14. Consistency of Bayesian inference with Gaussian process priors in an elliptic inverse problem. M. Giordano, Richard Nickl. 16 Oct 2019.