Continuous-time stochastic gradient descent for optimizing over the stationary distribution of stochastic differential equations

Ziheng Wang, Justin A. Sirignano
14 February 2022 · arXiv:2202.06637

Papers citing "Continuous-time stochastic gradient descent for optimizing over the stationary distribution of stochastic differential equations"

2 / 2 papers shown
Implicit Diffusion: Efficient Optimization through Stochastic Sampling
Pierre Marion, Anna Korba, Peter Bartlett, Mathieu Blondel, Valentin De Bortoli, Arnaud Doucet, Felipe Llinares-López, Courtney Paquette, Quentin Berthet
08 Feb 2024
A Forward Propagation Algorithm for Online Optimization of Nonlinear Stochastic Differential Equations
Ziheng Wang, Justin A. Sirignano
10 Jul 2022