Dimension-free convergence rates for gradient Langevin dynamics in RKHS

29 February 2020
Boris Muzellec, Kanji Sato, Mathurin Massias, Taiji Suzuki

Papers citing "Dimension-free convergence rates for gradient Langevin dynamics in RKHS"

Deep Stochastic Mechanics (DiffM)
Elena Orlova, Aleksei Ustimenko, Ruoxi Jiang, Peter Y. Lu, Rebecca Willett
31 May 2023

Benefit of deep learning with non-convex noisy gradient descent: Provable excess risk bound and superiority to kernel methods (MLT)
Taiji Suzuki, Shunta Akiyama
6 December 2020