ResearchTrend.AI

Weighted Averaged Stochastic Gradient Descent: Asymptotic Normality and Optimality
Ziyang Wei, Wanrong Zhu, W. Wu
13 July 2023
arXiv: 2307.06915

Papers citing "Weighted Averaged Stochastic Gradient Descent: Asymptotic Normality and Optimality"

Showing 5 of 5 papers
Sharp Gaussian approximations for Decentralized Federated Learning
Soham Bonnerjee, Sayar Karmakar, W. Wu (FedML)
12 May 2025

Enhancing Stochastic Optimization for Statistical Efficiency Using ROOT-SGD with Diminishing Stepsize
Tong Zhang, Chris Junchi Li
15 Jul 2024

High Confidence Level Inference is Almost Free using Parallel Stochastic Optimization
Wanrong Zhu, Zhipeng Lou, Ziyang Wei, W. Wu
17 Jan 2024

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark W. Schmidt, Francis R. Bach
10 Dec 2012

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012