ResearchTrend.AI

On Projected Stochastic Gradient Descent Algorithm with Weighted Averaging for Least Squares Regression

arXiv:1606.03000 · 9 June 2016

Kobi Cohen
A. Nedić
R. Srikant
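The title refers to projected stochastic gradient descent with weighted averaging of the iterates, applied to least squares regression. A minimal generic sketch of that idea is below; the step-size schedule, the linear weights, and the Euclidean-ball constraint set are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def projected_sgd_weighted_avg(X, y, radius=5.0, eta0=1.0):
    """Projected SGD for least squares min 0.5*(x·w - y)^2 over the
    Euclidean ball {w : ||w|| <= radius}, returning a weighted average
    of the iterates. Generic sketch under stated assumptions."""
    n, d = X.shape
    w = np.zeros(d)
    avg = np.zeros(d)
    weight_sum = 0.0
    for t in range(1, n + 1):
        xi, yi = X[t - 1], y[t - 1]
        grad = (xi @ w - yi) * xi       # stochastic gradient of 0.5*(x·w - y)^2
        w = w - (eta0 / t) * grad       # decaying step size ~ 1/t (assumed)
        norm = np.linalg.norm(w)
        if norm > radius:               # Euclidean projection onto the ball
            w = w * (radius / norm)
        weight = t                      # weight ∝ t favors later iterates (assumed)
        weight_sum += weight
        avg += (weight / weight_sum) * (w - avg)  # running weighted average
    return avg
```

Weighting later iterates more heavily (here linearly in t) is the standard way such schemes avoid the slowdown of uniform averaging while keeping the variance-reduction benefit of averaging.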

Papers citing "On Projected Stochastic Gradient Descent Algorithm with Weighted Averaging for Least Squares Regression"

5 / 5 papers shown
  • Online estimation of the inverse of the Hessian for stochastic optimization with application to universal stochastic Newton algorithms — Antoine Godichon-Baggioni, Wei Lu, Bruno Portier (15 Jan 2024)
  • A Robust Gradient Tracking Method for Distributed Optimization over Directed Networks — Shi Pu (31 Mar 2020)
  • A Distributed Stochastic Gradient Tracking Method — Shi Pu, A. Nedić (21 Mar 2018)
  • Lp and almost sure rates of convergence of averaged stochastic gradient algorithms: locally strongly convex objective — Antoine Godichon-Baggioni (18 Sep 2016)
  • A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method — Simon Lacoste-Julien, Mark Schmidt, Francis R. Bach (10 Dec 2012)