Convergence guarantees for forward gradient descent in the linear regression model

26 September 2023
Thijs Bos
Johannes Schmidt-Hieber
Abstract

Renewed interest in the relationship between artificial and biological neural networks motivates the study of gradient-free methods. Considering the linear regression model with random design, we theoretically analyze in this work the biologically motivated (weight-perturbed) forward gradient scheme that is based on a random linear combination of the gradient. If $d$ denotes the number of parameters and $k$ the number of samples, we prove that the mean squared error of this method converges for $k \gtrsim d^2 \log(d)$ with rate $d^2 \log(d)/k$. Compared to the dimension dependence $d$ for stochastic gradient descent, an additional factor $d \log(d)$ occurs.
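To make the scheme concrete, below is a minimal sketch of weight-perturbed forward gradient descent for linear regression with random design. The problem sizes, noise level, step size, and the standard-Gaussian choice for the perturbation direction are illustrative assumptions, not values taken from the paper; the key point is the update, which replaces the gradient $g$ by $(g \cdot v)\,v$ for a fresh random direction $v$, an unbiased estimate of $g$ since $\mathbb{E}[v v^\top] = I_d$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (hypothetical sizes): linear model y = x^T theta* + noise.
d, k = 10, 20_000                    # d parameters, k samples
theta_star = rng.normal(size=d)
X = rng.normal(size=(k, d))          # random design
y = X @ theta_star + 0.1 * rng.normal(size=k)

# Weight-perturbed forward gradient descent: instead of the gradient g,
# step along (g . v) v for a fresh random direction v; E[v v^T] = I_d
# makes this an unbiased gradient estimate.
theta = np.zeros(d)
alpha = 1.0 / (2 * d**2)             # assumed step size, not the paper's schedule
for i in range(k):
    x_i, y_i = X[i], y[i]
    g = (x_i @ theta - y_i) * x_i    # per-sample squared-loss gradient
    v = rng.normal(size=d)           # Gaussian perturbation direction
    theta -= alpha * (g @ v) * v     # forward gradient step

print("mean squared error:", np.mean((theta - theta_star) ** 2))
```

The extra randomness of $v$ inflates the variance of each update by roughly a factor of $d$, which is one intuition for the additional $d \log(d)$ factor in the rate compared to stochastic gradient descent.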
