A stochastic gradient descent algorithm with random search directions

25 March 2025
Eméric Gbaguidi
Abstract

Stochastic coordinate descent algorithms are efficient methods in which each iterate is obtained by fixing most coordinates at their values from the current iteration and approximately minimizing the objective with respect to the remaining coordinates. However, this approach is usually restricted to the canonical basis vectors of $\mathbb{R}^d$. In this paper, we develop a new class of stochastic gradient descent algorithms with random search directions, which use the directional derivative of the gradient estimate along more general random vectors. We establish the almost sure convergence of these algorithms with decreasing step sizes. We further investigate their central limit theorem, paying particular attention to the impact of the search distributions on the asymptotic covariance matrix. We also provide non-asymptotic $\mathbb{L}^p$ rates of convergence.
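The update rule described in the abstract can be sketched in a few lines: at each iteration, draw a random direction $v_t$, evaluate a noisy gradient estimate $g_t$, and step along $v_t$ scaled by the directional derivative $\langle g_t, v_t\rangle$ with a decreasing step size. The sketch below is a minimal illustration under assumed choices (unit directions drawn uniformly on the sphere, step size $\gamma_t = 1/t$, and a toy noisy quadratic objective); it is not the paper's exact algorithm or analysis setting.

```python
import numpy as np

def sgd_random_directions(grad_est, x0, n_iters=5000, seed=0):
    """SGD with random search directions (illustrative sketch):
    each update moves along a random unit vector v, scaled by
    the directional derivative <g, v> of a stochastic gradient
    estimate g, with a decreasing step size."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    for t in range(1, n_iters + 1):
        v = rng.standard_normal(d)
        v /= np.linalg.norm(v)        # random direction on the unit sphere
        g = grad_est(x, rng)          # noisy gradient estimate at x
        gamma = 1.0 / t               # decreasing step size (assumed choice)
        x = x - gamma * (g @ v) * v   # move only along the direction v
    return x

# Toy example (assumed for illustration): f(x) = 0.5 * ||x||^2
# with additive Gaussian noise on the gradient.
def noisy_grad(x, rng):
    return x + 0.05 * rng.standard_normal(x.size)

x_star = sgd_random_directions(noisy_grad, x0=np.ones(5))
```

Replacing the random unit vectors with canonical basis vectors $e_i$ drawn uniformly recovers a standard stochastic coordinate descent update, which is the restriction the paper generalizes away from.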

@article{gbaguidi2025_2503.19942,
  title={A stochastic gradient descent algorithm with random search directions},
  author={Eméric Gbaguidi},
  journal={arXiv preprint arXiv:2503.19942},
  year={2025}
}