A Full Adagrad algorithm with O(Nd) operations

3 May 2024
Antoine Godichon-Baggioni, Wei Lu, Bruno Portier
Abstract

A novel approach is proposed to overcome the computational challenges of the full-matrix Adaptive Gradient algorithm (Full AdaGrad) in stochastic optimization. The method recursively estimates the inverse square root of the covariance of the gradient and combines it with a streaming variant of the parameter update, yielding efficient and practical algorithms for large-scale applications. This strategy significantly reduces the complexity and resource demands typically associated with full-matrix methods, requiring only O(Nd) operations for N samples in dimension d. Moreover, the convergence rates of the proposed estimators and their asymptotic efficiency are established, and their effectiveness is demonstrated through numerical studies.
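
To make the core idea concrete, below is a minimal NumPy sketch of a recursion of this kind, not the paper's exact algorithm. It maintains an estimate B of A^{-1/2}, where A = E[g g^T] is the gradient covariance, via a Robbins-Monro step toward the fixed point B A B = I (whose symmetric positive-definite solution is A^{-1/2}), and preconditions each gradient step with B. The function and parameter names and the step-size exponents are illustrative assumptions; this naive version also costs O(d^2) per iteration, whereas the paper's streaming variant is what brings the total cost down to O(Nd).

import numpy as np

def full_adagrad_sketch(grad_fn, theta0, n_steps, gamma=1.0, omega=0.1,
                        alpha=0.66, beta=0.75):
    """Toy full-matrix adaptive loop (illustrative, not the paper's algorithm).

    B is pushed toward the fixed point B A B = I: since E[(B g)(B g)^T]
    equals B A B for symmetric B, the matrix I - (B g)(B g)^T is an
    unbiased estimate of the fixed-point residual I - B A B.
    """
    d = theta0.size
    theta = theta0.astype(float).copy()
    B = np.eye(d)                    # running estimate of A^{-1/2}
    I = np.eye(d)
    for t in range(1, n_steps + 1):
        g = grad_fn(theta)           # stochastic gradient at theta
        Bg = B @ g                   # O(d^2) matrix-vector product
        B = B + (omega / t**beta) * (I - np.outer(Bg, Bg))
        theta = theta - (gamma / t**alpha) * (B @ g)   # preconditioned step
    return theta, B

# Hypothetical usage: streaming least squares with minimizer theta_star.
rng = np.random.default_rng(0)
d = 5
theta_star = rng.normal(size=d)

def grad_fn(theta):
    x = rng.normal(size=d)
    y = x @ theta_star + 0.1 * rng.normal()
    return (x @ theta - y) * x       # gradient of 0.5 * (x @ theta - y)**2

theta_hat, B_hat = full_adagrad_sketch(grad_fn, np.zeros(d), n_steps=50_000)

With decreasing step sizes omega/t^beta and gamma/t^alpha, with alpha and beta in (1/2, 1), both recursions are standard stochastic-approximation schemes, and the update keeps B symmetric whenever it is initialized as a symmetric matrix.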

@article{godichon-baggioni2025_2405.01908,
  title={A Full Adagrad algorithm with O(Nd) operations},
  author={Antoine Godichon-Baggioni and Wei Lu and Bruno Portier},
  journal={arXiv preprint arXiv:2405.01908},
  year={2025}
}