Stochastic Gradient Methods with Preconditioned Updates

1 June 2022 · ODL
Abdurakhmon Sadiev, Aleksandr Beznosikov, Abdulla Jasem Almansoori, Dmitry Kamzolov, R. Tappenden, Martin Takáč
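For context, preconditioned stochastic gradient methods of the kind studied in this paper rescale each stochastic gradient by a (typically diagonal) matrix before taking a step, which helps on badly scaled problems. The sketch below illustrates the general idea on a synthetic least-squares problem; the objective, the Hutchinson-style diagonal Hessian estimate, and all hyperparameter values are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

# Minimal sketch of diagonally preconditioned SGD on a synthetic least-squares
# problem. The objective, the preconditioner choice, and all hyperparameters
# are illustrative assumptions, not the algorithm from the paper.

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d)) * np.linspace(0.1, 10.0, d)  # badly scaled features
b = A @ rng.normal(size=d) + 0.01 * rng.normal(size=n)

def stochastic_grad(x, idx):
    """Gradient of 0.5 * (a_i^T x - b_i)^2 averaged over a minibatch idx."""
    residual = A[idx] @ x - b[idx]
    return A[idx].T @ residual / len(idx)

x = np.zeros(d)
step = 0.5
alpha = 1e-3                      # damping to keep the preconditioner invertible
diag_est = np.ones(d)             # running estimate of diag(Hessian)
beta = 0.99                       # exponential averaging factor

for t in range(2000):
    idx = rng.choice(n, size=16, replace=False)
    g = stochastic_grad(x, idx)

    # Hutchinson-style estimate: z * (H z) with Rademacher z has mean diag(H).
    z = rng.choice([-1.0, 1.0], size=d)            # Rademacher probe vector
    Hz = A[idx].T @ (A[idx] @ z) / len(idx)        # minibatch Hessian-vector product
    diag_est = beta * diag_est + (1 - beta) * (z * Hz)

    precond = np.maximum(np.abs(diag_est), alpha)  # positive diagonal preconditioner
    x -= step * g / precond                        # preconditioned SGD step

print("final loss:", 0.5 * np.mean((A @ x - b) ** 2))

Dividing the gradient elementwise by the estimated Hessian diagonal equalizes the effective step size across poorly scaled coordinates, which is the motivation usually given for diagonal preconditioning.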

Papers citing "Stochastic Gradient Methods with Preconditioned Updates"

7 papers shown.

Trial and Trust: Addressing Byzantine Attacks with Comprehensive Defense Strategy
Gleb Molodtsov, Daniil Medyakov, Sergey Skorik, Nikolas Khachaturov, Shahane Tigranyan, Vladimir Aletov, A. Avetisyan, Martin Takáč, Aleksandr Beznosikov
AAML · 12 May 2025

Just a Simple Transformation is Enough for Data Protection in Vertical Federated Learning
Andrei Semenov, Philip Zmushko, Alexander Pichugin, Aleksandr Beznosikov
16 Dec 2024

Local Methods with Adaptivity via Scaling
Saveliy Chezhegov, Sergey Skorik, Nikolas Khachaturov, Danil Shalagin, A. Avetisyan, Aleksandr Beznosikov, Martin Takáč, Yaroslav Kholodov, Alexander Gasnikov
02 Jun 2024

Stochastic Gradient Descent with Preconditioned Polyak Step-size
Farshed Abdukhakimov, Chulu Xiang, Dmitry Kamzolov, Martin Takáč
03 Oct 2023

AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods
Zheng Shi, Abdurakhmon Sadiev, Nicolas Loizou, Peter Richtárik, Martin Takáč
ODL · 19 Feb 2021

A Simple Convergence Proof of Adam and Adagrad
Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier
05 Mar 2020

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang
ODL · 19 Mar 2014