
Stochastic Subspace Cubic Newton Method (arXiv: 2002.09526)

International Conference on Machine Learning (ICML), 2020
21 February 2020
Filip Hanzely, N. Doikov, Peter Richtárik, Y. Nesterov

Papers citing "Stochastic Subspace Cubic Newton Method"

20 citing papers listed.
Simple Stepsize for Quasi-Newton Methods with Global Convergence Guarantees
A. Agafonov, Vladislav Ryspayev, Samuel Horváth, Alexander V. Gasnikov, Martin Takáč, Slavomír Hanzely
27 Aug 2025

Turbocharging Gaussian Process Inference with Approximate Sketch-and-Project
Pratik Rathore, Zachary Frangella, Sachin Garg, Shaghayegh Fazliani, Michał Dereziński, Madeleine Udell
19 May 2025

Improving Stochastic Cubic Newton with Momentum
International Conference on Artificial Intelligence and Statistics (AISTATS), 2024
El Mahdi Chayti, N. Doikov, Martin Jaggi
25 Oct 2024

Cubic regularized subspace Newton for non-convex optimization
Jim Zhao, Aurelien Lucchi, N. Doikov
24 Jun 2024

Recent and Upcoming Developments in Randomized Numerical Linear Algebra for Machine Learning
Michał Dereziński, Michael W. Mahoney
17 Jun 2024

Fine-grained Analysis and Faster Algorithms for Iteratively Solving Linear Systems
Michał Dereziński, Daniel LeJeune, Deanna Needell, E. Rebrova
09 May 2024

Krylov Cubic Regularized Newton: A Subspace Second-Order Method with Dimension-Free Convergence Rate
Ruichen Jiang, Parameswaran Raman, Shoham Sabach, Aryan Mokhtari, Mingyi Hong, Volkan Cevher
05 Jan 2024

Towards a Better Theoretical Understanding of Independent Subnetwork Training
International Conference on Machine Learning (ICML), 2023
Egor Shulgin, Peter Richtárik
28 Jun 2023

Constrained Optimization via Exact Augmented Lagrangian and Randomized Iterative Sketching
International Conference on Machine Learning (ICML), 2023
Ilgee Hong, Sen Na, Michael W. Mahoney, Mladen Kolar
28 May 2023

Sketch-and-Project Meets Newton Method: Global $\mathcal{O}(k^{-2})$ Convergence with Low-Rank Updates
Slavomír Hanzely
22 May 2023

Sharp Analysis of Sketch-and-Project Methods via a Connection to Randomized Singular Value Decomposition
SIAM Journal on Mathematics of Data Science (SIMODS), 2022
Michał Dereziński, E. Rebrova
20 Aug 2022

Super-Universal Regularized Newton Method
SIAM Journal on Optimization (SIAM J. Optim.), 2022
N. Doikov, Konstantin Mishchenko, Y. Nesterov
11 Aug 2022

Distributed Newton-Type Methods with Communication Compression and Bernoulli Aggregation
Rustem Islamov, Xun Qian, Slavomír Hanzely, M. Safaryan, Peter Richtárik
07 Jun 2022

Augmented Newton Method for Optimization: Global Linear Rate and Momentum Interpretation
M. Morshed
23 May 2022

Regularized Newton Method with Global $O(1/k^2)$ Convergence
Konstantin Mishchenko
03 Dec 2021

Finding Second-Order Stationary Points in Nonconvex-Strongly-Concave Minimax Optimization
Neural Information Processing Systems (NeurIPS), 2021
Luo Luo, Yujun Li, Cheng Chen
10 Oct 2021

Curvature-Aware Derivative-Free Optimization
Journal of Scientific Computing (J. Sci. Comput.), 2021
Bumsu Kim, HanQin Cai, Daniel McKenzie, W. Yin
27 Sep 2021

Global optimization using random embeddings
Mathematical Programming (Math. Program.), 2021
C. Cartis, E. Massart, Adilet Otemissov
26 Jul 2021

Distributed Second Order Methods with Fast Rates and Compressed Communication
International Conference on Machine Learning (ICML), 2021
Rustem Islamov, Xun Qian, Peter Richtárik
14 Feb 2021

Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
Filip Hanzely
26 Aug 2020