2112.02089
Regularized Newton Method with Global $O(1/k^2)$ Convergence
3 December 2021
Konstantin Mishchenko
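The paper's core update is a Newton step whose Hessian is regularized by the square root of the gradient norm, scaled by an estimate of the Hessian's Lipschitz constant. A minimal sketch of one such step, assuming a test objective and the constant estimate `H` chosen here for illustration:

```python
import numpy as np

def regularized_newton_step(grad, hess, x, H=1.0):
    """One gradient-regularized Newton step,
        x+ = x - (hess(x) + sqrt(H * ||grad(x)||) * I)^{-1} grad(x),
    where H is an estimate of the Hessian's Lipschitz constant
    (H=1.0 here is an illustrative choice, not from the paper's experiments)."""
    g = grad(x)
    lam = np.sqrt(H * np.linalg.norm(g))  # regularization shrinks as the gradient vanishes
    return x - np.linalg.solve(hess(x) + lam * np.eye(x.size), g)

# Toy strongly convex objective: f(x) = sum_i x_i^2 + softplus(x_i)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
grad = lambda x: 2.0 * x + sigmoid(x)
hess = lambda x: np.diag(2.0 + sigmoid(x) * (1.0 - sigmoid(x)))

x = np.ones(3)
for _ in range(30):
    x = regularized_newton_step(grad, hess, x)
```

Because the regularizer $\sqrt{H\,\|\nabla f(x_k)\|}$ decays to zero near a minimizer, the method recovers the local quadratic convergence of Newton's method while the regularization controls steps far from the solution, which is what yields the global $O(1/k^2)$ rate.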
Papers citing
"Regularized Newton Method with Global $O(1/k^2)$ Convergence"
9 / 9 papers shown
- Gradient Norm Regularization Second-Order Algorithms for Solving Nonconvex-Strongly Concave Minimax Problems. Jun-Lin Wang, Zi Xu. 24 Nov 2024.
- Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method. N. Doikov. 28 Aug 2023.
- Rethinking Gauss-Newton for learning over-parameterized models. Michael Arbel, Romain Menegaux, Pierre Wolinski. 06 Feb 2023.
- Second-order optimization with lazy Hessians. N. Doikov, El Mahdi Chayti, Martin Jaggi. 01 Dec 2022.
- Extra-Newton: A First Approach to Noise-Adaptive Accelerated Second-Order Methods. Kimon Antonakopoulos, Ali Kavis, Volkan Cevher. 03 Nov 2022.
- Super-Universal Regularized Newton Method. N. Doikov, Konstantin Mishchenko, Y. Nesterov. 11 Aug 2022.
- FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning. Anis Elgabli, Chaouki Ben Issaid, Amrit Singh Bedi, K. Rajawat, M. Bennis, Vaneet Aggarwal. 17 Jun 2022.
- The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization. D. Kovalev, Alexander Gasnikov. 19 May 2022.
- Newton-MR: Inexact Newton Method With Minimum Residual Sub-problem Solver. Fred Roosta, Yang Liu, Peng Xu, Michael W. Mahoney. 30 Sep 2018.