Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method

28 August 2023
N. Doikov
ArXiv · PDF · HTML
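
For context, the method named in the title regularizes each Newton step by a multiple of the current gradient norm. Below is a minimal, hypothetical Python sketch of such a step applied to a logistic loss (a standard example of a quasi-self-concordant objective); the function names, the constant M, the stopping rule, and the toy data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gradient_regularized_newton(grad, hess, x0, M=1.0, tol=1e-8, max_iter=100):
    """Illustrative gradient-regularized Newton iteration.

    Each step solves (H + lam * I) d = -g with lam = M * ||g||,
    i.e. the regularization weight tracks the current gradient norm.
    M and the stopping rule are assumptions for this sketch only.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        lam = M * np.linalg.norm(g)                  # regularization ~ gradient norm
        d = np.linalg.solve(H + lam * np.eye(len(x)), -g)
        x = x + d
    return x

# Toy logistic-regression data (assumed, for illustration only).
A = np.array([[1.0, 2.0], [-1.0, 0.5], [0.3, -1.2]])
b = np.array([1.0, -1.0, 1.0])

def grad(x):
    # Gradient of the mean logistic loss (1/n) * sum log(1 + exp(-b_i * a_i^T x)).
    z = -b * (A @ x)
    s = 1.0 / (1.0 + np.exp(-z))                     # sigmoid of the margins
    return A.T @ (-b * s) / len(b)

def hess(x):
    # Hessian of the same loss: (1/n) * sum s_i (1 - s_i) a_i a_i^T.
    z = -b * (A @ x)
    s = 1.0 / (1.0 + np.exp(-z))
    w = s * (1.0 - s)
    return (A.T * w) @ A / len(b)

x_star = gradient_regularized_newton(grad, hess, np.zeros(2))
```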

Papers citing "Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method"

3 / 3 papers shown
SAPPHIRE: Preconditioned Stochastic Variance Reduction for Faster Large-Scale Statistical Learning
Jingruo Sun, Zachary Frangella, Madeleine Udell
28 Jan 2025

Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods
El Mahdi Chayti, N. Doikov, Martin Jaggi
23 Feb 2023

The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization
D. Kovalev, Alexander Gasnikov
19 May 2022