ResearchTrend.AI

Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches
arXiv:2206.02702
6 June 2022
Michał Dereziński
Papers citing "Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches"

6 papers shown
SAPPHIRE: Preconditioned Stochastic Variance Reduction for Faster Large-Scale Statistical Learning
Jingruo Sun, Zachary Frangella, Madeleine Udell
28 Jan 2025
Recent and Upcoming Developments in Randomized Numerical Linear Algebra for Machine Learning
Michał Dereziński, Michael W. Mahoney
17 Jun 2024
Second-order Information Promotes Mini-Batch Robustness in Variance-Reduced Gradients
Sachin Garg, A. Berahas, Michał Dereziński
23 Apr 2024
Surrogate-based Autotuning for Randomized Sketching Algorithms in Regression Problems
Younghyun Cho, James Demmel, Michał Dereziński, Haoyun Li, Hengrui Luo, Michael W. Mahoney, Riley Murray
30 Aug 2023
Sharpened Lazy Incremental Quasi-Newton Method
Aakash Lahoti, Spandan Senapati, K. Rajawat, Alec Koppel
26 May 2023
Newton-LESS: Sparsification without Trade-offs for the Sketched Newton Update
Michał Dereziński, Jonathan Lacotte, Mert Pilanci, Michael W. Mahoney
15 Jul 2021