ResearchTrend.AI
arXiv:1601.04737
Sub-Sampled Newton Methods I: Globally Convergent Algorithms


18 January 2016
Farbod Roosta-Khorasani, Michael W. Mahoney
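The method named in the title replaces the exact Hessian with one estimated from a random subsample of the data, while keeping the full gradient. A minimal sketch of that idea for L2-regularized logistic regression (function name, fixed unit step, and uniform sampling are illustrative assumptions; the paper's globally convergent algorithms additionally control the step size and the sample size):

```python
import numpy as np

def subsampled_newton_step(w, X, y, sample_size, reg=1e-4, rng=None):
    # One sub-sampled Newton step for L2-regularized logistic regression:
    # full gradient, Hessian estimated from a uniform subsample of rows.
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape

    # Full gradient of the regularized logistic loss.
    p = 1.0 / (1.0 + np.exp(-X @ w))  # predicted probabilities
    grad = X.T @ (p - y) / n + reg * w

    # Hessian estimated from a random subsample of the data.
    idx = rng.choice(n, size=sample_size, replace=False)
    Xs, ps = X[idx], p[idx]
    H = (Xs * (ps * (1.0 - ps))[:, None]).T @ Xs / sample_size + reg * np.eye(d)

    # Step along the sub-sampled Newton direction (unit step; no line search).
    return w - np.linalg.solve(H, grad)
```

Since only a subsample enters the Hessian, each step costs O(|S| d^2) instead of O(n d^2) for Hessian formation, which is the trade-off the sub-sampling scheme exploits.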

Papers citing "Sub-Sampled Newton Methods I: Globally Convergent Algorithms"

19 of 19 papers shown
  1. ISAAC Newton: Input-based Approximate Curvature for Newton's Method
     Felix Petersen, Tobias Sutter, Christian Borgelt, Dongsung Huh, Hilde Kuehne, Yuekai Sun, Oliver Deussen (01 May 2023)
  2. An Efficient Nonlinear Acceleration method that Exploits Symmetry of the Hessian
     Huan He, Shifan Zhao, Z. Tang, Joyce C. Ho, Y. Saad, Yuanzhe Xi (22 Oct 2022)
  3. On randomized sketching algorithms and the Tracy-Widom law
     Daniel Ahfock, W. Astle, S. Richardson (03 Jan 2022)
  4. Constrained and Composite Optimization via Adaptive Sampling Methods
     Yuchen Xie, Raghu Bollapragada, R. Byrd, J. Nocedal (31 Dec 2020)
  5. ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
     Z. Yao, A. Gholami, Sheng Shen, Mustafa Mustafa, Kurt Keutzer, Michael W. Mahoney (01 Jun 2020)
  6. Low Rank Saddle Free Newton: A Scalable Method for Stochastic Nonconvex Optimization
     Thomas O'Leary-Roseberry, Nick Alger, Omar Ghattas (07 Feb 2020)
  7. Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling
     Mojmír Mutný, Michal Derezinski, Andreas Krause (25 Oct 2019)
  8. A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization
     Quoc Tran-Dinh, Nhan H. Pham, T. Dzung, Lam M. Nguyen (08 Jul 2019)
  9. OverSketched Newton: Fast Convex Optimization for Serverless Systems
     Vipul Gupta, S. Kadhe, T. Courtade, Michael W. Mahoney, Kannan Ramchandran (21 Mar 2019)
  10. GPU Accelerated Sub-Sampled Newton's Method
     Sudhir B. Kylasa, Farbod Roosta-Khorasani, Michael W. Mahoney, A. Grama (26 Feb 2018)
  11. GIANT: Globally Improved Approximate Newton Method for Distributed Optimization
     Shusen Wang, Farbod Roosta-Khorasani, Peng Xu, Michael W. Mahoney (11 Sep 2017)
  12. An inexact subsampled proximal Newton-type method for large-scale machine learning
     Xuanqing Liu, Cho-Jui Hsieh, Jason D. Lee, Yuekai Sun (28 Aug 2017)
  13. Optimization Methods for Supervised Machine Learning: From Linear Models to Deep Learning
     Frank E. Curtis, K. Scheinberg (30 Jun 2017)
  14. Statistical properties of sketching algorithms
     Daniel Ahfock, W. Astle, S. Richardson (12 Jun 2017)
  15. Large Scale Empirical Risk Minimization via Truncated Adaptive Newton Method
     Mark Eisen, Aryan Mokhtari, Alejandro Ribeiro (22 May 2017)
  16. Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods
     Tianxiao Sun, Quoc Tran-Dinh (14 Mar 2017)
  17. Exact and Inexact Subsampled Newton Methods for Optimization
     Raghu Bollapragada, R. Byrd, J. Nocedal (27 Sep 2016)
  18. Sub-Sampled Newton Methods II: Local Convergence Rates
     Farbod Roosta-Khorasani, Michael W. Mahoney (18 Jan 2016)
  19. Newton-Stein Method: An optimization method for GLMs via Stein's Lemma
     Murat A. Erdogdu (28 Nov 2015)