Sub-Sampled Newton Methods II: Local Convergence Rates

18 January 2016
Farbod Roosta-Khorasani
Michael W. Mahoney
arXiv:1601.04738

Papers citing "Sub-Sampled Newton Methods II: Local Convergence Rates"

14 / 14 papers shown
1. SCORE: Approximating Curvature Information under Self-Concordant Regularization. Adeyemi Damilare Adeoye, Alberto Bemporad. 14 Dec 2021.
2. Iterative Teaching by Label Synthesis. Weiyang Liu, Zhen Liu, Hanchen Wang, Liam Paull, Bernhard Schölkopf, Adrian Weller. 27 Oct 2021.
3. Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms. A. Camuto, George Deligiannidis, Murat A. Erdogdu, Mert Gurbuzbalaban, Umut Şimşekli, Lingjiong Zhu. 09 Jun 2021.
4. Constrained and Composite Optimization via Adaptive Sampling Methods. Yuchen Xie, Raghu Bollapragada, R. Byrd, J. Nocedal. 31 Dec 2020.
5. Learning Rates as a Function of Batch Size: A Random Matrix Theory Approach to Neural Network Training. Diego Granziol, S. Zohren, Stephen J. Roberts. 16 Jun 2020.
6. Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling. Mojmír Mutný, Michal Derezinski, Andreas Krause. 25 Oct 2019.
7. Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization. Seonho Park, Seung Hyun Jung, P. Pardalos. 27 Jun 2019.
8. GPU Accelerated Sub-Sampled Newton's Method. Sudhir B. Kylasa, Farbod Roosta-Khorasani, Michael W. Mahoney, A. Grama. 26 Feb 2018.
9. GIANT: Globally Improved Approximate Newton Method for Distributed Optimization. Shusen Wang, Farbod Roosta-Khorasani, Peng Xu, Michael W. Mahoney. 11 Sep 2017.
10. An inexact subsampled proximal Newton-type method for large-scale machine learning. Xuanqing Liu, Cho-Jui Hsieh, J. Lee, Yuekai Sun. 28 Aug 2017.
11. Optimization Methods for Supervised Machine Learning: From Linear Models to Deep Learning. Frank E. Curtis, K. Scheinberg. 30 Jun 2017.
12. Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods. Tianxiao Sun, Quoc Tran-Dinh. 14 Mar 2017.
13. Exact and Inexact Subsampled Newton Methods for Optimization. Raghu Bollapragada, R. Byrd, J. Nocedal. 27 Sep 2016.
14. Newton-Stein Method: An optimization method for GLMs via Stein's Lemma. Murat A. Erdogdu. 28 Nov 2015.