ResearchTrend.AI
Oracle Complexity of Second-Order Methods for Finite-Sum Problems
15 November 2016
Yossi Arjevani, Ohad Shamir

Papers citing "Oracle Complexity of Second-Order Methods for Finite-Sum Problems" (12 papers)

Second-Order Information in Non-Convex Stochastic Optimization: Power and Limitations
Annual Conference on Computational Learning Theory (COLT), 2020
Yossi Arjevani, Y. Carmon, John C. Duchi, Dylan J. Foster, Ayush Sekhari, Karthik Sridharan
24 Jun 2020

Optimal Complexity in Decentralized Training
International Conference on Machine Learning (ICML), 2020
Yucheng Lu, Christopher De Sa
15 Jun 2020

Curvature-Exploiting Acceleration of Elastic Net Computations
Vien V. Mai, M. Johansson
24 Jan 2019

Newton-MR: Inexact Newton Method With Minimum Residual Sub-problem Solver
Fred Roosta, Yang Liu, Peng Xu, Michael W. Mahoney
30 Sep 2018

Statistical Inference for the Population Landscape via Moment Adjusted Stochastic Gradients
Tengyuan Liang, Weijie Su
20 Dec 2017

Lower Bounds for Higher-Order Convex Optimization
Annual Conference on Computational Learning Theory (COLT), 2017
Naman Agarwal, Elad Hazan
27 Oct 2017

Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information
Peng Xu, Farbod Roosta-Khorasani, Michael W. Mahoney
23 Aug 2017

Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations
Jialei Wang, Tong Zhang
21 Jun 2017

Limitations on Variance-Reduction and Acceleration Schemes for Finite Sum Optimization
Neural Information Processing Systems (NeurIPS), 2017
Yossi Arjevani
06 Jun 2017

An Investigation of Newton-Sketch and Subsampled Newton Methods
A. Berahas, Raghu Bollapragada, J. Nocedal
17 May 2017

On the Gap Between Strict-Saddles and True Convexity: An Omega(log d) Lower Bound for Eigenvector Approximation
Max Simchowitz, A. Alaoui, Benjamin Recht
14 Apr 2017

On the Fine-Grained Complexity of Empirical Risk Minimization: Kernel Methods and Neural Networks
A. Backurs, Piotr Indyk, Ludwig Schmidt
10 Apr 2017