ResearchTrend.AI

arXiv:2208.00208

DRSOM: A Dimension Reduced Second-Order Method

30 July 2022
Chuwen Zhang
Dongdong Ge
Chang He
Bo Jiang
Yuntian Jiang
Yi-Li Ye
Abstract

In this paper, we propose a Dimension-Reduced Second-Order Method (DRSOM) for convex and nonconvex (unconstrained) optimization. Under a trust-region-like framework, our method preserves the convergence properties of second-order methods while using curvature information only along a few directions. Consequently, the computational overhead of our method remains comparable to that of first-order methods such as gradient descent. Theoretically, we show that the method has local quadratic convergence and a global convergence rate of O(ε^{-3/2}) to satisfy the first-order and second-order optimality conditions, provided the subspace satisfies a commonly adopted approximate-Hessian assumption. We further show that this assumption can be removed if we periodically perform a corrector step using a Krylov-like method in the end stage of the algorithm. The applicability and performance of DRSOM are demonstrated by various computational experiments, including L_2-L_p minimization, CUTEst problems, and sensor network localization.
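To make the idea concrete, the following is a minimal sketch of a dimension-reduced second-order iteration in the spirit the abstract describes: each step minimizes a quadratic model over a low-dimensional subspace (here spanned by the negative gradient and the previous step), with a simple trust-region-style safeguard. This is an illustrative toy, not the authors' algorithm; the function names, the finite-difference Hessian-vector product, and the Levenberg-style shift are all assumptions made for the sketch.

```python
import numpy as np

def hvp(grad_fn, x, v, eps=1e-6):
    """Approximate the Hessian-vector product H(x) @ v by differencing gradients.

    Avoids forming the full Hessian, so curvature is probed only along v.
    """
    return (grad_fn(x + eps * v) - grad_fn(x - eps * v)) / (2.0 * eps)

def drsom_like_minimize(grad_fn, x0, iters=100, radius=1.0, tol=1e-8):
    """Toy dimension-reduced second-order loop (illustrative sketch only).

    Each step solves a small quadratic model restricted to the subspace
    spanned by the negative gradient and the previous step, keeping the
    step inside a trust region of the given radius.
    """
    x = x0.astype(float)
    prev = np.zeros_like(x)
    for _ in range(iters):
        g = grad_fn(x)
        if np.linalg.norm(g) < tol:
            break
        # Subspace basis: steepest-descent direction, plus momentum if available.
        dirs = [-g] + ([prev] if np.linalg.norm(prev) > 1e-12 else [])
        V = np.stack(dirs, axis=1)                    # n x m, with m in {1, 2}
        c = V.T @ g                                   # reduced gradient (m,)
        H_cols = [hvp(grad_fn, x, V[:, j]) for j in range(V.shape[1])]
        Q = V.T @ np.stack(H_cols, axis=1)            # reduced Hessian (m x m)
        Q = 0.5 * (Q + Q.T)                           # symmetrize
        # Levenberg-style shift: increase lam until the model step is
        # well defined and lies inside the trust region.
        a, lam = None, 0.0
        for _ in range(60):
            try:
                a = np.linalg.solve(Q + lam * np.eye(Q.shape[0]), -c)
            except np.linalg.LinAlgError:
                a = None
            if a is not None and np.linalg.norm(V @ a) <= radius:
                break
            lam = max(2.0 * lam, 1e-4)
        if a is None:                                 # fall back to a gradient step
            step = -radius * g / np.linalg.norm(g)
        else:
            step = V @ a
        x = x + step
        prev = step
    return x
```

On a strictly convex quadratic, exact minimization over this gradient-plus-momentum subspace reproduces conjugate-gradient-like behavior, which is one intuition for why so small a subspace can retain second-order convergence properties.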
