A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization

22 August 2019
Guangzeng Xie
Luo Luo
Zhihua Zhang
arXiv:1908.08394 (PDF / HTML)
Abstract

This paper studies the lower bound complexity for the optimization problem whose objective function is the average of $n$ individual smooth convex functions. We consider algorithms that have access to a gradient and proximal oracle for each individual component. For the strongly convex case, we prove that such an algorithm cannot reach an $\varepsilon$-suboptimal point in fewer than $\Omega\big((n+\sqrt{\kappa n})\log(1/\varepsilon)\big)$ iterations, where $\kappa$ is the condition number of the objective function. This lower bound is tighter than previous results and perfectly matches the upper bound of the existing proximal incremental first-order oracle algorithm Point-SAGA. We develop a novel construction to establish this result, which partitions the tridiagonal matrix of the classical hard instance into $n$ groups. This construction is well suited to the analysis of the proximal oracle and also extends naturally to the general convex and average smooth cases.
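For reference, the finite-sum setting described in the abstract can be written in the standard form below (a sketch only; the dimension $d$ and the explicit decomposition are assumptions, since the abstract does not state them):

\[
\min_{x \in \mathbb{R}^d} \; f(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(x),
\qquad \text{each } f_i \text{ smooth and convex,}
\]

and the paper's lower bound says that, in the strongly convex case, any algorithm restricted to gradient and proximal oracles for the individual components needs at least $\Omega\big((n+\sqrt{\kappa n})\log(1/\varepsilon)\big)$ oracle calls to reach an $\varepsilon$-suboptimal point, where $\kappa$ is the condition number of $f$.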
