ResearchTrend.AI
Efficient Convex Optimization Requires Superlinear Memory

29 March 2022
A. Marsden
Vatsal Sharan
Aaron Sidford
Gregory Valiant
Abstract

We show that any memory-constrained, first-order algorithm which minimizes $d$-dimensional, $1$-Lipschitz convex functions over the unit ball to $1/\mathrm{poly}(d)$ accuracy using at most $d^{1.25 - \delta}$ bits of memory must make at least $\tilde{\Omega}(d^{1 + (4/3)\delta})$ first-order queries (for any constant $\delta \in [0, 1/4]$). Consequently, the performance of such memory-constrained algorithms is a polynomial factor worse than the optimal $\tilde{O}(d)$ query bound for this problem obtained by cutting plane methods that use $\tilde{O}(d^2)$ memory. This resolves a COLT 2019 open problem of Woodworth and Srebro.
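To make the stated memory/query tradeoff concrete, the following sketch tabulates the exponents as $\delta$ varies over $[0, 1/4]$. This is purely illustrative arithmetic on the exponents from the theorem statement (polylogarithmic factors suppressed); the function name `tradeoff` is ours, not from the paper.

```python
def tradeoff(delta):
    """For an algorithm using at most d^(1.25 - delta) bits of memory,
    the lower bound is ~d^(1 + (4/3) * delta) first-order queries
    (up to polylog factors), per the abstract's theorem statement."""
    assert 0.0 <= delta <= 0.25, "theorem covers constant delta in [0, 1/4]"
    memory_exponent = 1.25 - delta
    query_exponent = 1.0 + (4.0 / 3.0) * delta
    return memory_exponent, query_exponent

# Endpoints of the tradeoff: with ~d^1.25 memory the bound is the trivial
# ~d queries; with only ~d memory (delta = 1/4) at least ~d^(4/3) queries
# are needed, a polynomial gap versus the ~d queries of cutting plane
# methods, which use ~d^2 memory.
for delta in (0.0, 0.125, 0.25):
    m, q = tradeoff(delta)
    print(f"delta={delta}: memory ~ d^{m}, queries >= ~d^{q:.3f}")
```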
