arXiv:2306.12534
Memory-Query Tradeoffs for Randomized Convex Optimization

21 June 2023
X. Chen
Binghui Peng
Abstract

We show that any randomized first-order algorithm which minimizes a $d$-dimensional, $1$-Lipschitz convex function over the unit ball must either use $\Omega(d^{2-\delta})$ bits of memory or make $\Omega(d^{1+\delta/6-o(1)})$ queries, for any constant $\delta \in (0,1)$ and when the precision $\epsilon$ is quasipolynomially small in $d$. Our result implies that cutting plane methods, which use $\tilde{O}(d^2)$ bits of memory and $\tilde{O}(d)$ queries, are Pareto-optimal among randomized first-order algorithms, and quadratic memory is required to achieve optimal query complexity for convex optimization.
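To get a feel for how the two lower bounds trade off against each other, one can tabulate them for sample values of $\delta$. This is a minimal numeric sketch only: the $o(1)$ term and all hidden constants in the $\Omega(\cdot)$ notation are omitted, and the dimension $d$ is an arbitrary illustrative choice.

```python
# Illustrative evaluation of the two lower bounds from the theorem.
# Constants and the o(1) term in the exponent are omitted (assumption).

def memory_lower_bound(d: int, delta: float) -> float:
    """Omega(d^{2-delta}) bits of memory (constants dropped)."""
    return d ** (2 - delta)

def query_lower_bound(d: int, delta: float) -> float:
    """Omega(d^{1+delta/6-o(1)}) queries (o(1) term dropped)."""
    return d ** (1 + delta / 6)

d = 10_000  # illustrative dimension
for delta in (0.2, 0.5, 0.8):
    print(f"delta={delta}: memory >= d^{2 - delta:.2f} = {memory_lower_bound(d, delta):.3e}, "
          f"queries >= d^{1 + delta / 6:.3f} = {query_lower_bound(d, delta):.3e}")
```

As $\delta$ grows, the memory requirement in the first branch weakens while the query requirement in the second branch strengthens, which is the tradeoff the theorem captures; cutting plane methods sit at the $\tilde{O}(d^2)$-memory, $\tilde{O}(d)$-query corner of this curve.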
