ResearchTrend.AI


arXiv:1803.09539 (v7, latest)

Revisiting First-Order Convex Optimization Over Linear Spaces

26 March 2018
Francesco Locatello
Anant Raj
Sai Praneeth Karimireddy
Gunnar Rätsch
Bernhard Schölkopf
Sebastian U. Stich
Martin Jaggi
Abstract

Two popular examples of first-order optimization methods over linear spaces are coordinate descent and matching pursuit algorithms, with their randomized variants. While the former targets the optimization by moving along coordinates, the latter considers a generalized notion of directions. Exploiting the connection between the two algorithms, we present a unified analysis of both, providing affine invariant sublinear $\mathcal{O}(1/t)$ rates on smooth objectives and linear convergence on strongly convex objectives. As a byproduct of our affine invariant analysis of matching pursuit, our rates for steepest coordinate descent are the tightest known. Furthermore, we show the first accelerated convergence rate $\mathcal{O}(1/t^2)$ for matching pursuit on convex objectives.
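To make the connection concrete, here is a minimal sketch (not the paper's implementation) of steepest matching pursuit on a least-squares objective: each step picks the dictionary atom most aligned with the gradient and takes an exact line-search step along it. When the atom dictionary is the standard basis, the update reduces to steepest (Gauss-Southwell) coordinate descent. All names and the problem instance below are illustrative.

```python
import numpy as np

def steepest_mp_step(x, A, b, atoms):
    """One step of steepest matching pursuit on f(x) = 0.5 * ||Ax - b||^2.

    Selects the atom z (a row of `atoms`) with the largest |<grad f(x), z>|,
    then takes an exact line-search step along it. With `atoms` equal to the
    standard basis, this is steepest coordinate descent.
    """
    g = A.T @ (A @ x - b)                      # gradient of f at x
    scores = atoms @ g                         # <g, z> for every atom z
    i = np.argmax(np.abs(scores))              # steepest direction
    z = atoms[i]
    gamma = -scores[i] / np.dot(A @ z, A @ z)  # exact line search along z
    return x + gamma * z

# Strongly convex instance: full-column-rank least squares.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
atoms = np.eye(5)                              # standard basis -> coordinate descent
x = np.zeros(5)
for _ in range(2000):
    x = steepest_mp_step(x, A, b, atoms)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Because the atoms span the whole space and A has full column rank, the objective is strongly convex over the linear span, so (matching the abstract's strongly convex regime) the iterates converge linearly to the least-squares solution `x_star`.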
