The Proxy Step-size Technique for Regularized Optimization on the Sphere Manifold

5 September 2022
Fang Bai
Adrien Bartoli
Abstract

We give an effective solution to the regularized optimization problem $g(\boldsymbol{x}) + h(\boldsymbol{x})$, where $\boldsymbol{x}$ is constrained to the unit sphere $\Vert \boldsymbol{x} \Vert_2 = 1$. Here $g(\cdot)$ is a smooth cost with Lipschitz continuous gradient within the unit ball $\{\boldsymbol{x} : \Vert \boldsymbol{x} \Vert_2 \le 1\}$, whereas $h(\cdot)$ is typically non-smooth but convex and absolutely homogeneous, e.g., norm regularizers and their combinations. Our solution is based on the Riemannian proximal gradient, using an idea we call the proxy step-size: a scalar variable which we prove is monotone with respect to the actual step-size within an interval. The proxy step-size exists ubiquitously for convex and absolutely homogeneous $h(\cdot)$, and determines the actual step-size and the tangent update in closed form, and thus the complete proximal gradient iteration. Based on these insights, we design a Riemannian proximal gradient method using the proxy step-size. We prove that our method converges to a critical point, guided by a line-search technique based on the $g(\cdot)$ cost only. The proposed method can be implemented in a couple of lines of code. We show its usefulness by applying nuclear norm, $\ell_1$ norm, and nuclear-spectral norm regularization to three classical computer vision problems. The improvements are consistent and backed by numerical experiments.
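The paper's closed-form proxy step-size is not reproduced here, but the problem class it targets is easy to sketch. Below is a minimal Python (NumPy) baseline for the $\ell_1$-regularized case: a Riemannian gradient step on $g$, the Euclidean soft-thresholding prox for $h = \lambda \Vert \cdot \Vert_1$, and a renormalization retraction back onto the sphere. The function names (sphere_prox_grad, soft_threshold) and all parameter values are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def soft_threshold(z, tau):
        # Euclidean proximal operator of tau * ||.||_1 (soft-thresholding).
        return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

    def sphere_prox_grad(grad_g, x0, step=1e-2, lam=1e-3, iters=500):
        # Generic projected proximal-gradient sketch for
        # min g(x) + lam * ||x||_1  subject to  ||x||_2 = 1.
        # NOT the paper's proxy step-size method; a simple baseline only.
        x = x0 / np.linalg.norm(x0)
        for _ in range(iters):
            eg = grad_g(x)                  # Euclidean gradient of g
            rg = eg - (x @ eg) * x          # project onto tangent space of the sphere
            y = soft_threshold(x - step * rg, step * lam)  # gradient step + l1 prox
            n = np.linalg.norm(y)
            if n == 0.0:                    # prox may zero out y if lam is too large
                break
            x = y / n                       # retraction: renormalize onto the sphere
        return x

    # Usage: sparse least squares g(x) = 0.5 * ||A x - b||^2 on the sphere.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 10))
    b = rng.standard_normal(20)
    x = sphere_prox_grad(lambda x: A.T @ (A @ x - b),
                         rng.standard_normal(10), lam=0.1)

The paper's method differs in that the proxy step-size yields the tangent update and the actual step-size in closed form, with a line search on the $g(\cdot)$ cost only; the sketch above instead uses a fixed step and a Euclidean prox as a rough stand-in.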

View on arXiv: https://arxiv.org/abs/2209.01812