
Convergence of block coordinate descent with diminishing radius for nonconvex optimization

7 December 2020
Hanbaek Lyu
arXiv: 2012.03503
Abstract

Block coordinate descent (BCD), also known as nonlinear Gauss-Seidel, is a simple iterative algorithm for nonconvex optimization that sequentially minimizes the objective function in each block of coordinates while the other coordinates are held fixed. It is known that block-wise convexity of the objective is not enough to guarantee convergence of BCD to stationary points, and some additional regularity condition is needed. In this work, we provide a simple modification of BCD with guaranteed global convergence to stationary points for block-wise convex objective functions, without additional conditions. Our idea is to restrict the parameter search within a diminishing radius to promote stability of the iterates, and then to show that this auxiliary constraint vanishes in the limit. As an application, we provide a modified alternating least squares algorithm for nonnegative CP tensor factorization that is guaranteed to converge to stationary points of the reconstruction error function. We also provide experimental validation of our results.
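
As an illustration of the idea described in the abstract, below is a minimal sketch of BCD with a diminishing search radius: each block subproblem is minimized within a ball of radius r_n around the previous iterate, with r_n shrinking to 0 so that the auxiliary constraint vanishes in the limit. The generic SLSQP solver, the schedule r_n = 1/sqrt(n+1), and all function names here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize

def bcd_diminishing_radius(f, blocks_init,
                           radius=lambda n: 1.0 / np.sqrt(n + 1),
                           n_iters=100):
    """Sketch of BCD where each block update is restricted to a ball of
    radius r_n around the previous iterate (r_n -> 0).

    f           : objective taking a list of block arrays
    blocks_init : list of initial block arrays
    radius      : schedule n -> r_n (illustrative choice, not the paper's)
    """
    blocks = [np.asarray(b, dtype=float).copy() for b in blocks_init]
    for n in range(n_iters):
        r = radius(n)
        for i in range(len(blocks)):
            prev = blocks[i].copy()
            shape = prev.shape

            # Objective restricted to block i, other blocks held fixed.
            def f_i(x):
                trial = blocks[:i] + [x.reshape(shape)] + blocks[i + 1:]
                return f(trial)

            # Diminishing-radius constraint ||x - prev|| <= r_n, expressed
            # as an inequality constraint for a generic solver.
            cons = [{"type": "ineq",
                     "fun": lambda x, p=prev.ravel(), r=r:
                         r - np.linalg.norm(x - p)}]
            res = minimize(f_i, prev.ravel(), constraints=cons,
                           method="SLSQP")
            blocks[i] = res.x.reshape(shape)
    return blocks

# Example: rank-1 matrix approximation, block-wise convex but jointly
# nonconvex, in the spirit of the factorization problems in the abstract.
target = np.ones((3, 3))
f = lambda B: np.sum((np.outer(B[0], B[1]) - target) ** 2)
out = bcd_diminishing_radius(f, [np.random.rand(3), np.random.rand(3)])
```

For the nonnegative CP application mentioned in the abstract, one would presumably add nonnegativity bounds to each block's least-squares subproblem; the sketch above only conveys the diminishing-radius mechanism.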
