Global Convergence of Model Function Based Bregman Proximal Minimization Algorithms

24 December 2020
Mahesh Chandra Mukkamala
M. Fadili
Peter Ochs
arXiv:2012.13161 · PDF · HTML
Abstract

Lipschitz continuity of the gradient mapping of a continuously differentiable function plays a crucial role in designing various optimization algorithms. However, many functions arising in practical applications, such as low-rank matrix factorization or deep neural network problems, do not have a Lipschitz continuous gradient. This led to the development of a generalized notion known as the $L$-smad property, which is based on generalized proximity measures called Bregman distances. However, the $L$-smad property cannot handle nonsmooth functions: even simple nonsmooth functions such as $|x^4 - 1|$, as well as many practical composite problems, are out of its scope. We fix this issue by proposing the MAP property, which generalizes the $L$-smad property and is also valid for a large class of nonconvex nonsmooth composite problems. Based on the proposed MAP property, we propose a globally convergent algorithm called Model BPG, which unifies several existing algorithms. The convergence analysis is based on a new Lyapunov function. We also numerically illustrate the superior performance of Model BPG on standard phase retrieval problems, robust phase retrieval problems, and Poisson linear inverse problems, compared to a state-of-the-art optimization method that is valid for generic nonconvex nonsmooth optimization problems.
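To make the setting concrete, below is a minimal sketch of a single Bregman proximal gradient step for the standard phase retrieval problem mentioned in the abstract, i.e. minimizing $f(x) = \frac{1}{4}\sum_i ((a_i^\top x)^2 - b_i)^2$. It uses the Bregman distance $D_h(x, y) = h(x) - h(y) - \langle \nabla h(y), x - y \rangle$ induced by the quartic kernel $h(x) = \frac{1}{4}\|x\|^4 + \frac{1}{2}\|x\|^2$, a common choice in the Bregman phase retrieval literature. This illustrates the basic BPG-type update that Model BPG builds on and generalizes; it is not the paper's implementation, and the function names and step-size bound below are illustrative assumptions.

import numpy as np

def bpg_step(x, A, b, tau):
    """One Bregman proximal gradient step for the phase retrieval
    objective f(x) = (1/4) * sum_i ((a_i^T x)^2 - b_i)^2, using the
    kernel h(x) = (1/4)*||x||^4 + (1/2)*||x||^2, whose gradient is
    grad h(x) = (||x||^2 + 1) * x.

    The step solves grad h(x_new) = grad h(x) - tau * grad f(x),
    which reduces to one scalar cubic equation in r = ||x_new||.
    In theory tau must satisfy tau <= 1/L, where L is the L-smad
    constant of f relative to h (it depends on A and b).
    """
    # Gradient of the phase retrieval objective at x.
    Ax = A @ x
    grad_f = A.T @ ((Ax ** 2 - b) * Ax)

    # Right-hand side g = grad h(x) - tau * grad f(x).
    g = (x @ x + 1.0) * x - tau * grad_f
    norm_g = np.linalg.norm(g)
    if norm_g == 0.0:
        return np.zeros_like(x)  # grad h(x_new) = 0 forces x_new = 0

    # x_new is parallel to g: x_new = (r / ||g||) * g, where r >= 0
    # solves r^3 + r = ||g||. The cubic is strictly increasing, so the
    # real root is unique; pick the numerically real root.
    roots = np.roots([1.0, 0.0, 1.0, -norm_g])
    r = roots[np.argmin(np.abs(roots.imag))].real
    return (r / norm_g) * g

A short synthetic run, with a step size derived from one admissible bound on the L-smad constant reported in the Bregman phase retrieval literature (an assumption here, not taken from this paper):

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = (A @ rng.standard_normal(10)) ** 2   # noiseless measurements
norms = np.linalg.norm(A, axis=1)
# Assumed bound: L = sum_i (3*||a_i||^4 + ||a_i||^2 * |b_i|).
tau = 1.0 / np.sum(3 * norms ** 4 + norms ** 2 * np.abs(b))
x = rng.standard_normal(10)
for _ in range(2000):
    x = bpg_step(x, A, b, tau)

The appeal of the Bregman step is that the subproblem stays tractable even though f has no globally Lipschitz gradient: all coupling between coordinates is absorbed into the direction g, leaving only a one-dimensional cubic to solve.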
