arXiv:1511.06566

Acceleration of the PDHGM on strongly convex subspaces

20 November 2015
T. Valkonen
Thomas Pock
Abstract

We propose several variants of the primal-dual method due to Chambolle and Pock. Without requiring full strong convexity of the objective functions, our methods are accelerated on subspaces with strong convexity. This yields mixed rates, O(1/N^2) with respect to initialisation and O(1/N) with respect to the dual sequence, and the residual part of the primal sequence. We demonstrate the efficacy of the proposed methods on image processing problems lacking strong convexity, such as total generalised variation denoising and total variation deblurring.
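
For orientation, below is a minimal sketch of the basic, unaccelerated PDHGM (Chambolle-Pock) iteration applied to a TV-L2 denoising problem of the kind mentioned in the abstract. It uses fixed step sizes throughout; the paper's contribution, acceleration of the step parameters on strongly convex subspaces, is not reproduced here. The function names, the regularisation weight `lam`, and the iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

def grad(u):
    """Forward-difference gradient of a 2D image; returns array of shape (2, H, W)."""
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return np.stack([gx, gy])

def div(p):
    """Discrete divergence, the negative adjoint of grad above."""
    px, py = p
    dx = np.zeros_like(px)
    dy = np.zeros_like(py)
    dx[0, :] = px[0, :]
    dx[1:-1, :] = px[1:-1, :] - px[:-2, :]
    dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]
    dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]
    dy[:, -1] = -py[:, -2]
    return dx + dy

def tv_denoise_pdhgm(f, lam=0.1, n_iter=200):
    """Basic PDHGM for min_u lam*TV(u) + 0.5*||u - f||^2 (no subspace acceleration)."""
    L2 = 8.0                          # upper bound on ||grad||^2 for forward differences
    tau = sigma = 1.0 / np.sqrt(L2)   # fixed steps satisfying tau * sigma * L^2 <= 1
    theta = 1.0                       # over-relaxation parameter
    u = f.copy()
    u_bar = f.copy()
    p = np.zeros((2,) + f.shape)      # dual variable
    for _ in range(n_iter):
        # Dual ascent step, then pointwise projection onto {|p| <= lam} (prox of F*).
        p = p + sigma * grad(u_bar)
        p = p / np.maximum(1.0, np.sqrt(p[0] ** 2 + p[1] ** 2) / lam)
        # Primal descent step with the closed-form prox of 0.5*||u - f||^2.
        u_old = u
        u = (u + tau * div(p) + tau * f) / (1.0 + tau)
        # Over-relaxation of the primal iterate.
        u_bar = u + theta * (u - u_old)
    return u
```

This L2 data term is strongly convex in the primal variable, while the dual variable of the TV term is not, which is the kind of partial strong convexity the paper exploits.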
