An Extragradient-Based Alternating Direction Method for Convex
Minimization
In this paper, we consider the problem of minimizing the sum of two convex functions subject to linear linking constraints. Classical alternating direction type methods usually assume that both convex functions have relatively easy proximal mappings. However, many problems arising in statistics, image processing, and other fields have the structure that only one of the two functions has an easy proximal mapping, while the other is smooth and convex but does not have an easy proximal mapping. Therefore, the classical alternating direction methods cannot be applied. For solving this class of problems, we propose in this paper an alternating direction method based on extragradients. Under the assumption that the smooth function has a Lipschitz continuous gradient, we prove that the proposed method returns an ε-optimal solution within O(1/ε) iterations. We test the performance of different variants of the proposed method by solving the basis pursuit problem arising in compressed sensing. We then apply the proposed method to solve a new statistical model called fused logistic regression. Our numerical experiments show that the proposed method performs well on the test problems.
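The idea the abstract describes can be illustrated with a toy sketch. This is not the paper's exact algorithm; it is a minimal prediction-correction (extragradient) variant of an alternating direction scheme, under the assumed splitting min f(x) + g(z) s.t. x = z, where f(x) = ½‖Ax − b‖² is smooth without an easy prox and g(z) = λ‖z‖₁ has the soft-thresholding prox. All names, step sizes, and the problem instance are illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not the paper's exact method): the z-block is updated
# exactly via its easy prox, while the x-block -- whose prox is hard --
# is handled by an extragradient (prediction-correction) pair of gradient
# steps on the augmented Lagrangian. Scaled-multiplier ADMM notation.
rng = np.random.default_rng(0)
m, n, lam = 20, 30, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

grad_f = lambda v: A.T @ (A @ v - b)                 # gradient of smooth f
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

beta = 1.0                                           # penalty parameter
L_aug = np.linalg.norm(A, 2) ** 2 + beta             # Lipschitz const. of x-block
tau = 1.0 / L_aug                                    # conservative step size

x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)    # u: scaled multiplier
for _ in range(2000):
    # z-step: exact minimization using the easy prox of g
    z = prox_g(x + u, 1.0 / beta)
    # x-step: extragradient pair replaces the hard exact minimization
    grad = lambda v: grad_f(v) + beta * (v - z + u)  # grad of aug. Lagrangian in x
    x_pred = x - tau * grad(x)                       # prediction step
    x = x - tau * grad(x_pred)                       # correction at predicted point
    # multiplier (scaled dual) update
    u = u + (x - z)

print(np.linalg.norm(x - z))                         # constraint residual, small
```

The key design point mirrored from the abstract: only g needs a computable prox, and the smooth block f is touched exclusively through its gradient, which is why a Lipschitz continuous gradient is the natural assumption.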