We consider the convex-concave saddle point problem min_x max_y f(x) + y^T A x − g(y), where f is smooth and convex and g is smooth and strongly convex. We prove that if the coupling matrix A has full column rank, the vanilla primal-dual gradient method achieves linear convergence even if f is not strongly convex. Our result generalizes previous work, which either requires f and g to be quadratic functions or requires proximal mappings for both f and g. We adopt a novel analysis technique that in each iteration uses a "ghost" update as a reference, and show that the iterates of the primal-dual gradient method converge to this "ghost" sequence. Using the same technique, we further give an analysis of the primal-dual stochastic variance reduced gradient (SVRG) method for convex-concave saddle point problems with a finite-sum structure.
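The vanilla primal-dual gradient method referenced above alternates a gradient descent step in x with a gradient ascent step in y. The sketch below illustrates it on a toy instance in which f is convex but not strongly convex (f ≡ 0), g is strongly convex, and A has full column rank; the step size, problem data, and function names here are illustrative choices, not taken from the paper.

```python
import numpy as np

def primal_dual_gradient(grad_f, grad_g, A, x0, y0, eta=0.05, iters=2000):
    """Simultaneous primal-dual gradient iteration for
    min_x max_y f(x) + y^T A x - g(y). Illustrative sketch only."""
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        # gradient descent on the primal variable x ...
        x_new = x - eta * (grad_f(x) + A.T @ y)
        # ... and gradient ascent on the dual variable y
        y_new = y + eta * (A @ x - grad_g(y))
        x, y = x_new, y_new
    return x, y

# Toy instance: f(x) = 0 (convex but NOT strongly convex),
# g(y) = ||y||^2 / 2 (strongly convex), A a 3x2 matrix of rank 2.
# The unique saddle point is (x*, y*) = (0, 0).
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
x, y = primal_dual_gradient(grad_f=lambda x: np.zeros_like(x),
                            grad_g=lambda y: y,
                            A=A,
                            x0=np.ones(2),
                            y0=np.zeros(3))
print(np.linalg.norm(x), np.linalg.norm(y))  # both small after 2000 steps
```

With this data the iterates contract geometrically toward the saddle point, consistent with the linear-convergence regime the abstract describes (strongly convex g, full-column-rank A, no strong convexity of f).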