We design accelerated algorithms with improved rates for several fundamental classes of optimization problems. Our algorithms all build upon techniques related to the analysis of primal-dual extragradient methods via relative Lipschitzness proposed recently by [CST21].

(1) Separable minimax optimization. We study separable minimax optimization problems $\min_x \max_y f(x) + h(x, y) - g(y)$, where $f$ and $g$ have smoothness and strong convexity parameters $(L^x, \mu^x)$ and $(L^y, \mu^y)$ respectively, and $h$ is convex-concave with a $(\Lambda^{xx}, \Lambda^{xy}, \Lambda^{yy})$-blockwise operator norm bounded Hessian. We provide an algorithm with gradient query complexity $\tilde{O}\left(\sqrt{L^x/\mu^x} + \sqrt{L^y/\mu^y} + \Lambda^{xx}/\mu^x + \Lambda^{xy}/\sqrt{\mu^x \mu^y} + \Lambda^{yy}/\mu^y\right)$. Notably, for convex-concave minimax problems with bilinear coupling (e.g.\ quadratics), where $\Lambda^{xx} = \Lambda^{yy} = 0$, our rate matches a lower bound of [ZHZ19] (see the specialization sketched below).

(2) Finite sum optimization. We study finite sum optimization problems $\min_x \frac{1}{n} \sum_{i \in [n]} f_i(x)$, where each $f_i$ is $L_i$-smooth and the overall problem is $\mu$-strongly convex. We provide an algorithm with gradient query complexity $\tilde{O}\left(n + \sum_{i \in [n]} \sqrt{L_i/(n\mu)}\right)$. Notably, when the smoothness bounds $\{L_i\}_{i \in [n]}$ are non-uniform, our rate improves upon accelerated SVRG [LMH15, FGKS15] and Katyusha [All17] by up to a $\sqrt{n}$ factor (see the comparison sketched below).

(3) Minimax finite sums. We generalize our algorithms for minimax and finite sum optimization to solve a natural family of minimax finite sum optimization problems at an accelerated rate, encapsulating both of the above results up to a logarithmic factor.
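To make the bilinear case in (1) concrete, here is the specialization we have in mind (our own rearrangement of the bound above, using the same notation): setting $\Lambda^{xx} = \Lambda^{yy} = 0$ in the gradient query complexity leaves
\[
\tilde{O}\left(\sqrt{\frac{L^x}{\mu^x}} + \sqrt{\frac{L^y}{\mu^y}} + \frac{\Lambda^{xy}}{\sqrt{\mu^x \mu^y}}\right),
\]
which is the regime in which the [ZHZ19] lower bound applies.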
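To see how the $\sqrt{n}$ gap in (2) can arise, consider a skewed instance (our illustration, not taken from the abstract): $L_1 = L$ and $L_2 = \cdots = L_n \approx 0$, so the average smoothness is $\bar{L} = \frac{1}{n} \sum_{i \in [n]} L_i \approx L/n$. Writing the accelerated SVRG/Katyusha rate as $\tilde{O}(n + \sqrt{n\bar{L}/\mu})$, the two bounds compare as
\[
\tilde{O}\left(n + \sqrt{\frac{n\bar{L}}{\mu}}\right) \approx \tilde{O}\left(n + \sqrt{\frac{L}{\mu}}\right)
\qquad \text{versus} \qquad
\tilde{O}\left(n + \sum_{i \in [n]} \sqrt{\frac{L_i}{n\mu}}\right) \approx \tilde{O}\left(n + \sqrt{\frac{L}{n\mu}}\right),
\]
a $\sqrt{n}$ saving in the second term. By Cauchy--Schwarz, $\sum_{i \in [n]} \sqrt{L_i/(n\mu)} \le \sqrt{n\bar{L}/\mu}$ always, so the non-uniform bound is never worse.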