Accelerated Stochastic Gradient-free and Projection-free Methods

International Conference on Machine Learning (ICML), 2020
Abstract

In this paper, we propose a class of accelerated stochastic gradient-free and projection-free (a.k.a. zeroth-order Frank-Wolfe) methods to solve constrained stochastic and finite-sum nonconvex optimization problems. Specifically, we propose an accelerated stochastic zeroth-order Frank-Wolfe (Acc-SZOFW) method based on the variance-reduced technique of SPIDER/SpiderBoost and a novel momentum acceleration technique. Moreover, under some mild conditions, we prove that Acc-SZOFW has a function query complexity of $O(d\sqrt{n}\epsilon^{-2})$ for finding an $\epsilon$-stationary point in the finite-sum problem, which improves the existing best result by a factor of $O(\sqrt{n}\epsilon^{-2})$, and a function query complexity of $O(d\epsilon^{-3})$ in the stochastic problem, which improves the existing best result by a factor of $O(\epsilon^{-1})$. To relax the large batches required by Acc-SZOFW, we further propose a novel accelerated stochastic zeroth-order Frank-Wolfe method (Acc-SZOFW*) based on the new variance-reduced technique of STORM, which still reaches the function query complexity of $O(d\epsilon^{-3})$ in the stochastic problem without relying on any large batches. In particular, we present an accelerated framework for Frank-Wolfe methods based on the proposed momentum acceleration technique. Extensive experimental results on black-box adversarial attacks and robust black-box classification demonstrate the efficiency of our algorithms.
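To make the named ingredients concrete, below is a minimal Python sketch (not the authors' implementation) combining the three building blocks the abstract refers to: a coordinate-wise zeroth-order gradient estimator, a STORM-style recursive momentum estimator, and a projection-free Frank-Wolfe update via a linear minimization oracle. The toy least-squares objective, the $\ell_1$-ball constraint, and all step-size and momentum schedules are illustrative assumptions, not choices from the paper.

```python
import numpy as np

# Toy black-box finite-sum objective (an illustrative assumption, not the
# paper's benchmark): f(x) = (1/n) * sum_i (a_i^T x - b_i)^2 over an l1-ball.
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def f(x, i):
    """Stochastic loss on sample i; only function values are available."""
    return (A[i] @ x - b[i]) ** 2

def zo_grad(x, i, mu=1e-5):
    """Coordinate-wise finite-difference gradient estimate (2d queries)."""
    g = np.zeros(d)
    for j in range(d):
        e = np.zeros(d)
        e[j] = 1.0
        g[j] = (f(x + mu * e, i) - f(x - mu * e, i)) / (2.0 * mu)
    return g

def lmo_l1(g, r=5.0):
    """Linear minimization oracle over the l1-ball of radius r:
    argmin over ||v||_1 <= r of <g, v> is a signed vertex of the ball."""
    j = np.argmax(np.abs(g))
    v = np.zeros_like(g)
    v[j] = -r * np.sign(g[j])
    return v

# Zeroth-order Frank-Wolfe with a STORM-style recursive momentum estimator.
x = np.zeros(d)
m = zo_grad(x, rng.integers(n))       # initial gradient estimate
x_prev = x.copy()
x = x + 0.5 * (lmo_l1(m) - x)         # first Frank-Wolfe step
for t in range(2, 201):
    i = rng.integers(n)               # one fresh sample (no large batch)
    a = min(1.0, t ** (-2.0 / 3.0))   # momentum weight (assumed schedule)
    # STORM-style update: correct the running estimate with a same-sample
    # gradient difference instead of re-estimating from scratch.
    m = zo_grad(x, i) + (1.0 - a) * (m - zo_grad(x_prev, i))
    x_prev = x.copy()
    x = x + (2.0 / (t + 2)) * (lmo_l1(m) - x)  # projection-free update
```

The sketch illustrates why a STORM-type estimator can avoid large batches: each iteration needs only one fresh sample, reusing the previous estimate through a same-sample gradient difference, while the linear minimization oracle keeps the iterate feasible without projections.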
