Complexity of Minimizing Projected-Gradient-Dominated Functions with Stochastic First-order Oracles

This work investigates the performance limits of projected stochastic first-order methods for minimizing functions under the $(\alpha,\tau)$-projected-gradient-dominance property, which asserts that the sub-optimality gap is upper-bounded by $\tau\,\|\mathcal{G}_{\eta}(x)\|^{\alpha}$ for some $\tau > 0$ and $\alpha \in [1,2]$, where $\mathcal{G}_{\eta}(x)$ is the projected-gradient mapping with $\eta > 0$ as a parameter. For non-convex functions, we establish a lower bound on the complexity of querying a batch smooth first-order stochastic oracle to obtain an $\epsilon$-global-optimum point. Furthermore, we show that a projected variance-reduced first-order algorithm attains a matching upper complexity bound. For convex functions, we establish a complexity lower bound for minimizing functions under a local version of the gradient-dominance property, which matches the upper complexity bound of accelerated stochastic subgradient methods.
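For reference, the projected-gradient mapping referred to above is presumably the standard one from constrained smooth optimization: for a feasible set $\mathcal{X}$ and step-size parameter $\eta > 0$,

```latex
\mathcal{G}_{\eta}(x) \;=\; \frac{1}{\eta}\Bigl(x - \operatorname{proj}_{\mathcal{X}}\bigl(x - \eta \nabla f(x)\bigr)\Bigr),
```

so that in the unconstrained case ($\mathcal{X} = \mathbb{R}^d$) it reduces to $\nabla f(x)$, and the dominance condition reads $f(x) - \min_{y \in \mathcal{X}} f(y) \le \tau\,\|\mathcal{G}_{\eta}(x)\|^{\alpha}$.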
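To make "projected variance-reduced first-order algorithm" concrete, here is a generic SPIDER-style sketch: a recursive stochastic-gradient estimator with periodic full-batch refreshes, followed by a projected step. This is an illustrative template on a toy box-constrained finite-sum quadratic, not the paper's algorithm; all names, constants, and the toy objective are assumptions.

```python
import numpy as np

def project_box(x, lo=0.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^d."""
    return np.clip(x, lo, hi)

# Toy finite-sum objective: f(x) = (1/n) * sum_i 0.5 * ||x - a_i||^2,
# minimized over the box [0, 1]^d. Its constrained minimizer is
# clip(mean(a_i)), which lets us check convergence directly.
rng = np.random.default_rng(0)
n, d = 64, 5
A = rng.normal(size=(n, d))

def grad_i(x, i):
    """Stochastic gradient of component i."""
    return x - A[i]

def full_grad(x):
    """Exact (full-batch) gradient."""
    return x - A.mean(axis=0)

def projected_spider(x0, eta=0.1, epoch_len=32, n_steps=400):
    """SPIDER-style variance reduction with a projection each step."""
    x_prev = x = x0.copy()
    v = full_grad(x)
    for t in range(n_steps):
        if t % epoch_len == 0:
            v = full_grad(x)                           # periodic full refresh
        else:
            i = rng.integers(n)
            v = v + grad_i(x, i) - grad_i(x_prev, i)   # recursive estimator
        x_prev, x = x, project_box(x - eta * v)        # projected step
    return x

x_star = project_box(A.mean(axis=0))       # constrained optimum of the toy problem
x_hat = projected_spider(np.full(d, 0.5))
print(np.linalg.norm(x_hat - x_star))      # distance to optimum; small after 400 steps
```

Note that on this particular quadratic the gradient differences are deterministic, so the recursive estimator stays exact; the same template applies when the components are genuinely noisy.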