Improved Analysis for Sign-based Methods with Momentum Updates

Wei Jiang
Dingzhi Yu
Sifan Yang
Wenhao Yang
Lijun Zhang
Main: 9 pages · Bibliography: 2 pages · Appendix: 9 pages · 4 figures · 2 tables
Abstract

In this paper, we present enhanced analysis for sign-based optimization algorithms with momentum updates. Traditional sign-based methods, under the separable smoothness assumption, guarantee a convergence rate of $\mathcal{O}(T^{-1/4})$, but they either require large batch sizes or assume unimodal symmetric stochastic noise. To address these limitations, we demonstrate that signSGD with momentum can achieve the same convergence rate using constant batch sizes without additional assumptions. Our analysis, under the standard $\ell_2$-smoothness condition, improves upon the result of the prior momentum-based signSGD method by a factor of $\mathcal{O}(d^{1/2})$, where $d$ is the problem dimension. Furthermore, we explore sign-based methods with majority vote in distributed settings and show that the proposed momentum-based method yields convergence rates of $\mathcal{O}\left( d^{1/2}T^{-1/2} + dn^{-1/2} \right)$ and $\mathcal{O}\left( \max \{ d^{1/4}T^{-1/4}, d^{1/10}T^{-1/5} \} \right)$, which outperform the previous results of $\mathcal{O}\left( dT^{-1/4} + dn^{-1/2} \right)$ and $\mathcal{O}\left( d^{3/8}T^{-1/8} \right)$, respectively. Numerical experiments further validate the effectiveness of the proposed methods.
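The generic scheme the abstract refers to, signSGD with momentum, keeps an exponential moving average of stochastic gradients and steps in the direction of its sign. The following is a minimal sketch of that template on a toy problem; the step size, momentum parameter, and their schedules are illustrative placeholders, not the paper's analyzed choices.

```python
import numpy as np

def signsgd_momentum(grad_fn, x0, lr=0.01, beta=0.9, steps=400):
    """Momentum-based signSGD sketch.

    Maintains a momentum buffer m as an EMA of gradients and updates
    the iterate with the sign of m. Hyperparameters here are
    illustrative, not the schedules from the paper.
    """
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)
    for _ in range(steps):
        g = grad_fn(x)
        m = beta * m + (1.0 - beta) * g   # momentum update (EMA of gradients)
        x = x - lr * np.sign(m)           # sign-based step, same size per coordinate
    return x

# Usage: minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
x_final = signsgd_momentum(lambda x: x, np.array([3.0, -2.0]))
```

Because each coordinate moves by exactly `lr` per step, the iterate converges to a neighborhood of the minimizer of radius on the order of the step size, which is why sign-based analyses typically use decaying or carefully tuned step sizes.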
