On the Performance Analysis of Momentum Method: A Frequency Domain Perspective

Momentum-based optimizers are widely adopted for training neural networks. However, the optimal selection of momentum coefficients remains elusive, and this uncertainty impedes a clear understanding of the role of momentum in stochastic gradient methods. In this paper, we present a frequency domain analysis framework that interprets the momentum method as a time-variant filter for gradients, where adjustments to momentum coefficients modify the filter characteristics. Our experiments support this perspective and provide a deeper understanding of the underlying mechanism. Moreover, our analysis reveals the following significant findings: high-frequency gradient components are undesirable in the late stages of training, whereas preserving the original gradient in the early stages and gradually amplifying low-frequency gradient components as training progresses both enhance performance. Based on these insights, we propose Frequency Stochastic Gradient Descent with Momentum (FSGDM), a heuristic optimizer that dynamically adjusts the momentum filtering characteristic according to an empirically effective dynamic magnitude response. Experimental results demonstrate the superiority of FSGDM over conventional momentum optimizers.
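To make the filter interpretation concrete, below is a minimal Python sketch based on the standard heavy-ball update m_t = beta * m_{t-1} + g_t, whose transfer function H(z) = 1 / (1 - beta * z^{-1}) is a first-order low-pass filter. This is an illustrative assumption about the filtering view described in the abstract, not a reproduction of the paper's exact formulation or of the FSGDM schedule.

import numpy as np

def momentum_filter(gradients, beta):
    # Heavy-ball momentum as a recursive filter over the gradient
    # sequence: m_t = beta * m_{t-1} + g_t. Larger beta averages over
    # a longer gradient history, i.e., a narrower low-pass band.
    m = np.zeros_like(gradients[0])
    filtered = []
    for g in gradients:
        m = beta * m + g
        filtered.append(m.copy())
    return filtered

def magnitude_response(beta, omega):
    # Magnitude of H(z) = 1 / (1 - beta * z^{-1}) on the unit circle
    # z = exp(j * omega), with omega in (0, pi].
    return 1.0 / np.abs(1.0 - beta * np.exp(-1j * omega))

omegas = np.linspace(1e-3, np.pi, 5)
for beta in (0.5, 0.9):
    print(f"beta={beta}:", np.round(magnitude_response(beta, omegas), 2))

At omega near 0 the gain approaches 1 / (1 - beta), while at omega = pi it falls to 1 / (1 + beta), so increasing the momentum coefficient amplifies low-frequency gradient components and attenuates high-frequency ones, consistent with the abstract's claim that adjusting the coefficient reshapes the filter.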
@article{li2025_2411.19671,
  title={On the Performance Analysis of Momentum Method: A Frequency Domain Perspective},
  author={Xianliang Li and Jun Luo and Zhiwei Zheng and Hanxiao Wang and Li Luo and Lingkun Wen and Linlong Wu and Sheng Xu},
  journal={arXiv preprint arXiv:2411.19671},
  year={2025}
}