Stochastic Momentum Methods for Non-smooth Non-Convex Finite-Sum Coupled Compositional Optimization

Finite-sum Coupled Compositional Optimization (FCCO), characterized by its coupled compositional objective structure, has emerged as an important optimization paradigm for addressing a wide range of machine learning problems. In this paper, we focus on a challenging class of non-convex non-smooth FCCO, where the outer functions are non-smooth weakly convex or convex and the inner functions are smooth or weakly convex. Existing state-of-the-art results face two key limitations: (1) a high iteration complexity under the assumption that the stochastic inner functions are Lipschitz continuous in expectation; (2) reliance on vanilla SGD-type updates, which are not well suited to deep learning applications. Our main contributions are twofold: (i) we propose stochastic momentum methods tailored to non-smooth FCCO that come with provable convergence guarantees; (ii) we establish a new state-of-the-art iteration complexity. Moreover, we apply our algorithms to non-convex optimization problems with multiple smooth or weakly convex functional inequality constraints. By optimizing a formulation based on a smoothed hinge penalty, we achieve a new state-of-the-art complexity for finding an (nearly) $\epsilon$-level KKT solution. Experiments on three tasks demonstrate the effectiveness of the proposed algorithms.
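
For context, the FCCO template referred to above is the coupled compositional objective (notation assumed here, following the standard FCCO setup rather than the paper's exact statement):

\[
\min_{\mathbf{x}\in\mathbb{R}^d}\; F(\mathbf{x}) \;=\; \frac{1}{n}\sum_{i=1}^{n} f_i\big(g_i(\mathbf{x})\big),
\qquad
g_i(\mathbf{x}) \;=\; \mathbb{E}_{\xi_i}\big[g_i(\mathbf{x};\xi_i)\big],
\]

where, in the regime considered here, each outer function $f_i$ is non-smooth weakly convex or convex, and each inner function $g_i$ is smooth or weakly convex and accessible only through stochastic samples $g_i(\mathbf{x};\xi_i)$.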
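
The abstract does not spell out the updates, so the following is only a minimal Python sketch of a generic single-loop stochastic momentum method for this template: a moving-average tracker for each inner value combined with a momentum (moving-average) gradient estimator, run on a toy instance. The toy problem, step sizes, and all names (eta, beta, gamma, inner_value, ...) are illustrative assumptions, not the paper's algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Toy FCCO instance (an assumption for illustration only):
#   outer function  f(s)   = max(0, s)              (non-smooth, convex)
#   inner functions g_i(x) = ||x - c_i||^2 - r_i    (smooth), observed with noise
n, d = 50, 10
C = rng.normal(size=(n, d))          # centers c_i
r = rng.uniform(1.0, 2.0, size=n)    # offsets r_i
noise = 0.1

def inner_value(i, x):
    # Unbiased stochastic estimate of g_i(x).
    return np.sum((x - C[i]) ** 2) - r[i] + noise * rng.normal()

def inner_grad(i, x):
    # Stochastic estimate of the gradient of g_i at x.
    return 2.0 * (x - C[i]) + noise * rng.normal(size=d)

def outer_subgrad(s):
    # A subgradient of f(s) = max(0, s).
    return 1.0 if s > 0 else 0.0

# Hyperparameters (assumed, not taken from the paper).
eta, beta, gamma, batch, T = 0.02, 0.1, 0.5, 8, 2000

x = rng.normal(size=d)
u = np.zeros(n)   # moving-average trackers of the inner values g_i(x)
v = np.zeros(d)   # momentum (moving-average) gradient estimate

for t in range(T):
    idx = rng.choice(n, size=batch, replace=False)
    grad_est = np.zeros(d)
    for i in idx:
        # Track the inner value with an exponential moving average.
        u[i] = (1 - gamma) * u[i] + gamma * inner_value(i, x)
        # Chain rule, with the tracked value fed to the outer subgradient.
        grad_est += outer_subgrad(u[i]) * inner_grad(i, x)
    grad_est /= batch
    # Momentum update of the search direction, then a plain descent step.
    v = (1 - beta) * v + beta * grad_est
    x -= eta * v

print("final objective estimate:",
      np.mean([max(0.0, np.sum((x - C[i]) ** 2) - r[i]) for i in range(n)]))

In this sketch, the trackers u reduce the bias from plugging noisy inner estimates into the non-smooth outer function, while v is the momentum direction that slots into standard deep-learning-style training loops.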
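
For the constrained application, a smoothed-hinge-penalty formulation of the kind mentioned above can be sketched as follows (the penalty weight $\beta$, the smoothing parameter $\mu$, and the particular choice of $\psi_\mu$ are assumptions for illustration):

\[
\min_{\mathbf{x}}\; f_0(\mathbf{x}) \;\;\text{s.t.}\;\; c_i(\mathbf{x}) \le 0,\; i=1,\dots,m
\quad\longrightarrow\quad
\min_{\mathbf{x}}\; f_0(\mathbf{x}) + \frac{\beta}{m}\sum_{i=1}^{m} \psi_\mu\big(c_i(\mathbf{x})\big),
\]

where $\psi_\mu$ denotes a smoothed variant of the hinge penalty $\max(0,s)$, e.g. $\psi_\mu(s) = \max_{\alpha\in[0,1]}\{\alpha s - \tfrac{\mu}{2}\alpha^2\}$. The penalty term is itself a coupled compositional finite sum, so the FCCO machinery above applies with outer function $\psi_\mu$ and inner functions $c_i$.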
@article{chen2025_2506.02504,
  title={Stochastic Momentum Methods for Non-smooth Non-Convex Finite-Sum Coupled Compositional Optimization},
  author={Xingyu Chen and Bokun Wang and Ming Yang and Quanqi Hu and Qihang Lin and Tianbao Yang},
  journal={arXiv preprint arXiv:2506.02504},
  year={2025}
}