Difference vs. Quotient: A Novel Algorithm for Dominant Eigenvalue Problem

The computation of the dominant eigenvector of symmetric positive semidefinite matrices is a cornerstone operation in numerous machine learning applications. Most existing methods rely on the constrained Quotient formulation, and they often suffer from poor computational efficiency and dependence on spectral prior knowledge. This paper introduces a novel perspective by reformulating the eigenvalue problem using an unconstrained Difference formulation. This new approach sheds light on classical methods, revealing that the power method can be interpreted as a specific instance of the Difference of Convex Algorithm. Building on this insight, we develop a generalized family of Difference-Type methods, which encompasses the power method as a special case. Within this family, we propose the Split-Merge algorithm, which achieves maximal acceleration without spectral prior knowledge and operates solely through matrix-vector products, making it both efficient and easy to implement. Extensive empirical evaluations on both synthetic and real-world datasets show that the Split-Merge algorithm achieves a significant speedup compared to the basic power method, offering notable advancements in efficiency and practicality for large-scale machine learning problems.
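For context, the classical power method that Split-Merge generalizes computes the dominant eigenpair using only matrix-vector products. The following is an illustrative sketch of that baseline (not the paper's Split-Merge algorithm); the function name, tolerance, and iteration cap are our own choices:

```python
import numpy as np

def power_method(A, num_iters=1000, tol=1e-10):
    """Approximate the dominant eigenpair of a symmetric PSD matrix A,
    using only matrix-vector products (one per iteration)."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)          # random unit starting vector
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v                   # the single matrix-vector product per step
        lam_new = v @ w             # Rayleigh quotient estimate of the eigenvalue
        v = w / np.linalg.norm(w)   # renormalize to keep the iterate on the sphere
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam, v
```

Its convergence rate is governed by the ratio of the second-largest to the largest eigenvalue, which is precisely the regime where accelerated Difference-Type variants pay off.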
@article{liu2025_2501.15131,
  title   = {Split-Merge: A Difference-based Approach for Dominant Eigenvalue Problem},
  author  = {Xiaozhi Liu and Yong Xia},
  journal = {arXiv preprint arXiv:2501.15131},
  year    = {2025}
}