We consider the problem of minimizing a smooth and convex function over the n-dimensional spectrahedron -- the set of real symmetric positive semidefinite matrices with unit trace -- which underlies numerous applications in statistics, machine learning, and other domains. Standard first-order methods often require high-rank matrix computations, which are prohibitive when the dimension is large. The well-known Frank-Wolfe method, on the other hand, requires only efficient rank-one matrix computations, but suffers from slow worst-case convergence, even under conditions that enable linear convergence rates for standard methods. In this work we present the first Frank-Wolfe-based algorithm that applies only efficient rank-one matrix computations and, assuming quadratic growth and strict complementarity conditions, is guaranteed, after a finite number of iterations, to converge linearly, in expectation, and independently of the ambient dimension.
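To illustrate the rank-one structure the abstract alludes to, here is a minimal sketch of the classical Frank-Wolfe method over the spectrahedron (not the paper's new algorithm). Its linear minimization oracle reduces to a single extreme-eigenvector computation, so every update is rank-one. The objective below (Frobenius-norm projection of a symmetric matrix `M` onto the spectrahedron) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

def frank_wolfe_spectrahedron(grad_f, X0, n_iters=200):
    """Classical Frank-Wolfe over {X PSD symmetric : trace(X) = 1}."""
    X = X0.copy()
    for t in range(n_iters):
        G = grad_f(X)
        # LMO: argmin_{V in spectrahedron} <G, V> = v v^T, where v is an
        # eigenvector of the smallest eigenvalue of the (symmetric) gradient.
        _, V = np.linalg.eigh(G)         # eigenvalues in ascending order
        v = V[:, 0]                      # eigenvector of the min eigenvalue
        S = np.outer(v, v)               # rank-one vertex of the spectrahedron
        eta = 2.0 / (t + 2)              # standard open-loop step size
        X = (1 - eta) * X + eta * S      # convex combination stays feasible
    return X

# Example: approximately project a random symmetric matrix M onto the spectrahedron.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
M = (A + A.T) / 2
X = frank_wolfe_spectrahedron(lambda X: X - M, np.eye(n) / n)
print(np.trace(X))  # stays ~1: every iterate is a convex combination of trace-1 matrices
```

Each iterate is a convex combination of rank-one, trace-one matrices, so feasibility is maintained without any projection; the per-iteration cost is dominated by one eigenvector computation.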
@article{garber2025_2503.01441,
  title={A Linearly Convergent Frank-Wolfe-type Method for Smooth Convex Minimization over the Spectrahedron},
  author={Dan Garber},
  journal={arXiv preprint arXiv:2503.01441},
  year={2025}
}