Second-order Conditional Gradient Sliding

Constrained second-order convex optimization algorithms are the method of choice when a high-accuracy solution to a problem is needed, due to their local quadratic convergence. These algorithms require the solution of a constrained quadratic subproblem at every iteration. We present the \emph{Second-Order Conditional Gradient Sliding} (SOCGS) algorithm, which uses a projection-free algorithm to solve the constrained quadratic subproblems inexactly. When the feasible region is a polytope, the algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations. Once in the quadratic regime, the SOCGS algorithm requires $\mathcal{O}(\log(\log 1/\varepsilon))$ first-order and Hessian oracle calls and $\mathcal{O}(\log(1/\varepsilon)\log(\log 1/\varepsilon))$ linear minimization oracle calls to achieve an $\varepsilon$-optimal solution. This algorithm is useful when the feasible region can only be accessed efficiently through a linear optimization oracle, and computing first-order information of the function, although possible, is costly.
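To illustrate the structure described in the abstract, here is a minimal sketch (not the authors' exact SOCGS method) of an inexact projected Newton scheme in which each constrained quadratic subproblem is solved approximately with the Frank-Wolfe (conditional gradient) method, so that the feasible region is accessed only through a linear minimization oracle. The names `lmo`, `grad`, `hess`, and the fixed iteration budgets are illustrative assumptions; SOCGS additionally interleaves an independent first-order step and a step-selection criterion, which this sketch omits.

```python
import numpy as np

def frank_wolfe_quadratic(grad_q, lmo, x0, max_iters=50):
    """Approximately minimize a quadratic model over the feasible region with
    Frank-Wolfe: only a linear minimization oracle is needed, no projections."""
    x = x0
    for t in range(max_iters):
        g = grad_q(x)
        v = lmo(g)                # argmin over the feasible region of <g, v>
        gamma = 2.0 / (t + 2.0)   # standard open-loop step size
        x = x + gamma * (v - x)
    return x

def inexact_projected_newton(grad, hess, lmo, x0, outer_iters=20, inner_iters=50):
    """Illustrative outer loop: build the second-order Taylor model of f at x
    and solve the resulting constrained quadratic subproblem inexactly."""
    x = x0
    for _ in range(outer_iters):
        g, H = grad(x), hess(x)
        # Gradient of the model q(y) = <g, y - x> + 0.5 (y - x)^T H (y - x)
        grad_q = lambda y, g=g, H=H, x=x: g + H @ (y - x)
        x = frank_wolfe_quadratic(grad_q, lmo, x, max_iters=inner_iters)
    return x

# Example usage (hypothetical): minimize a strongly convex quadratic over the
# probability simplex, whose linear minimization oracle returns a vertex.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5)); A = A.T @ A + np.eye(5)
    b = rng.standard_normal(5)
    grad = lambda x: A @ x - b
    hess = lambda x: A
    lmo = lambda g: np.eye(len(g))[np.argmin(g)]   # best simplex vertex
    x_star = inexact_projected_newton(grad, hess, lmo, np.full(5, 0.2))
    print(x_star)
```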