
Second-order Conditional Gradient Sliding

Abstract

Constrained second-order convex optimization algorithms are the method of choice when a high-accuracy solution to a problem is needed, due to their local quadratic convergence. These algorithms require the solution of a constrained quadratic subproblem at every iteration. We present the \emph{Second-Order Conditional Gradient Sliding} (SOCGS) algorithm, which uses a projection-free algorithm to solve the constrained quadratic subproblems inexactly. When the feasible region is a polytope, the algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations. Once in the quadratic regime, the SOCGS algorithm requires $\mathcal{O}(\log(\log 1/\varepsilon))$ first-order and Hessian oracle calls and $\mathcal{O}(\log(1/\varepsilon) \log(\log 1/\varepsilon))$ linear minimization oracle calls to achieve an $\varepsilon$-optimal solution. This algorithm is useful when the feasible region can only be accessed efficiently through a linear optimization oracle, and computing first-order information of the function, although possible, is costly.
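The projection-free subproblem solver referenced above relies on a linear minimization oracle (LMO) rather than projections. As a minimal illustration of that building block (not the SOCGS algorithm itself), the sketch below runs the classic Frank-Wolfe method on a quadratic objective over the probability simplex, where the LMO simply returns the vertex with the smallest gradient coordinate; the objective, step-size rule, and dimension are illustrative choices, not taken from the paper.

```python
def lmo_simplex(grad):
    # Linear minimization oracle over the probability simplex:
    # argmin_{v in simplex} <grad, v> is the vertex e_i with the
    # smallest gradient entry.
    i = min(range(len(grad)), key=lambda j: grad[j])
    v = [0.0] * len(grad)
    v[i] = 1.0
    return v

def frank_wolfe(grad_f, x0, steps=2000):
    # Projection-free (conditional gradient) loop: each iterate is a
    # convex combination of the previous iterate and an LMO vertex,
    # so feasibility is maintained without any projection step.
    x = list(x0)
    for k in range(steps):
        g = grad_f(x)
        v = lmo_simplex(g)
        gamma = 2.0 / (k + 2.0)  # standard open-loop step size
        x = [(1 - gamma) * xi + gamma * vi for xi, vi in zip(x, v)]
    return x

# Illustrative quadratic: f(x) = ||x - c||^2 with c chosen outside
# the simplex, so the constraint is active at the optimum.
c = [0.6, 0.5, 0.4]
grad = lambda x: [2.0 * (xi - ci) for xi, ci in zip(x, c)]
x = frank_wolfe(grad, [1.0, 0.0, 0.0])
```

Each SOCGS iteration would invoke a loop of this kind on the constrained quadratic model of the objective, which is why the LMO call count carries the extra $\log(1/\varepsilon)$ factor relative to the first-order and Hessian oracle counts.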

@article{carderera2025_2002.08907,
  title={Second-order Conditional Gradient Sliding},
  author={Alejandro Carderera and Sebastian Pokutta},
  journal={arXiv preprint arXiv:2002.08907},
  year={2025}
}