Safe Screening for the Generalized Conditional Gradient Method

Yifan Sun, Francis R. Bach
arXiv:2002.09718, 22 February 2020
Abstract

The conditional gradient method (CGM) has been widely used for fast sparse approximation, since it has a low per-iteration computational cost for structured sparse regularizers. We explore the sparsity-acquiring properties of a generalized CGM (gCGM), in which the constraint is replaced by a penalty function based on a gauge; this can be done without significantly increasing the per-iteration computation, and it applies to general notions of sparsity. Without assuming bounded iterates, we show O(1/t) convergence of the function values and gap of gCGM. We couple this with a safe screening rule and show that, at a rate O(1/(tδ²)), the screened support matches the support at the solution, where δ ≥ 0 measures how close the problem is to being degenerate. In our experiments, we show that gCGM with these modified penalties has feature selection properties similar to those of common penalties, but with potentially more stability over the choice of hyperparameter.
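For context, the replacement the abstract describes can be written out explicitly; the symbols f, κ, τ, and λ below are generic notation assumed for illustration, not taken from the paper:

    min f(x)   subject to   κ(x) ≤ τ        (constrained form, classical CGM)
    min f(x) + λ·κ(x)                        (gauge-penalized form, gCGM)

Here κ is a gauge (for example, a sparsity-inducing norm such as the ℓ1 norm) and λ > 0 trades data fit against sparsity. The same linear minimization oracle over the set {x : κ(x) ≤ 1} can still drive the iteration in the penalized form, which is consistent with the abstract's point that the per-iteration computation does not grow significantly.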
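To make the iteration concrete, below is a minimal NumPy sketch of a penalty-form conditional gradient step for the ℓ1 gauge on a least-squares objective. The function name gcgm_l1, the 2/(t+2) step size, and the exact one-dimensional line search for the atom weight are illustrative assumptions following generic gCGM templates, not the paper's specific algorithm, and the screening rule itself is omitted.

import numpy as np

def gcgm_l1(A, b, lam, iters=500):
    # Minimal sketch: penalty-form conditional gradient for
    #   min_x 0.5*||A x - b||^2 + lam * ||x||_1,
    # with the l1 norm standing in for a generic gauge penalty.
    # Illustrative assumptions throughout; this is not the paper's
    # exact update rule, and the safe screening step is omitted.
    x = np.zeros(A.shape[1])
    for t in range(iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        i = int(np.argmax(np.abs(grad)))   # LMO over the l1 unit ball:
        s = np.zeros_like(x)               # a signed coordinate vector
        s[i] = -np.sign(grad[i])
        gamma = 2.0 / (t + 2.0)            # classic O(1/t) step size
        # Penalty form: weight the atom by theta >= 0, found by exactly
        # minimizing the quadratic plus lam*theta along the update ray
        # x_new = (1 - gamma) * x + gamma * theta * s.
        d = A @ s
        r = b - (1.0 - gamma) * (A @ x)
        dd = float(d @ d)
        theta = max(0.0, (float(d @ r) - lam) / (gamma * dd)) if dd > 0 else 0.0
        x = (1.0 - gamma) * x + gamma * theta * s
    return x

# Usage on synthetic data: recover a 3-sparse signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:3] = [3.0, -2.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = gcgm_l1(A, b, lam=2.0)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-3))

A safe screening rule of the kind the abstract studies would sit inside this loop, certifying coordinates that must be zero at the solution and removing them from the problem; the abstract's O(1/(tδ²)) result bounds how quickly that screened support settles to the support at the solution.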
