
Differentiable Greedy Submodular Maximization: Guarantees, Gradient Estimators, and Applications

6 May 2020
Shinsaku Sakaue
arXiv:2005.02578
Abstract

Motivated by, e.g., sensitivity analysis and end-to-end learning, the demand for differentiable optimization algorithms has been significantly increasing. In this paper, we establish a theoretically guaranteed versatile framework that makes the greedy algorithm for monotone submodular function maximization differentiable. We smooth the greedy algorithm via randomization, and prove that it almost recovers the original approximation guarantees in expectation for the cases of cardinality and κ-extensible system constraints. We also show how to efficiently compute unbiased gradient estimators of any expected output-dependent quantities. We demonstrate the usefulness of our framework by instantiating it for various applications.
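The abstract only outlines the construction, so as a rough illustration, here is a minimal JAX sketch of the general recipe it describes for the cardinality-constrained case: replace each greedy argmax with a sample from a softmax over marginal gains (one common way to smooth greedy via randomization; the paper's actual smoothing distribution may differ), and estimate the gradient of the expected objective with the score-function (log-derivative) trick. The toy weighted-coverage objective, all names, and the temperature `eps` are illustrative assumptions, not taken from the paper.

```python
import jax
import jax.numpy as jnp

# Toy monotone submodular objective: weighted coverage.
# C[j, u] = 1 if candidate set j covers ground element u; theta holds
# learnable element weights. Both are illustrative, not from the paper.
C = jnp.array([[1., 1., 0., 0.],
               [0., 1., 1., 0.],
               [0., 0., 1., 1.],
               [1., 0., 0., 1.]])

def f_value(theta, picks):
    # f(S; theta) = sum_u theta[u] * 1[u is covered by some set in S]
    covered = 1.0 - jnp.prod(1.0 - C[jnp.array(picks)], axis=0)
    return jnp.dot(theta, covered)

def marginal_gains(theta, mask):
    # Gain of adding each candidate set j to the current selection `mask`.
    cur = jnp.dot(theta, 1.0 - jnp.prod(1.0 - C * mask[:, None], axis=0))
    def gain(j):
        m2 = mask.at[j].set(1.0)
        cov = 1.0 - jnp.prod(1.0 - C * m2[:, None], axis=0)
        return jnp.dot(theta, cov)
    return jax.vmap(gain)(jnp.arange(C.shape[0])) - cur

def sample_trajectory(theta, key, k, eps):
    # Smoothed greedy: draw each pick from a softmax over marginal gains
    # (temperature eps) instead of taking the deterministic argmax.
    mask, picks = jnp.zeros(C.shape[0]), []
    for _ in range(k):
        logits = jnp.where(mask > 0, -1e9, marginal_gains(theta, mask) / eps)
        key, sub = jax.random.split(key)
        j = int(jax.random.categorical(sub, logits))
        picks.append(j)
        mask = mask.at[j].set(1.0)
    return tuple(picks)

def log_prob(theta, picks, eps):
    # Log-probability of a fixed sampled trajectory, differentiable in theta.
    mask, logp = jnp.zeros(C.shape[0]), 0.0
    for j in picks:
        logits = jnp.where(mask > 0, -1e9, marginal_gains(theta, mask) / eps)
        logp = logp + jax.nn.log_softmax(logits)[j]
        mask = mask.at[j].set(1.0)
    return logp

def grad_estimate(theta, key, k=2, eps=0.1):
    # Score-function (log-derivative) estimator of grad_theta E[f(S; theta)]:
    # grad E[f] = E[grad f(S; theta) + f(S; theta) * grad log p(S; theta)].
    picks = sample_trajectory(theta, key, k, eps)
    gf = jax.grad(f_value)(theta, picks)
    gl = jax.grad(log_prob)(theta, picks, eps)
    return gf + f_value(theta, picks) * gl

theta = jnp.array([1.0, 2.0, 0.5, 1.5])
g = grad_estimate(theta, jax.random.PRNGKey(0))
print(g)  # one unbiased sample; average over many keys in practice
```

In this sketch the estimator is unbiased for the smoothed objective because each trajectory's probability is differentiable in theta, and the temperature eps trades smoothness against fidelity: as eps → 0 the sampler concentrates on the argmax and recovers the deterministic greedy algorithm.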
