Deletion Robust Non-Monotone Submodular Maximization over Matroids

Abstract

Maximizing a submodular function is a fundamental task in machine learning, and in this paper we study the deletion-robust version of the problem under the classic matroid constraint. Here the goal is to extract a small-size summary of the dataset that contains a high-value independent set even after an adversary deletes some elements. We present constant-factor approximation algorithms whose space complexity depends on the rank $k$ of the matroid and the number $d$ of deleted elements. In the centralized setting we present a $(4.597+O(\varepsilon))$-approximation algorithm with summary size $O\!\left(\frac{k+d}{\varepsilon^2}\log \frac{k}{\varepsilon}\right)$, which improves to a $(3.582+O(\varepsilon))$-approximation with summary size $O\!\left(k + \frac{d}{\varepsilon^2}\log \frac{k}{\varepsilon}\right)$ when the objective is monotone. In the streaming setting we provide a $(9.435 + O(\varepsilon))$-approximation algorithm with summary size and memory $O\!\left(k + \frac{d}{\varepsilon^2}\log \frac{k}{\varepsilon}\right)$; the approximation factor improves to $(5.582+O(\varepsilon))$ in the monotone case.
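
To make the setup concrete, the following is a minimal Python sketch of the deletion-robust workflow the abstract describes: build an oversized summary of the dataset, let an adversary delete $d$ elements, then extract an independent set from the survivors. The coverage objective, the uniform (rank-$k$) matroid, and the plain greedy routines are illustrative stand-ins chosen for this sketch; they are not the paper's constant-factor algorithms, and the summary size used here is only indicative.

```python
import random

# Toy illustration of the deletion-robust setting: summarize, delete, extract.
# The greedy routines below are generic baselines, NOT the paper's algorithms.

def coverage(S, sets):
    """Monotone submodular objective: number of items covered by the chosen sets."""
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def greedy(candidates, sets, k, f=coverage):
    """Plain greedy under a rank-k uniform matroid constraint (|S| <= k)."""
    S = []
    remaining = set(candidates)
    while len(S) < k and remaining:
        best = max(remaining, key=lambda e: f(S + [e], sets) - f(S, sets))
        if f(S + [best], sets) - f(S, sets) <= 0:
            break
        S.append(best)
        remaining.remove(best)
    return S

random.seed(0)
n, k, d = 50, 5, 3
ground = list(range(n))
sets = {i: {random.randrange(100) for _ in range(8)} for i in ground}

# Summary phase: keep more than k elements so a good independent set
# survives d deletions (the size k + d here is purely illustrative).
summary = greedy(ground, sets, k + d)

# Adversary deletes d elements; here it removes the most valuable ones.
deleted = set(summary[:d])

# Extraction phase: rebuild a feasible solution from the surviving summary.
solution = greedy([e for e in summary if e not in deleted], sets, k)
print("value after deletions:", coverage(solution, sets))
```

The point of the sketch is the interface: the summary must be small (its size depends only on $k$, $d$, and $\varepsilon$ in the paper's bounds) yet robust, in the sense that whatever $d$ elements the adversary removes, an independent set of near-optimal value can still be recovered from it.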
