In this paper, we propose the first continuous optimization algorithms that achieve a constant factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of the vanilla coordinate ascent, called Coordinate-Ascent+, achieves a ((e − 1)/(2e − 1))-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε) (here, n denotes the dimension of the optimization problem). We then propose Coordinate-Ascent++, which achieves the tight (1 − 1/e − ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n³/ε^2.5) per iteration. However, the computation of each round of Coordinate-Ascent++ can be easily parallelized so that the computational cost per machine scales as O(n/√ε).
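To make the setting concrete, below is a minimal sketch of *vanilla* coordinate ascent for maximizing a monotone function over the box [0, 1]^n subject to a linear budget constraint w·x ≤ b. This is only an illustration of the baseline the paper builds on, not the paper's Coordinate-Ascent+ or Coordinate-Ascent++ (those modify this greedy loop to obtain the stated guarantees); the function names, the fixed step size, and the toy objective in the usage note are assumptions made for the example.

```python
import numpy as np

def coordinate_ascent(f, w, budget, n, step=0.05):
    """Vanilla coordinate ascent over the box [0, 1]^n under the linear
    constraint w @ x <= budget.  Each round greedily increases the single
    coordinate with the best marginal gain per unit of budget.
    Illustrative sketch only -- not the paper's algorithm."""
    x = np.zeros(n)
    spent = 0.0
    while True:
        base = f(x)
        best_i, best_ratio = None, 0.0
        for i in range(n):
            # Skip moves that would leave the box or exceed the budget.
            if x[i] + step > 1.0 or spent + step * w[i] > budget + 1e-12:
                continue
            y = x.copy()
            y[i] += step
            # Marginal gain per unit of budget spent on coordinate i.
            ratio = (f(y) - base) / (step * w[i])
            if ratio > best_ratio:
                best_i, best_ratio = i, ratio
        if best_i is None:  # no feasible improving move remains
            return x
        x[best_i] += step
        spent += step * w[best_i]
```

On a separable concave (hence monotone DR-submodular) toy objective such as f(x) = Σ_i √x_i with unit weights and budget 1, this loop spreads the budget evenly across coordinates, since the coordinate with the smallest current value always offers the largest marginal gain per unit cost.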