Guaranteed Non-convex Optimization: Submodular Maximization over Continuous Domains

Abstract

Submodular continuous functions are a category of (generally) non-convex/non-concave functions with a wide spectrum of applications. We characterize these functions and demonstrate that they can be maximized efficiently with approximation guarantees. Specifically, i) we propose the weak DR property that gives a unified characterization of submodularity for set, lattice and continuous functions; ii) for maximizing monotone submodular continuous functions with an additional diminishing returns property under down-closed convex constraints, we propose a Frank-Wolfe style algorithm with (1-1/e)-approximation, and sub-linear convergence rate; iii) for maximizing general non-monotone submodular continuous functions subject to box constraints, we propose a DoubleGreedy algorithm with 1/3-approximation. Submodular continuous functions naturally find applications in various real-world settings, including influence and revenue maximization with continuous assignments, sensor energy management, multi-resolution data summarization, facility location, etc. Experimental results show that the proposed algorithms efficiently generate superior solutions compared to baseline algorithms.
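To make the Frank-Wolfe style approach concrete, the following is a minimal sketch of a continuous-greedy loop for a monotone DR-submodular objective over a down-closed budget polytope. The specific objective (a separable concave function), the constraint set, and the step rule are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def grad_F(x):
    # Gradient of F(x) = sum_i (1 - exp(-x_i)), a simple monotone
    # DR-submodular function used here only for illustration.
    return np.exp(-x)

def lmo(grad, budget):
    # Linear maximization oracle over the down-closed polytope
    # {v : 0 <= v <= 1, sum(v) <= budget}: since the gradient is
    # nonnegative, fill the highest-gradient coordinates first.
    v = np.zeros_like(grad)
    remaining = budget
    for i in np.argsort(-grad):
        step = min(1.0, remaining)
        v[i] = step
        remaining -= step
        if remaining <= 0:
            break
    return v

def frank_wolfe(n=5, budget=2.0, iters=100):
    # Frank-Wolfe / continuous-greedy loop: take `iters` small steps of
    # size 1/iters toward the linear-oracle solution. The iterate stays
    # feasible because the constraint set is down-closed and convex.
    x = np.zeros(n)
    for _ in range(iters):
        v = lmo(grad_F(x), budget)
        x = x + v / iters
    return x
```

For monotone DR-submodular objectives, loops of this shape achieve the (1-1/e) guarantee stated in the abstract; here the final iterate uses the full budget and its objective value is well within that factor of the optimum.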
