The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization

Abstract

We propose a novel high-dimensional linear regression estimator: the Discrete Dantzig Selector, which minimizes the number of nonzero regression coefficients subject to a budget on the maximal absolute correlation between the features and the residuals. We show that the estimator can be expressed as a solution to a Mixed Integer Linear Optimization (MILO) problem, a computationally tractable framework that delivers provably optimal global solutions. The current state of algorithmics in integer optimization makes our proposal substantially more scalable than the least squares subset selection framework based on integer quadratic optimization, recently proposed in [7], and than the continuous nonconvex quadratic optimization framework of [33]. We propose new discrete first-order methods, which, when paired with state-of-the-art MILO solvers, lead to superior upper bounds for the Discrete Dantzig Selector problem for a given computational budget. We demonstrate that this integrated approach also reaches globally optimal solutions in significantly shorter computation times than off-the-shelf MILO solvers. We show, both theoretically and empirically, that in a wide range of regimes the statistical properties of the Discrete Dantzig Selector are superior to those of popular $\ell_1$-based approaches. Our approach gracefully scales to problem instances with up to p = 10,000 features with provable optimality, making it, to the best of our knowledge, one of the most scalable exact variable selection approaches currently available for sparse linear modeling.
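
For concreteness, a sketch of the estimator and one standard big-M MILO reformulation is given below. The notation is ours and purely illustrative ($X$ the $n \times p$ design matrix, $y$ the response, $\delta \ge 0$ the correlation budget, and $\mathcal{M}$ an assumed bound on the coefficient magnitudes); it need not coincide with the formulation used in the paper:

$$\min_{\beta \in \mathbb{R}^p} \ \|\beta\|_0 \quad \text{subject to} \quad \|X^\top (y - X\beta)\|_\infty \le \delta.$$

Under the assumed bound $|\beta_j| \le \mathcal{M}$ for all $j$, introducing binary indicators $z_j$ gives a MILO with linear objective and constraints, continuous $\beta$, and binary $z$:

$$\min_{\beta,\, z} \ \sum_{j=1}^{p} z_j \quad \text{subject to} \quad \|X^\top (y - X\beta)\|_\infty \le \delta, \qquad -\mathcal{M} z_j \le \beta_j \le \mathcal{M} z_j, \quad z_j \in \{0,1\}, \quad j = 1,\dots,p.$$

Here $z_j = 0$ forces $\beta_j = 0$, so minimizing $\sum_j z_j$ counts the nonzero coefficients, provided $\mathcal{M}$ is a valid bound on some optimal solution.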
