
Structured Sparse Regression via Greedy Hard-Thresholding

Abstract

Several learning applications require solving high-dimensional regression problems where the relevant features belong to a small number of (overlapping) groups. For very large datasets, hard-thresholding methods have proven to be extremely efficient under standard sparsity assumptions, but such methods require NP-hard projections when dealing with overlapping groups. In this paper, we propose a simple and efficient method that avoids NP-hard projections by using greedy approaches. Our proposed methods come with strong theoretical guarantees even in the presence of poorly conditioned data, exhibit an interesting computation-accuracy trade-off, and can be extended to significantly harder problems such as sparse overlapping groups. Experiments on both real and synthetic data validate our claims and demonstrate that the proposed methods are significantly faster than the best known greedy and convex relaxation techniques for learning with structured sparsity.
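To make the high-level idea concrete, below is a minimal illustrative sketch (not the paper's exact algorithm) of iterative hard thresholding for least squares in which the exact projection onto overlapping group-sparse vectors, which is NP-hard in general, is replaced by a greedy group selection step. The function names `greedy_group_projection` and `iht_group_sparse`, the step-size choice, and the toy data are assumptions made for illustration only.

```python
import numpy as np

def greedy_group_projection(z, groups, k):
    """Approximate projection of z onto vectors supported on the union of at
    most k (possibly overlapping) groups, chosen greedily by residual energy.
    This stands in for the exact projection, which is NP-hard when groups overlap."""
    support = np.zeros(z.shape[0], dtype=bool)
    residual = z.copy()
    for _ in range(k):
        # pick the group capturing the most remaining energy
        gains = [np.linalg.norm(residual[g]) for g in groups]
        best = int(np.argmax(gains))
        support[groups[best]] = True
        residual[groups[best]] = 0.0
    out = np.zeros_like(z)
    out[support] = z[support]
    return out

def iht_group_sparse(X, y, groups, k, step=None, n_iters=100):
    """Projected-gradient (hard-thresholding) iterations for least squares,
    using the greedy group projection above at each step."""
    n, d = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # conservative step size
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y)
        w = greedy_group_projection(w - step * grad, groups, k)
    return w

# toy usage: 3 overlapping groups over 6 features; signal lives on groups 0 and 2
rng = np.random.default_rng(0)
groups = [np.array([0, 1, 2]), np.array([2, 3, 4]), np.array([4, 5])]
w_true = np.zeros(6)
w_true[[0, 1, 2, 4, 5]] = rng.normal(size=5)
X = rng.normal(size=(80, 6))
y = X @ w_true + 0.01 * rng.normal(size=80)
w_hat = iht_group_sparse(X, y, groups, k=2)
print(np.round(w_hat, 2))
```

The greedy step costs only a pass over the groups per selected group, which is what gives such methods their scalability compared with convex relaxations or exact combinatorial projections.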
