Fast algorithm for sparse least trimmed squares via trimmed-regularized
reformulation
Shotaro Yagishita
Main: 13 pages, 2 figures, 1 table; bibliography: 2 pages
Abstract
The least trimmed squares (LTS) estimator is a reasonable formulation of robust regression, but it suffers from high computational cost due to the nonconvexity and nonsmoothness of its objective function. The most frequently used FAST-LTS algorithm is particularly slow when a sparsity-inducing penalty, such as the ℓ1 norm, is added. This paper proposes a computationally inexpensive algorithm for the sparse LTS, based on the proximal gradient method applied to a reformulation of the problem. The proposed method is equipped with theoretical convergence guarantees preferable to those of existing methods. Numerical experiments show that our method efficiently yields solutions with small objective values.
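To make the setting concrete, the sparse LTS objective minimizes the sum of the h smallest squared residuals plus an ℓ1 penalty. Below is a minimal illustrative sketch of a proximal-gradient-style iteration for this objective: at each step the h best-fitting observations are selected (the trimming), a gradient step is taken on the least-squares loss over that subset, and the ℓ1 prox (soft-thresholding) is applied. This is a simplified sketch for intuition only, not the paper's algorithm or its reformulation; all function names and parameter choices here are my own.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_lts_prox_grad(X, y, h, lam, step=None, n_iter=500):
    """Illustrative proximal-gradient sketch for a sparse-LTS-type objective:
        minimize  sum of the h smallest squared residuals / 2  scaled as below
                  + lam * ||beta||_1.
    Not the paper's method; a naive trim-then-step iteration."""
    n, p = X.shape
    beta = np.zeros(p)
    if step is None:
        # Conservative step size: 1 / Lipschitz constant of the *full*
        # least-squares gradient, which also bounds any trimmed subset.
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        r = y - X @ beta
        # Trimming: keep the h observations with smallest squared residuals.
        idx = np.argsort(r ** 2)[:h]
        grad = -X[idx].T @ r[idx]  # gradient of 0.5 * ||y_idx - X_idx beta||^2
        # Gradient step on the trimmed loss, then l1 prox (soft-thresholding).
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

Because the step size is taken from the full design matrix, each iteration does not increase the trimmed objective: the gradient-plus-prox step decreases the penalized loss on the currently selected subset, and re-trimming can only decrease the trimmed loss further.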
