We consider the task of robust non-linear estimation in the presence of both bounded noise and outliers. Assuming that the unknown non-linear function belongs to a Reproducing Kernel Hilbert Space (RKHS), our goal is to accurately estimate the coefficients of the kernel regression matrix. Due to the presence of outliers, common techniques such as Kernel Ridge Regression (KRR) or Support Vector Regression (SVR) turn out to be inadequate. Instead, we employ sparse modeling arguments to model and estimate the outliers, adopting a greedy approach. In particular, the proposed robust scheme, the Kernel Greedy Algorithm for Robust Denoising (KGARD), is a modification of the classical Orthogonal Matching Pursuit (OMP) algorithm. In a nutshell, the proposed scheme alternates between a KRR task and an OMP-like selection step. Convergence properties as well as theoretical results concerning the identification of the outliers are provided. Moreover, KGARD is compared against other cutting-edge methods (using toy examples) to demonstrate its performance and verify the aforementioned theoretical results. Finally, the proposed robust estimation framework is applied to the task of image denoising, showing that it can enhance the denoising process significantly when outliers are present.
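To make the alternation between the KRR step and the OMP-like selection step concrete, the following is a minimal Python sketch, not the authors' exact formulation. The function names (`rbf_kernel`, `kgard_sketch`) and parameters (`lam`, `gamma`, `max_outliers`, `tol`) are illustrative assumptions, and the refit uses a plain ridge penalty on the kernel coefficients as a simplification; the precise KGARD objective, stopping rule, and recovery guarantees are those given in the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def kgard_sketch(X, y, lam=1.0, gamma=1.0, max_outliers=10, tol=1e-3):
    """Hypothetical KGARD-style loop: alternate a KRR-type refit with an
    OMP-like greedy selection of outlier positions (illustrative only)."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    support = []                              # indices currently flagged as outliers
    alpha, u = np.zeros(n), np.zeros(n)
    for _ in range(max_outliers + 1):
        # 1) KRR-type step: jointly refit the kernel coefficients and the
        #    outlier values on the current support; only the kernel
        #    coefficients are ridge-penalised (simplifying assumption).
        E = np.eye(n)[:, support]             # selection matrix for the support
        A = np.hstack([K, E])
        D = np.diag(np.r_[np.ones(n), np.zeros(len(support))])
        theta = np.linalg.solve(A.T @ A + lam * D, A.T @ y)
        alpha, u = theta[:n], np.zeros(n)
        u[support] = theta[n:]
        # 2) OMP-like step: flag the sample with the largest residual
        #    magnitude as a new outlier, then repeat.
        r = y - K @ alpha - u
        candidates = [i for i in range(n) if i not in support]
        if np.linalg.norm(r) <= tol * np.sqrt(n) or not candidates:
            break
        support.append(max(candidates, key=lambda i: abs(r[i])))
    return alpha, u                           # kernel coefficients and sparse outlier estimate
```

In this sketch the greedy step mirrors OMP in that the support of the sparse outlier vector grows by one index per iteration, chosen from the current residual, while the KRR-type refit plays the role of the least-squares update over the enlarged support.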