Fast rates for noisy clustering

Abstract
The effect of errors in variables in empirical minimization is investigated. Given a loss $\ell$ and a set of decision rules $\mathcal{G}$, we prove a general upper bound for empirical minimization based on a deconvolution kernel and a noisy sample $Z_i = X_i + \epsilon_i$, $i = 1, \ldots, n$. We apply this general upper bound to derive the rate of convergence of the expected excess risk in noisy clustering. A recent bound from \citet{levrard} proves that this rate is $\mathcal{O}(1/n)$ in the direct case, under Pollard's regularity assumptions. Here the effect of noisy measurements gives a rate of the form $n^{-\frac{\gamma}{\gamma + 2\beta}}$, where $\gamma$ is the H\"older regularity of the density of $X$ and $\beta$ is the degree of ill-posedness.
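The deconvolution-kernel estimation step behind the bound can be sketched numerically. The snippet below is a minimal illustration only, not the paper's exact construction: it assumes a sinc kernel (Fourier transform equal to 1 on $[-1,1]$) and Gaussian measurement noise with known standard deviation `sigma`, and estimates the density of $X$ from the noisy sample $Z_i = X_i + \epsilon_i$.

```python
import numpy as np

def deconvolution_kde(x_grid, Z, h, sigma, n_t=401):
    """Deconvolution kernel density estimate from noisy data Z_i = X_i + eps_i.

    Illustrative assumptions: sinc kernel (Fourier transform 1 on [-1, 1])
    and Gaussian noise eps ~ N(0, sigma^2) with characteristic function
    phi_eps(s) = exp(-sigma^2 s^2 / 2).
    """
    t = np.linspace(-1.0, 1.0, n_t)                       # Fourier variable
    dt = t[1] - t[0]
    u = (np.asarray(x_grid)[:, None] - Z[None, :]) / h    # scaled arguments (m, n)
    # Deconvolution kernel K_eps(u) = (1/2pi) * int_{-1}^{1} cos(t u) / phi_eps(t/h) dt;
    # the integrand is even, so the cosine captures the real integral.
    inv_phi = np.exp(sigma**2 * t**2 / (2.0 * h**2))      # 1 / phi_eps(t/h)
    K = (np.cos(u[..., None] * t) * inv_phi).sum(axis=-1) * dt / (2.0 * np.pi)
    return K.sum(axis=1) / (len(Z) * h)                   # (1/nh) sum_i K_eps((x-Z_i)/h)

# Toy usage: X ~ N(0, 1) observed with additive noise of std 0.3.
rng = np.random.default_rng(0)
X = rng.standard_normal(2000)
Z = X + 0.3 * rng.standard_normal(2000)
f_hat = deconvolution_kde(np.array([0.0, 4.0]), Z, h=0.3, sigma=0.3)
```

Note the factor `1 / phi_eps(t / h)`, which amplifies high frequencies: the faster the noise characteristic function decays (larger degree of ill-posedness $\beta$), the harder the deconvolution, which is what slows the rate from $1/n$ to $n^{-\gamma/(\gamma + 2\beta)}$.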