
Extrapolation Towards Imaginary 0-Nearest Neighbour and Its Improved Convergence Rate

Neural Information Processing Systems (NeurIPS), 2020
Abstract

$k$-nearest neighbour ($k$-NN) is one of the simplest and most widely-used methods for supervised classification; it predicts a query's label by taking a weighted ratio of the observed labels of the $k$ objects nearest to the query. The weights and the parameter $k \in \mathbb{N}$ regulate its bias-variance trade-off, and the trade-off implicitly affects the convergence rate of the excess risk of the $k$-NN classifier; several existing studies considered selecting optimal $k$ and weights to obtain a faster convergence rate. Whereas $k$-NN with non-negative weights has been developed widely, it has also been proved that negative weights are essential for eradicating the bias terms and attaining the optimal convergence rate. In this paper, we propose a novel multiscale $k$-NN (MS-$k$-NN) that extrapolates unweighted $k$-NN estimators from several $k \ge 1$ values to $k = 0$, thus giving an imaginary 0-NN estimator. Our method implicitly computes optimal real-valued weights that are adaptive to the query and its neighbour points. We theoretically prove that MS-$k$-NN attains an improved rate, which coincides with the existing optimal rate under some conditions.
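The extrapolation idea described in the abstract can be illustrated with a short sketch: compute unweighted $k$-NN estimates at several values of $k$, then regress them against a scale variable and read off the intercept as the imaginary 0-NN estimate. This is an illustrative sketch only, not the paper's exact estimator; the function name `ms_knn_predict`, the choice of $k$ values, and the linear basis in $(k/n)^{2/d}$ are assumptions made for the example.

```python
import numpy as np

def ms_knn_predict(X_train, y_train, query, ks=(5, 10, 20, 40)):
    """Illustrative multiscale k-NN sketch: extrapolate unweighted
    k-NN estimates at several k down to an imaginary k = 0.
    (The paper's estimator uses a specific polynomial basis; here we
    use a simple linear fit in (k/n)^(2/d) as an assumed stand-in.)"""
    n, d = X_train.shape
    # Sort training points by distance to the query.
    dists = np.linalg.norm(X_train - query, axis=1)
    order = np.argsort(dists)
    # Unweighted k-NN estimates of P(Y = 1 | X = query) for each k.
    est = np.array([y_train[order[:k]].mean() for k in ks])
    # Regress the estimates on (k/n)^(2/d); the intercept is the
    # extrapolated "0-NN" estimate at k = 0.
    t = (np.array(ks) / n) ** (2.0 / d)
    A = np.vstack([np.ones_like(t), t]).T
    coef, *_ = np.linalg.lstsq(A, est, rcond=None)
    return coef[0]
```

The intercept of the fit plays the role of the imaginary 0-NN estimator: it corresponds to shrinking the neighbourhood radius to zero while still averaging over many points, which is how the extrapolation removes the bias terms that finite-$k$ estimators carry.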
