
Adaptive Nearest Neighbor: A General Framework for Distance Metric Learning

Abstract

The K-NN classifier is one of the most widely used classification algorithms, and its performance depends crucially on the distance metric. If we regard the distance metric as a parameter of K-NN, learning an appropriate metric amounts to minimizing the empirical risk of the K-NN classifier. In this paper, we design a new continuous decision function for the K-NN classification rule, from which a continuous empirical risk function of K-NN can be constructed. Minimizing this continuous empirical risk yields a novel distance metric learning algorithm, named adaptive nearest neighbor (ANN). We prove that existing algorithms such as large margin nearest neighbor (LMNN), neighbourhood components analysis (NCA), and the pairwise-constraint methods are special cases of the proposed ANN obtained by setting its parameters to particular values. Compared with LMNN, NCA, and the pairwise-constraint methods, our method therefore searches over a broader space, which may contain better solutions. Finally, extensive experiments on various data sets demonstrate the effectiveness and efficiency of the proposed method.
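As a rough illustration of the idea, the sketch below replaces the hard K-NN vote with a softmax over negative Mahalanobis distances, an NCA-style relaxation of the kind the abstract says ANN generalizes, yielding a differentiable empirical risk that a gradient method could minimize over the metric. This is not the paper's exact decision function; the name `soft_knn_risk` and the `temperature` parameter are illustrative assumptions.

```python
import numpy as np

def soft_knn_risk(L, X, y, temperature=1.0):
    """Differentiable surrogate for the K-NN empirical risk (illustrative).

    L : (d, d) linear map defining the Mahalanobis metric ||Lx - Lx'||^2.
    X : (n, d) training points; y : (n,) integer labels.
    Returns the average soft misclassification probability of each point
    by its (softly weighted) neighbors, excluding the point itself.
    """
    Z = X @ L.T                                   # map points into the learned metric space
    sq_dists = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(sq_dists, np.inf)            # a point never votes for itself
    logits = -sq_dists / temperature              # closer neighbors get larger weight
    logits -= logits.max(axis=1, keepdims=True)   # stabilize the softmax numerically
    W = np.exp(logits)
    W /= W.sum(axis=1, keepdims=True)             # soft neighbor weights, rows sum to 1
    same_label = (y[:, None] == y[None, :]).astype(float)
    p_correct = (W * same_label).sum(axis=1)      # soft "vote" mass on the true class
    return 1.0 - p_correct.mean()                 # continuous empirical risk
```

In this sketch, sending `temperature` toward zero concentrates the weights on the single nearest neighbor, recovering a hard 1-NN rule, which is one way a parameterized relaxation can interpolate between existing methods and the hard classifier.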
