Trust-and-Verify Error Bounds for K-Nearest Neighbor Classifiers
Abstract
We show that k-nearest neighbor classifiers, in spite of their famously fractured decision boundaries, have exponential error bounds with nearly Gaussian-style bound ranges, similar to error bounds based on VC dimension for other types of classifiers that have simpler decision boundaries. Specifically, we present an exponential PAC error bound for k-nearest neighbor classifiers whose error bound range depends on the number n of in-sample examples and the bound failure probability δ.
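As a point of contrast only (not the paper's method, which bounds error using in-sample examples), the familiar exponential holdout-style bound can be sketched: Hoeffding's inequality gives, with probability at least 1 - δ, an error bound range of sqrt(ln(2/δ) / (2m)) around the empirical error on m held-out examples. A minimal, self-contained illustration with a toy k-NN classifier (all function names and data here are illustrative assumptions):

```python
import math
import random

def knn_predict(train, query, k=3):
    # Majority vote among the k nearest training points (squared Euclidean distance).
    # Each training item is a (point, label) pair with labels in {0, 1}.
    neighbors = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))[:k]
    votes = sum(label for _, label in neighbors)
    return 1 if 2 * votes > k else 0

def hoeffding_range(m, delta):
    # Two-sided Hoeffding bound range for the mean of m {0,1} losses:
    # with probability >= 1 - delta, |empirical error - true error| <= this value.
    return math.sqrt(math.log(2.0 / delta) / (2.0 * m))

random.seed(0)
# Toy data in [0,1]^2: label is 1 iff x0 + x1 > 1.
points = [(random.random(), random.random()) for _ in range(300)]
data = [(x, int(x[0] + x[1] > 1.0)) for x in points]
train, holdout = data[:200], data[200:]

errors = sum(knn_predict(train, x, k=3) != y for x, y in holdout)
emp_err = errors / len(holdout)
print(f"holdout error = {emp_err:.3f}, "
      f"bound range (delta=0.05) = {hoeffding_range(len(holdout), 0.05):.3f}")
```

The holdout approach pays for its simplicity by setting aside examples; the appeal of an in-sample bound, as in this paper, is that no such sacrifice is needed.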
