ResearchTrend.AI

Revisiting Nearest Neighbor for Tabular Data: A Deep Tabular Baseline Two Decades Later

3 July 2024
Han-Jia Ye
Huai-Hong Yin
De-Chuan Zhan
Wei-Lun Chao
Abstract

The widespread enthusiasm for deep learning has recently expanded into the domain of tabular data. Recognizing that advances in deep tabular methods are often inspired by classical methods, e.g., the integration of nearest neighbors into neural networks, we investigate whether these classical methods can be revitalized with modern techniques. We revisit a differentiable version of K-nearest neighbors (KNN) -- Neighbourhood Components Analysis (NCA) -- originally designed to learn a linear projection that captures semantic similarities between instances, and gradually add modern deep learning techniques on top. Surprisingly, our implementation of NCA using SGD and without dimensionality reduction already achieves decent performance on tabular data, in contrast to results obtained with existing toolboxes like scikit-learn. Further equipping NCA with deep representations and additional training stochasticity significantly enhances its capability: it is on par with the leading tree-based method CatBoost and outperforms existing deep tabular models in both classification and regression tasks on 300 datasets. We conclude by analyzing the factors behind these improvements, including loss functions, prediction strategies, and deep architectures. The code is available at this https URL.
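The core idea the abstract refers to -- NCA as a differentiable KNN -- can be sketched in a few lines: project instances with a learnable linear map, turn negative squared distances into a softmax over candidate neighbors, and score how much probability mass each instance places on same-class neighbors. The function and variable names below are our own illustration, not taken from the paper's codebase:

```python
import numpy as np

def nca_soft_assignments(X, A):
    """Stochastic neighbor distribution of NCA: a softmax over negative
    squared distances in the projected space, excluding self-matches."""
    Z = X @ A.T                                   # linear projection of each instance
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                  # an instance never selects itself
    logits = -d2
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)

def nca_objective(X, y, A):
    """Expected number of correctly classified instances under soft KNN:
    sum_i p_i, where p_i is the probability mass on same-class neighbors.
    Because this is smooth in A, it can be maximized with SGD."""
    P = nca_soft_assignments(X, A)
    same_class = (y[:, None] == y[None, :])
    return (P * same_class).sum()
```

On a toy dataset with two well-separated classes and an identity projection, the objective is close to the number of instances, since almost all neighbor mass falls on same-class points; training would then ascend this objective with respect to `A` (or, as the paper explores, a deep network in place of the linear map).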

View on arXiv
@article{ye2025_2407.03257,
  title={Revisiting Nearest Neighbor for Tabular Data: A Deep Tabular Baseline Two Decades Later},
  author={Han-Jia Ye and Huai-Hong Yin and De-Chuan Zhan and Wei-Lun Chao},
  journal={arXiv preprint arXiv:2407.03257},
  year={2025}
}