Reduced Effectiveness of Kolmogorov-Arnold Networks on Functions with Noise

20 July 2024
Haoran Shen
Chen Zeng
Jiahui Wang
Qiao Wang
Abstract

It has been observed that even a small amount of noise introduced into the dataset can significantly degrade the performance of KANs. In this brief note, we aim to quantitatively evaluate this degradation when noise is added to the dataset. We propose an oversampling technique combined with denoising to alleviate the impact of noise. Specifically, we employ kernel filtering based on diffusion maps to pre-filter the noisy data before training the KAN. Our experiments show that, when i.i.d. noise with any fixed SNR is added and the amount of training data is increased by a factor of $r$, the test loss (RMSE) of KANs exhibits a trend of $\text{test-loss} \sim \mathcal{O}(r^{-\frac{1}{2}})$ as $r \to +\infty$. We conclude that applying both oversampling and filtering strategies can reduce the detrimental effects of noise. Nevertheless, determining the optimal variance for the kernel filtering process is challenging, and enlarging the training set substantially increases the associated costs, because it must be expanded severalfold relative to the initial clean data. As a result, the noise present in the data ultimately diminishes the effectiveness of Kolmogorov-Arnold networks.
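The abstract does not spell out the filtering step, so the following is a minimal sketch of one plausible reading: build a Gaussian (heat) kernel over the inputs, row-normalize it into a Markov matrix as in diffusion maps, and take one averaging step to denoise the targets before training. The function name `diffusion_kernel_filter`, the bandwidth `eps`, and the single-step averaging are all assumptions, not the authors' implementation.

```python
import numpy as np

def diffusion_kernel_filter(x, y, eps):
    """Denoise targets y via one step of diffusion-map kernel averaging.

    x : (n, d) array of inputs; y : (n,) array of noisy targets.
    eps is the kernel bandwidth (a hypothetical parameter here); the
    abstract notes that choosing this variance is itself difficult.
    """
    # Pairwise squared distances between input points.
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    # Gaussian kernel, the building block of diffusion maps.
    W = np.exp(-sq_dists / eps)
    # Row-normalize into a Markov transition matrix P.
    P = W / W.sum(axis=1, keepdims=True)
    # Filtered targets: local weighted averages suppress i.i.d. noise.
    return P @ y

# Example: oversample a clean 1-D function by a factor r, add noise at a
# fixed SNR, filter, then (hypothetically) train a KAN on (x, y_filtered).
rng = np.random.default_rng(0)
r = 8                                       # oversampling factor
x = rng.uniform(0.0, 1.0, size=(200 * r, 1))
y_clean = np.sin(2 * np.pi * x[:, 0])
y_noisy = y_clean + 0.1 * rng.standard_normal(x.shape[0])
y_filtered = diffusion_kernel_filter(x, y_noisy, eps=1e-3)
print("RMSE before filtering:", np.sqrt(np.mean((y_noisy - y_clean) ** 2)))
print("RMSE after filtering: ", np.sqrt(np.mean((y_filtered - y_clean) ** 2)))
```

Under this reading, the claimed $\mathcal{O}(r^{-\frac{1}{2}})$ trend could be probed by sweeping `r` and recording the resulting test RMSE, at the cost the abstract flags: the training set grows severalfold relative to the clean data.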
