Approximation Bounds for Transformer Networks with Application to Regression

16 April 2025
Yuling Jiao
Yanming Lai
Defeng Sun
Yang Wang
Bokai Yan
Abstract

We explore the approximation capabilities of Transformer networks for Hölder and Sobolev functions, and apply these results to address nonparametric regression estimation with dependent observations. First, we establish novel upper bounds for standard Transformer networks approximating sequence-to-sequence mappings whose component functions are Hölder continuous with smoothness index $\gamma \in (0,1]$. To achieve an approximation error $\varepsilon$ under the $L^p$-norm for $p \in [1, \infty]$, it suffices to use a fixed-depth Transformer network whose total number of parameters scales as $\varepsilon^{-d_x n / \gamma}$. This result not only extends existing findings to include the case $p = \infty$, but also matches the best known upper bounds on the number of parameters previously obtained for fixed-depth FNNs and RNNs. Similar bounds are also derived for Sobolev functions. Second, we derive explicit convergence rates for the nonparametric regression problem under various $\beta$-mixing data assumptions, which allow the dependence between observations to weaken over time. Our bounds on the sample complexity impose no constraints on weight magnitudes. Lastly, we propose a novel proof strategy to establish approximation bounds, inspired by the Kolmogorov-Arnold representation theorem. We show that if the self-attention layer in a Transformer can perform column averaging, the network can approximate sequence-to-sequence Hölder functions, offering new insights into the interpretability of self-attention mechanisms.
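
The abstract's observation that self-attention can perform column averaging is easy to check numerically: if the query and key projections produce constant attention scores, the softmax assigns uniform weight $1/n$ to every token, and each output position becomes the average of the value vectors over the sequence. The sketch below is a minimal NumPy illustration of that special case, not the authors' construction; the function name self_attention and the choice of zero query/key weights with an identity value projection are assumptions made for clarity. In the paper the input is a $d_x \times n$ matrix whose columns are tokens, so this corresponds to column averaging; here tokens are stored as rows for readability.

import numpy as np

def self_attention(X, W_q, W_k, W_v):
    # Single-head self-attention on X with tokens as rows (shape (n, d_x)):
    # softmax(Q K^T / sqrt(d)) V.
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d_x = 5, 3
X = rng.normal(size=(n, d_x))

# Zero query/key weights => all attention scores are 0 => uniform softmax
# weights 1/n => every output row equals the average of the rows of V.
W_q = np.zeros((d_x, d_x))
W_k = np.zeros((d_x, d_x))
W_v = np.eye(d_x)  # identity value projection, so V = X

out = self_attention(X, W_q, W_k, W_v)
assert np.allclose(out, X.mean(axis=0))  # each row is the token average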

View on arXiv: https://arxiv.org/abs/2504.12175
@article{jiao2025_2504.12175,
  title={Approximation Bounds for Transformer Networks with Application to Regression},
  author={Yuling Jiao and Yanming Lai and Defeng Sun and Yang Wang and Bokai Yan},
  journal={arXiv preprint arXiv:2504.12175},
  year={2025}
}