  3. 2005.06967
11
46

Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems

14 May 2020
Allen G. Hart, J. Hook, Jonathan H. P. Dawes
Abstract

Echo State Networks (ESNs) are a class of single-layer recurrent neural networks with randomly generated internal weights and a single layer of tuneable outer weights, which are usually trained by regularised linear least squares regression. Remarkably, ESNs still enjoy the universal approximation property despite the training procedure being entirely linear. In this paper, we prove that an ESN trained on a sequence of observations from an ergodic dynamical system (with invariant measure $\mu$) using Tikhonov least squares regression against a set of targets will approximate the target function in the $L^2(\mu)$ norm. In the special case that the targets are future observations, the ESN is learning the next-step map, which allows time series forecasting. We demonstrate the theory numerically by training an ESN using Tikhonov least squares on a sequence of scalar observations of the Lorenz system.
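As a concrete illustration of the training procedure described in the abstract, the following minimal sketch (not the authors' code) trains an ESN by Tikhonov (ridge) least squares on scalar observations of the Lorenz system, with the targets taken to be the next observations. The reservoir size, spectral radius, input scaling, regularisation strength and Euler integration step are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of ESN training by Tikhonov (ridge) least squares.
# All hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- scalar observations x(t) of the Lorenz system (simple Euler steps) ---
def lorenz_series(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    state = np.array([1.0, 1.0, 1.0])
    xs = np.empty(n_steps)
    for i in range(n_steps):
        x, y, z = state
        state = state + dt * np.array([sigma * (y - x),
                                       x * (rho - z) - y,
                                       x * y - beta * z])
        xs[i] = state[0]                    # observe only the first coordinate
    return xs

obs = lorenz_series(6000)
inputs, targets = obs[:-1], obs[1:]         # targets are future observations

# --- ESN with fixed, randomly generated internal weights ---
n_res = 300
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1
w_in = 0.1 * rng.normal(size=n_res)

states = np.zeros((len(inputs), n_res))
r = np.zeros(n_res)
for t, u in enumerate(inputs):
    r = np.tanh(W @ r + w_in * u)           # reservoir update
    states[t] = r

# --- Tikhonov (ridge) least squares fits only the outer weights ---
lam = 1e-6
A = states.T @ states + lam * np.eye(n_res)
w_out = np.linalg.solve(A, states.T @ targets)

# one-step-ahead predictions on the training sequence
pred = states @ w_out
print("training RMSE:", np.sqrt(np.mean((pred - targets) ** 2)))
```

The outer weights come from the closed-form ridge solution $W_{\mathrm{out}} = (X^\top X + \lambda I)^{-1} X^\top y$, where $X$ stacks the reservoir states; the internal weights remain fixed and random, matching the entirely linear training procedure described in the abstract.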
