
On the dimension of pullback attractors in recurrent neural networks

Main: 19 pages, 1 figure, 1 table
Bibliography: 4 pages
Abstract

Recurrent neural networks trained via the reservoir computing paradigm have demonstrated remarkable success in learning and reconstructing attractors from chaotic systems, often replicating quantities such as Lyapunov exponents and fractal dimensions. It has recently been conjectured that this is because the reservoir computer embeds the dynamics of the chaotic system in its state space before learning. This conjecture has been established for reservoir computers with linear activation functions and remains open for more general reservoir systems. In this work, we employ a non-autonomous dynamical systems approach to establish an upper bound for the box-counting dimension of the pullback attractor, a subset of the reservoir state space that is approximated during the training and prediction phases. We prove that the box-counting dimension of the pullback attractor is bounded above by the box-counting dimension of the space of input sequences with respect to the product topology. In particular, for input sequences originating from an N_in-dimensional smooth dynamical system or their generic continuously differentiable observations, the box-counting dimension of the pullback attractor is bounded above by N_in. These results highlight that, while a reservoir computer may possess a very high-dimensional state space, it exhibits effective low-dimensional dynamics. Our findings also partly explain why reservoir computers are successful in tasks such as attractor reconstruction and the computation of dynamic invariants like Lyapunov exponents and fractal dimensions.
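
To make the setting concrete, the following is a minimal illustrative sketch (not the paper's construction) of a driven reservoir of the standard echo-state type, with a hypothetical update rule x_{k+1} = tanh(A x_k + C u_k + b) and arbitrary parameter choices (N_res, spectral radius 0.9, a Lorenz input). It shows the phenomenon the abstract refers to: the reservoir state is high-dimensional, yet after a washout period it is determined by the low-dimensional input sequence, which is the intuition behind the pullback attractor and its dimension bound.

import numpy as np

# Illustrative echo-state-network sketch; names A, C, b, N_res and all scalings
# are assumptions for demonstration, not quantities from the paper.

rng = np.random.default_rng(0)
N_res, N_in = 500, 3                      # reservoir dimension vs. input dimension

A = rng.normal(size=(N_res, N_res))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))   # scale spectral radius below 1
C = rng.normal(size=(N_res, N_in))
b = 0.1 * rng.normal(size=N_res)

def lorenz_sequence(n, dt=0.01):
    """Generate a Lorenz trajectory by Euler integration, used as the input u_k."""
    u = np.empty((n, 3))
    x, y, z = 1.0, 1.0, 1.0
    for k in range(n):
        dx, dy, dz = 10 * (y - x), x * (28 - z) - y, x * y - (8 / 3) * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        u[k] = (x, y, z)
    return u

def drive(x0, inputs):
    """Run the reservoir from initial state x0 under the given input sequence."""
    x = x0.copy()
    for u in inputs:
        x = np.tanh(A @ x + C @ u + b)
    return x

u_seq = lorenz_sequence(2000)
x_a = drive(rng.normal(size=N_res), u_seq)   # two different initial reservoir states,
x_b = drive(rng.normal(size=N_res), u_seq)   # same input sequence

# Under the echo state property the reservoir forgets its own initial condition,
# so both runs end near the same input-determined state: long-run reservoir states
# are parametrised by the (3-dimensional) input trajectory, not by the full
# N_res-dimensional state space.
print("distance between the two driven states:", np.linalg.norm(x_a - x_b))

In this sketch the two driven states agree up to numerical precision, which is the synchronization-to-the-input picture underlying the abstract's claim of effective low-dimensional dynamics.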
