Learning Beyond Experience: Generalizing to Unseen State Space with Reservoir Computing

Abstract

Machine learning techniques offer an effective approach to modeling dynamical systems solely from observed data. However, without explicit structural priors -- built-in assumptions about the underlying dynamics -- these techniques typically struggle to generalize to aspects of the dynamics that are poorly represented in the training data. Here, we demonstrate that reservoir computing -- a simple, efficient, and versatile machine learning framework often used for data-driven modeling of dynamical systems -- can generalize to unexplored regions of state space without explicit structural priors. First, we describe a multiple-trajectory training scheme for reservoir computers (RCs) that supports training across a collection of disjoint time series, enabling effective use of available training data. Then, applying this training scheme to multistable dynamical systems, we show that RCs trained on trajectories from a single basin of attraction can achieve out-of-domain generalization by capturing system behavior in entirely unobserved basins.
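The abstract's multiple-trajectory training scheme can be sketched in a few lines. The idea, as described above, is to train a single readout across a collection of disjoint time series: each trajectory is run through the reservoir from a fresh state (with its own washout), and the ridge-regression normal equations are accumulated across all trajectories before solving once. The sketch below is a minimal echo-state-network illustration of that idea; all parameter values, the toy damped-oscillator data, and function names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative reservoir (echo state network) setup; sizes and
# hyperparameters are assumptions, not the paper's values.
N_RES, N_IN = 200, 2          # reservoir size, input/output dimension
SPECTRAL_RADIUS = 0.9
RIDGE = 1e-6

W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.normal(0.0, 1.0, (N_RES, N_RES))
W *= SPECTRAL_RADIUS / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs, washout=50):
    """Drive the reservoir with one trajectory from a fresh (zero)
    state; return the post-washout reservoir states."""
    r = np.zeros(N_RES)
    states = []
    for u in inputs:
        r = np.tanh(W @ r + W_in @ u)
        states.append(r.copy())
    return np.array(states)[washout:]

def train_multi_trajectory(trajectories, washout=50):
    """Multiple-trajectory training: accumulate the ridge-regression
    normal equations over all disjoint trajectories, each with its own
    reservoir reset and washout, then solve once for the readout."""
    RtR = np.zeros((N_RES, N_RES))
    RtY = np.zeros((N_RES, N_IN))
    for traj in trajectories:
        states = run_reservoir(traj[:-1], washout)
        targets = traj[1 + washout:]     # one-step-ahead targets
        RtR += states.T @ states
        RtY += states.T @ targets
    return np.linalg.solve(RtR + RIDGE * np.eye(N_RES), RtY)

# Toy data: disjoint trajectories of a damped harmonic oscillator.
def make_traj(x0, T=400, dt=0.05):
    xs = [np.array(x0, dtype=float)]
    for _ in range(T):
        x, v = xs[-1]
        xs.append(xs[-1] + dt * np.array([v, -x - 0.1 * v]))
    return np.array(xs)

trajs = [make_traj([1.0, 0.0]), make_traj([0.0, 1.0])]
W_out = train_multi_trajectory(trajs)

# One-step prediction error on a held-out trajectory.
test = make_traj([0.5, 0.5])
pred = run_reservoir(test[:-1]) @ W_out
err = np.sqrt(np.mean((pred - test[51:]) ** 2))
print(f"one-step RMSE: {err:.4f}")
```

Accumulating `states.T @ states` and `states.T @ targets` per trajectory, rather than concatenating reservoir states across series, keeps memory bounded and avoids stitching reservoir dynamics across the gaps between disjoint time series.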

@article{norton2025_2506.05292,
  title={Learning Beyond Experience: Generalizing to Unseen State Space with Reservoir Computing},
  author={Declan A. Norton and Yuanzhao Zhang and Michelle Girvan},
  journal={arXiv preprint arXiv:2506.05292},
  year={2025}
}