Attractor-merging Crises and Intermittency in Reservoir Computing

Abstract
Reservoir computing can embed attractors into random neural networks (RNNs), generating a "mirror" of a target attractor because of its inherent symmetrical constraints. In these RNNs, we report that an attractor-merging crisis accompanied by intermittency emerges simply by adjusting a global parameter. We further reveal its underlying mechanism through a detailed analysis of the phase-space structure and demonstrate that this bifurcation scenario is intrinsic to a general class of RNNs, independent of the training data.
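To illustrate the kind of setup the abstract describes, below is a minimal echo state network sketch: a random RNN is driven by a target trajectory, a linear readout is fit by ridge regression, and the network is then run in closed loop so the embedded attractor (and, by the odd symmetry of a bias-free tanh reservoir, its mirror image) can appear. This is not the authors' implementation; the Lorenz target, reservoir size, the global spectral-radius scale rho_scale (the kind of single global parameter one might sweep), and the regularization strength are all illustrative assumptions.

# Minimal echo state network (ESN) sketch. Assumptions: Lorenz target,
# N = 300, rho_scale = 0.95, ridge regularization 1e-6, no input bias.
import numpy as np

rng = np.random.default_rng(0)

# Target trajectory: Lorenz system integrated with forward Euler (illustrative).
def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x = np.array([1.0, 1.0, 1.0])
    traj = np.empty((n_steps, 3))
    for t in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        traj[t] = x
    return traj / 40.0  # crude rescaling into a tanh-friendly range

data = lorenz_trajectory(6000)

# Reservoir: random recurrent and input weights, scaled by one global parameter.
N = 300
rho_scale = 0.95  # global spectral-radius scale; sweeping it changes the dynamics
W = rng.normal(0.0, 1.0, (N, N)) / np.sqrt(N)
W *= rho_scale / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (N, 3))

# Teacher-forced run to collect reservoir states.
states = np.zeros((len(data), N))
r = np.zeros(N)
for t in range(len(data) - 1):
    r = np.tanh(W @ r + W_in @ data[t])  # no bias: keeps the (r, y) -> (-r, -y) symmetry
    states[t + 1] = r

# Ridge-regression readout mapping each state to the next target point.
washout, lam = 500, 1e-6
X, Y = states[washout:-1], data[washout + 1:]
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ Y).T

# Closed-loop (autonomous) run: the readout output is fed back as input.
r = states[-1].copy()
outputs = []
for _ in range(3000):
    y = W_out @ r
    r = np.tanh(W @ r + W_in @ y)
    outputs.append(y)
outputs = np.array(outputs)
print("closed-loop output range:", outputs.min(axis=0), outputs.max(axis=0))

Whether the closed-loop trajectory stays on the embedded attractor, visits its mirror copy, or switches intermittently between them depends on the hyperparameters, in particular the global scaling factor.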
BibTeX
@article{kabayama2025_2504.12695,
  title   = {Attractor-merging Crises and Intermittency in Reservoir Computing},
  author  = {Tempei Kabayama and Motomasa Komuro and Yasuo Kuniyoshi and Kazuyuki Aihara and Kohei Nakajima},
  journal = {arXiv preprint arXiv:2504.12695},
  year    = {2025}
}