Published in

American Institute of Physics, Chaos: An Interdisciplinary Journal of Nonlinear Science, 32(11), 113107, 2022

DOI: 10.1063/5.0116784

Learning unseen coexisting attractors

Journal article published in 2022 by Daniel J. Gauthier, Ingo Fischer, and André Röhm
This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving restricted
Data provided by SHERPA/RoMEO

Abstract

Reservoir computing is a machine learning approach that can generate a surrogate model of a dynamical system. It can learn the underlying dynamical system using fewer trainable parameters and, hence, smaller training data sets than competing approaches. Recently, a simpler formulation, known as next-generation reservoir computing, removed many algorithm metaparameters and identified a well-performing traditional reservoir computer, thus simplifying training even further. Here, we study a particularly challenging problem of learning a dynamical system that has both disparate time scales and multiple coexisting dynamical states (attractors). We compare the next-generation and traditional reservoir computers using metrics quantifying the geometry of the ground-truth and forecasted attractors. For the studied four-dimensional system, the next-generation reservoir computing approach uses [Formula: see text] less training data, requires [Formula: see text] shorter “warmup” time, has fewer metaparameters, and has an [Formula: see text] higher accuracy in predicting the coexisting attractor characteristics in comparison to a traditional reservoir computer. Furthermore, we demonstrate that it predicts the basin of attraction with high accuracy. This work lends further support to the superior learning ability of this new machine learning algorithm for dynamical systems.
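The next-generation reservoir computing approach described above replaces the random recurrent network of a traditional reservoir computer with an explicit feature vector built from time-delayed copies of the input and their polynomial products, trained with a linear (ridge-regularized) readout. The sketch below illustrates the general technique on the Lorenz '63 system as a stand-in; it is not the paper's four-dimensional system, and the delay depth, time step, and regularization values are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def lorenz_rhs(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Lorenz '63 vector field (illustrative test system).
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z), x * y - beta * z])

def integrate(v0, dt, n):
    # Fourth-order Runge-Kutta integration of the Lorenz system.
    out = np.empty((n, 3))
    v = np.array(v0, dtype=float)
    for i in range(n):
        k1 = lorenz_rhs(v)
        k2 = lorenz_rhs(v + 0.5 * dt * k1)
        k3 = lorenz_rhs(v + 0.5 * dt * k2)
        k4 = lorenz_rhs(v + dt * k3)
        v = v + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        out[i] = v
    return out

def features(X, k=2):
    # NG-RC feature vector: a constant term, k time-delayed copies of
    # the state (linear part), and all unique quadratic monomials of
    # those delayed states.
    n, d = X.shape
    lin = np.hstack([X[k - 1 - j : n - j] for j in range(k)])  # shape (n-k+1, k*d)
    iu = np.triu_indices(k * d)
    quad = (lin[:, :, None] * lin[:, None, :])[:, iu[0], iu[1]]
    const = np.ones((lin.shape[0], 1))
    return np.hstack([const, lin, quad])

dt = 0.025                                   # illustrative time step
traj = integrate([1.0, 1.0, 1.0], dt, 4000)[1000:]  # discard transient
k = 2                                        # delay depth (assumed)
Phi = features(traj[:-1], k)                 # features at times k-1 .. n-2
Y = traj[k:] - traj[k - 1 : -1]              # one-step increments to learn

# Ridge (Tikhonov-regularized) least squares for the linear readout W.
ridge = 1e-6
W = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]), Phi.T @ Y).T

pred = traj[k - 1 : -1] + Phi @ W.T          # one-step predictions
nrmse = np.sqrt(np.mean((pred - traj[k:]) ** 2)) / np.std(traj)
print(f"one-step training NRMSE: {nrmse:.2e}")
```

Because the Lorenz vector field is itself quadratic, the quadratic feature library fits the one-step map very closely; the short "warmup" mentioned in the abstract corresponds to the k delayed samples needed to form the first feature vector, rather than the long transient a traditional reservoir must run to synchronize its internal state.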