
Learning Beyond Experience: Generalizing to Unseen State Space with Reservoir Computing

2025-06-05

Declan A. Norton, Yuanzhao Zhang, Michelle Girvan


Abstract

Machine learning techniques offer an effective approach to modeling dynamical systems solely from observed data. However, without explicit structural priors -- built-in assumptions about the underlying dynamics -- these techniques typically struggle to generalize to aspects of the dynamics that are poorly represented in the training data. Here, we demonstrate that reservoir computing -- a simple, efficient, and versatile machine learning framework often used for data-driven modeling of dynamical systems -- can generalize to unexplored regions of state space without explicit structural priors. First, we describe a multiple-trajectory training scheme for reservoir computers (RCs) that supports training across a collection of disjoint time series, enabling effective use of available training data. Then, applying this training scheme to multistable dynamical systems, we show that RCs trained on trajectories from a single basin of attraction can achieve out-of-domain generalization by capturing system behavior in entirely unobserved basins.
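To make the multiple-trajectory idea concrete, here is a minimal echo-state-network sketch in which the reservoir is driven separately by each disjoint trajectory (resetting its state each time), the resulting states are pooled, and a single ridge-regression readout is fit over all of them. This is only an illustration of the concept from the abstract, not the authors' exact method; the hyperparameters (`N`, spectral radius, `washout`, `ridge`) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 200, 1                                  # reservoir size, input dimension
W_in = rng.uniform(-0.5, 0.5, (N, d))          # input weights
W = rng.normal(0.0, 1.0, (N, N))               # recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9

def run_reservoir(u):
    """Drive the reservoir with one trajectory u (shape T x d), state reset first."""
    r = np.zeros(N)
    states = []
    for u_t in u:
        r = np.tanh(W @ r + W_in @ u_t)
        states.append(r.copy())
    return np.array(states)

def train_multi(trajectories, washout=50, ridge=1e-6):
    """Pool reservoir states from disjoint trajectories, then fit one readout."""
    R, Y = [], []
    for u in trajectories:
        states = run_reservoir(u)
        # one-step-ahead targets; drop a washout window so the transient
        # caused by each state reset is excluded from the regression
        R.append(states[washout:-1])
        Y.append(u[washout + 1:])
    R, Y = np.vstack(R), np.vstack(Y)
    # ridge regression: W_out = (R^T R + ridge I)^{-1} R^T Y
    return np.linalg.solve(R.T @ R + ridge * np.eye(N), R.T @ Y)

# toy usage: two disjoint segments of the same sine wave
ts = [np.sin(0.1 * np.arange(400))[:, None],
      np.sin(0.1 * np.arange(1000, 1400))[:, None]]
W_out = train_multi(ts)
```

The key design point is that states and targets from all trajectories enter one least-squares problem, so the readout is shared, while the state reset plus washout keeps each trajectory's transient from contaminating the fit.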
