Enhanced Transformer architecture for in-context learning of dynamical systems

2024-10-04

Matteo Rufolo, Dario Piga, Gabriele Maroni, Marco Forgione


Abstract

Recently introduced by some of the authors, the in-context identification paradigm aims at estimating, offline and based on synthetic data, a meta-model that describes the behavior of a whole class of systems. Once trained, this meta-model is fed with an observed input/output sequence (context) generated by a real system to predict its behavior in a zero-shot learning fashion. In this paper, we enhance the original meta-modeling framework through three key innovations: by formulating the learning task within a probabilistic framework; by managing non-contiguous context and query windows; and by adopting recurrent patching to effectively handle long context sequences. The efficacy of these modifications is demonstrated through a numerical example focusing on the Wiener-Hammerstein system class, highlighting the model's enhanced performance and scalability.
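One of the three innovations, patching of long context sequences, can be illustrated with a minimal sketch. The helper below is hypothetical (the function name, the `patch_len` parameter, and the toy linear system are not from the paper): it shows how a long observed input/output context could be grouped into fixed-length patches so that a context of length T becomes only T // patch_len tokens for the Transformer meta-model, which is the basic motivation for patching long sequences.

```python
import numpy as np

def patch_sequence(u, y, patch_len=16):
    """Split a long input/output context into fixed-length patches.

    Hypothetical illustration: each patch stacks `patch_len` consecutive
    (u, y) samples into one token-like vector, shrinking the sequence
    length seen by the attention layers by a factor of `patch_len`.
    """
    n_patches = len(u) // patch_len
    # interleave input and output samples, then fold into patches
    uy = np.stack([u[:n_patches * patch_len], y[:n_patches * patch_len]], axis=-1)
    return uy.reshape(n_patches, patch_len * 2)

# toy context: a simple linear system standing in for a Wiener-Hammerstein one
rng = np.random.default_rng(0)
u = rng.standard_normal(1024)
y = np.convolve(u, [0.5, 0.3, 0.2])[:1024]
tokens = patch_sequence(u, y, patch_len=16)
print(tokens.shape)  # (64, 32): 64 tokens instead of 1024 time steps
```

The paper's recurrent patching additionally processes patches with a recurrent encoder rather than a flat reshape; the sketch above only conveys the sequence-length reduction.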
