
Residual Reservoir Memory Networks

2026-01-29

Matteo Pinna, Andrea Ceni, Claudio Gallicchio


Abstract

We introduce a novel class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) paradigm, called Residual Reservoir Memory Networks (ResRMNs). A ResRMN combines a linear memory reservoir with a non-linear reservoir, where the latter uses residual orthogonal connections along the temporal dimension for enhanced long-term propagation of the input. We study the resulting reservoir state dynamics through the lens of linear stability analysis and investigate diverse configurations of the temporal residual connections. The proposed approach is empirically assessed on time-series and pixel-level 1-D classification tasks. Our experimental results highlight the advantages of the proposed approach over conventional RC models.
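To make the architecture described above concrete, here is a minimal, hypothetical sketch of what a residual reservoir update might look like, inferred only from the abstract: an untrained non-linear reservoir whose recurrence carries a residual orthogonal connection along time, paired with a linear memory reservoir. All names and coefficients (`W_in`, `W_h`, `Q`, `alpha`, `beta`) are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_h, n_m = 1, 100, 50

# Untrained weights, as in standard Reservoir Computing
W_in = rng.uniform(-0.1, 0.1, (n_h, n_in))          # input-to-reservoir
W_h = rng.normal(0, 1 / np.sqrt(n_h), (n_h, n_h))   # recurrent weights

# Random orthogonal matrix for the temporal residual branch
Q, _ = np.linalg.qr(rng.normal(size=(n_h, n_h)))

# Linear memory reservoir (illustrative: a simple linear recurrence)
W_m = 0.9 * np.eye(n_m)                              # memory recurrence
V_m = rng.uniform(-0.1, 0.1, (n_m, n_in))            # memory input weights

alpha, beta = 0.9, 0.1  # branch mixing coefficients (assumed, not from the paper)

def step(h, m, x):
    """One time step: residual orthogonal connection plus non-linear update,
    alongside a purely linear memory reservoir."""
    h_new = alpha * (Q @ h) + beta * np.tanh(W_in @ x + W_h @ h)
    m_new = W_m @ m + V_m @ x
    return h_new, m_new

h, m = np.zeros(n_h), np.zeros(n_m)
for t in range(50):
    x = np.array([np.sin(0.1 * t)])
    h, m = step(h, m, x)

# The concatenated states [h; m] would feed a trained linear readout
state = np.concatenate([h, m])
print(state.shape)
```

In an RC pipeline, only a linear readout on top of such states would be trained; the recurrent weights above stay fixed at initialization.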
