SOTAVerified

Analysing Rescaling, Discretization, and Linearization in RNNs for Neural System Modelling

2023-12-26

Mariano Caruso, Cecilia Jarne


Abstract

Recurrent Neural Networks (RNNs) are widely used for modelling neural activity, yet the mathematical interplay of the core procedures used to analyse them (temporal rescaling, discretization, and linearization) remains uncharacterized. This study establishes the conditions under which these procedures commute, enabling their flexible application in computational neuroscience. We rigorously analyse the mathematical foundations of the three procedures, formalizing their application to continuous-time RNN dynamics governed by differential equations. By deriving the transformed equations under rescaling, discretization, and linearization, we determine commutativity criteria and evaluate their effects on network stability, numerical implementation, and linear approximations. We demonstrate that rescaling and discretization commute when time-step adjustments align with scaling factors. Similarly, linearization and discretization (or rescaling) yield equivalent dynamics regardless of order, provided activation functions operate near equilibrium points. Our findings directly guide the design of biologically plausible RNNs for simulating neural dynamics in decision-making and motor control, where temporal alignment and stability are critical.
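The rescaling-discretization commutativity claimed in the abstract can be checked numerically. The following is a minimal sketch (not the paper's code): it assumes a standard continuous-time RNN of the form tau * dh/dt = -h + W tanh(h) + x and forward-Euler integration, and verifies that rescaling time by a factor alpha before discretizing gives the same trajectory as absorbing alpha into the Euler step size. All weights, inputs, and parameter values here are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
W = rng.normal(size=(n, n)) / np.sqrt(n)  # hypothetical recurrent weight matrix
x = rng.normal(size=n)                    # constant external input (arbitrary)
tau = 1.0                                 # time constant
h0 = rng.normal(size=n)                   # initial state

def f(h):
    # Continuous-time RNN vector field: tau * dh/dt = -h + W tanh(h) + x
    return (-h + W @ np.tanh(h) + x) / tau

def euler(h, dt, steps, scale=1.0):
    # Forward-Euler integration of the (optionally time-rescaled) dynamics,
    # where scale multiplies the vector field (rescaling t -> t / scale).
    for _ in range(steps):
        h = h + dt * scale * f(h)
    return h

alpha, dt, steps = 2.0, 0.01, 500
# Path A: rescale time first (dh/ds = alpha * f(h)), then discretize with step dt.
hA = euler(h0, dt, steps, scale=alpha)
# Path B: discretize first, then align the step with the scaling: dt -> alpha * dt.
hB = euler(h0, alpha * dt, steps)
print(np.allclose(hA, hB))  # True: the two orderings produce the same trajectory
```

The agreement is exact here because the Euler update is linear in the step size, so dt * (alpha * f) and (alpha * dt) * f coincide; this is the "time-step adjustments align with scaling factors" condition in concrete form.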
