
Variational Adaptive Noise and Dropout towards Stable Recurrent Neural Networks

2025-06-02

Taisuke Kobayashi, Shingo Murata


Abstract

This paper proposes a novel stable learning theory for recurrent neural networks (RNNs), called variational adaptive noise and dropout (VAND). Previous studies have confirmed that noise and dropout on the internal state of an RNN each act as stabilizing factors, but only separately. We reinterpret the optimization problem of RNNs as variational inference, showing that noise and dropout can be derived simultaneously by transforming the explicit regularization term arising in the optimization problem into implicit regularization. The noise scale and dropout ratio can also be adjusted appropriately to optimize the main objective of the RNN. In an imitation learning scenario with a mobile manipulator, only VAND is able to imitate the sequential and periodic behaviors as instructed. Video: https://youtu.be/UOho3Xr6A2w
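As a concrete illustration of the idea described above, the sketch below applies additive Gaussian noise with a learnable scale and dropout with a learnable ratio to the hidden state of a recurrent cell. Everything here is an assumption made for illustration, not the authors' implementation: the class name NoisyDropoutRNNCell, the choice of a GRU cell, the parameterizations log_noise_scale and dropout_logit, and the concrete (relaxed Bernoulli) surrogate used to keep the dropout ratio differentiable. The paper's actual variational derivation is not reproduced here.

```python
# Hypothetical sketch: adaptive noise and dropout on an RNN's hidden state.
# Names and parameterizations are assumptions; this is not the VAND code.
import torch
import torch.nn as nn


class NoisyDropoutRNNCell(nn.Module):
    """GRU cell whose hidden state is perturbed by learnable noise and dropout.

    log_noise_scale and dropout_logit are free parameters standing in for the
    noise scale and dropout ratio that VAND is said to adjust during training.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.log_noise_scale = nn.Parameter(torch.tensor(-2.0))
        self.dropout_logit = nn.Parameter(torch.tensor(-2.0))

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        h = self.cell(x, h)
        if self.training:
            # Additive Gaussian noise with a learnable scale (reparameterized,
            # so gradients flow to log_noise_scale).
            h = h + self.log_noise_scale.exp() * torch.randn_like(h)
            # A hard Bernoulli dropout mask is not differentiable w.r.t. the
            # ratio, so a concrete/relaxed mask (temperature 0.1) is one
            # common surrogate; the mask approaches Bernoulli(1 - p).
            p = torch.sigmoid(self.dropout_logit)
            u = torch.rand_like(h).clamp(1e-6, 1 - 1e-6)
            mask = torch.sigmoid((torch.log(u) - torch.log(1 - u)
                                  + torch.log(1 - p) - torch.log(p)) / 0.1)
            h = h * mask / (1 - p)  # inverted-dropout rescaling
        return h
```

Training such a cell end to end on the task loss lets the optimizer move the noise scale and dropout ratio jointly with the network weights, which mirrors the abstract's claim that both quantities are adjusted to optimize the RNN's main objective.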
