SOTAVerified

Reproducing and Regularizing the SCRN Model

2018-08-01 · COLING 2018 · Code Available

Olzhas Kabdolov, Zhenisbek Assylbekov, Rustem Takhanov


Abstract

We reproduce the Structurally Constrained Recurrent Network (SCRN) model and then regularize it using existing widespread techniques such as naive dropout, variational dropout, and weight tying. We show that, when regularized and optimized appropriately, the SCRN model can achieve performance comparable to the ubiquitous LSTM model on a language modeling task on English data, while outperforming it on non-English data.
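The distinction between the two dropout variants named above can be sketched in a few lines of numpy. This is an illustrative sketch, not the authors' implementation: the function names and shapes are assumptions. Naive dropout resamples a fresh mask at every timestep of the recurrent sequence, whereas variational dropout samples one mask and reuses it across all timesteps.

```python
import numpy as np

def naive_dropout(x, p, rng):
    # x: (timesteps, features). A fresh mask per element, i.e. per timestep.
    keep = 1.0 - p
    mask = (rng.random(x.shape) < keep) / keep  # inverted dropout scaling
    return x * mask

def variational_dropout(x, p, rng):
    # One mask over the feature dimension, reused at every timestep.
    keep = 1.0 - p
    mask = (rng.random(x.shape[1]) < keep) / keep
    return x * mask  # broadcasts the single mask across all timesteps

x = np.ones((5, 4))  # toy hidden-state sequence: 5 timesteps, 4 features
rng = np.random.default_rng(0)
y = variational_dropout(x, 0.5, rng)
# Every timestep of y sees the identical mask; surviving units are scaled by 1/keep.
```

Weight tying, the third technique mentioned, simply shares one parameter matrix between the input embedding and the output softmax projection, so it adds no computation, only a constraint.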
