
Dense Information Flow for Neural Machine Translation

2018-06-03 · NAACL 2018 · Code Available

Yanyao Shen, Xu Tan, Di He, Tao Qin, Tie-Yan Liu


Abstract

Recently, neural machine translation has achieved remarkable progress by introducing well-designed deep neural networks into its encoder-decoder framework. From the optimization perspective, most of these deep architectures adopt residual connections to improve learning for both the encoder and the decoder, and advanced attention connections are applied as well. Inspired by the success of the DenseNet model in computer vision, in this paper we propose a densely connected NMT architecture (DenseNMT) that trains more efficiently. DenseNMT not only uses dense connections when creating new features in both the encoder and the decoder, but also applies a dense attention structure to improve attention quality. Our experiments on multiple datasets show that the DenseNMT structure is more competitive and efficient.
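The dense connections described above follow the DenseNet pattern: each layer receives the concatenation of the token embeddings and every preceding layer's output, so earlier features flow directly to all later layers. Below is a minimal PyTorch sketch of this idea for an encoder stack; it is not the authors' implementation, and the per-layer transform, dimensions, and names are illustrative assumptions only.

```python
# Minimal sketch (assumed, not the paper's code): DenseNet-style dense
# connections over a stack of encoder layers. Each layer consumes the
# concatenation of the embeddings and all previous layers' outputs.
import torch
import torch.nn as nn

class DenseEncoder(nn.Module):
    def __init__(self, embed_dim=256, growth_dim=128, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        in_dim = embed_dim
        for _ in range(num_layers):
            # Hypothetical per-layer transform; the paper uses richer
            # convolution/attention blocks, a linear layer stands in here.
            self.layers.append(nn.Sequential(nn.Linear(in_dim, growth_dim), nn.ReLU()))
            in_dim += growth_dim  # dense connection: input width grows by concatenation

    def forward(self, x):
        # x: (batch, seq_len, embed_dim) token embeddings
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=-1))
            features.append(out)             # keep every layer's output
        return torch.cat(features, dim=-1)   # dense information flow to later stages

if __name__ == "__main__":
    enc = DenseEncoder()
    src = torch.randn(2, 7, 256)
    print(enc(src).shape)  # torch.Size([2, 7, 768]) = 256 + 4 * 128
```

The same concatenation pattern can be applied on the decoder side and to the attention inputs, which is what the paper refers to as dense attention.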

Tasks

Machine Translation

Benchmark Results

Dataset | Model | Metric | Claimed | Verified | Status
WMT2014 English-German | DenseNMT | BLEU score | 25.52 | n/a | Unverified

Reproductions