SOTAVerified

Two Heads are Better than One? Verification of Ensemble Effect in Neural Machine Translation

2021-11-01 · EMNLP (Insights) 2021

Chanjun Park, Sungjin Park, Seolhwa Lee, Taesun Whang, Heuiseok Lim

Abstract

In natural language processing, ensembles are widely known to improve performance. This paper analyzes how ensembles of neural machine translation (NMT) models affect performance by designing various experimental setups (i.e., intra-ensemble, inter-ensemble, and non-convergence ensemble). For an in-depth examination, we analyze each ensemble method with respect to several aspects, such as different attention models and vocabulary strategies. Experimental results show that ensembling does not always result in performance gains, and we report noteworthy negative findings.
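To make the idea concrete, the standard way to ensemble NMT models at inference time is to average each model's next-token probability distribution before picking a token. The sketch below is a minimal, hypothetical illustration of that averaging step (the function names and toy logits are assumptions, not the paper's code):

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the vocabulary axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_next_token(per_model_logits):
    """Average each model's next-token distribution, then take the argmax.

    per_model_logits: list of 1-D arrays (one per model), each of shape
    (vocab_size,). Illustrative only; the paper's actual decoding setup
    is not shown on this page.
    """
    probs = np.mean([softmax(l) for l in per_model_logits], axis=0)
    return int(np.argmax(probs))

# Toy example: two "models" disagreeing over a 4-token vocabulary.
m1 = np.array([2.0, 1.0, 0.1, 0.0])  # model 1 favors token 0
m2 = np.array([0.0, 2.5, 0.1, 0.0])  # model 2 strongly favors token 1
print(ensemble_next_token([m1, m2]))  # → 1
```

Because model 2's preference for token 1 is sharper than model 1's preference for token 0, the averaged distribution selects token 1. The paper's negative findings concern when this kind of averaging fails to help, not the mechanism itself.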
