Minimum Risk Training for Neural Machine Translation

2015-12-08 · ACL 2016

Shiqi Shen, Yong Cheng, Zhongjun He, Wei He, Hua Wu, Maosong Sun, Yang Liu

Abstract

We propose minimum risk training for end-to-end neural machine translation. Unlike conventional maximum likelihood estimation, minimum risk training is capable of optimizing model parameters directly with respect to arbitrary evaluation metrics, which are not necessarily differentiable. Experiments show that our approach achieves significant improvements over maximum likelihood estimation on a state-of-the-art neural machine translation system across various language pairs. Transparent to architectures, our approach can be applied to more neural networks and potentially benefit more NLP tasks.
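As a rough illustration of the objective the abstract describes, the sketch below computes the expected loss (risk) over a sampled set of candidate translations, using a sharpened, renormalized distribution over that set. The smoothing parameter `alpha`, the candidate sampling, and the use of 1 − sentence-BLEU as the loss are assumptions drawn from the general MRT setup, not details stated in this abstract.

```python
import math

def expected_risk(logprobs, losses, alpha=5e-3):
    """Minimum risk training objective for one source sentence.

    logprobs: model log-probabilities of the sampled candidate
              translations (a subspace of the full output space).
    losses:   per-candidate loss under any evaluation metric,
              e.g. 1 - sentence-level BLEU (need not be differentiable).
    alpha:    hypothetical sharpness hyper-parameter for the
              renormalized distribution Q(y|x) ∝ p(y|x)^alpha.
    """
    # Sharpen and renormalize over the candidate set (log-sum-exp
    # with max-subtraction for numerical stability).
    scaled = [alpha * lp for lp in logprobs]
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    z = sum(weights)
    q = [w / z for w in weights]
    # Risk = expected loss under Q; minimizing it pushes probability
    # mass toward candidates the metric scores well.
    return sum(qi * li for qi, li in zip(q, losses))
```

Because the metric only enters through the per-candidate loss values, the gradient flows through the probabilities `q`, which is why the evaluation metric itself need not be differentiable.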