Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization

2018-11-02 · ACL 2017 · Code Available

Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun


Abstract

Although neural machine translation has made significant progress recently, how to integrate multiple overlapping, arbitrary prior knowledge sources remains a challenge. In this work, we propose to use posterior regularization to provide a general framework for integrating prior knowledge into neural machine translation. We represent prior knowledge sources as features in a log-linear model, which guides the learning process of the neural translation model. Experiments on Chinese-English translation show that our approach leads to significant improvements.
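The abstract describes the core mechanism: prior knowledge sources become feature functions in a log-linear distribution Q(y|x), and the NMT model's posterior P(y|x) is regularized toward Q during training. A minimal sketch of that idea, assuming toy feature functions, an illustrative candidate set, and hypothetical names throughout (this is not the paper's implementation):

```python
import math

def length_feature(src, cand):
    # Prior knowledge: a translation's length should roughly match the source's.
    return -abs(len(src) - len(cand))

def lexicon_feature(src, cand, lexicon):
    # Prior knowledge: reward candidates containing dictionary translations.
    return sum(1.0 for w in src if lexicon.get(w) in cand)

def log_linear_q(src, candidates, lexicon, weights):
    """Q(y|x) ∝ exp(w1*f1(x,y) + w2*f2(x,y)): the prior-knowledge distribution."""
    scores = [weights[0] * length_feature(src, y)
              + weights[1] * lexicon_feature(src, y, lexicon)
              for y in candidates]
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def kl(q, p):
    """KL(Q || P): the posterior-regularization penalty added to the NMT loss."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

# Toy Chinese-English example with a tiny bilingual dictionary.
src = ["ta", "xihuan", "mao"]
lexicon = {"ta": "he", "xihuan": "likes", "mao": "cats"}
candidates = [["he", "likes", "cats"],
              ["cats"],
              ["he", "likes", "dogs", "a", "lot"]]
q = log_linear_q(src, candidates, lexicon, weights=(0.5, 1.0))
p_model = [0.2, 0.5, 0.3]   # stand-in for the NMT posterior over the candidates
penalty = kl(q, p_model)    # training would minimize NLL + lam * penalty
```

In this sketch the candidate agreeing with both priors (matching length, all dictionary entries present) receives the highest probability under Q, so minimizing the KL term pushes the translation model toward it; in the paper the same role is played by task-specific knowledge features with learned log-linear weights.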
