On Training Bi-directional Neural Network Language Model with Noise Contrastive Estimation
2016-02-19
Tianxing He, Yu Zhang, Jasha Droppo, Kai Yu
- Code (official, linked in paper): bitbucket.org/cloudygoose/ptb_rescore
Abstract
We propose to train a bi-directional neural network language model (NNLM) with noise contrastive estimation (NCE). Experiments are conducted on a rescoring task on the Penn Treebank (PTB) data set. We show that the NCE-trained bi-directional NNLM outperforms the one trained by conventional maximum likelihood estimation. However, it regretfully still does not outperform the baseline uni-directional NNLM.
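As background for the training objective named in the abstract, the following is a minimal sketch of the per-word NCE loss (not code from the paper's repository): the model is trained as a binary classifier to separate the observed word from k samples drawn from a noise distribution q. All function and variable names here are illustrative assumptions.

```python
import math

def log_sigmoid(x):
    # Numerically stable log(sigmoid(x)).
    return -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))

def nce_loss(target_score, noise_scores, log_q_target, log_q_noise, k):
    """NCE loss for one target word.

    target_score : unnormalized model log-score s(w) of the observed word.
    noise_scores : model log-scores of the k sampled noise words.
    log_q_target / log_q_noise : log-probabilities of those words under the
        noise distribution q (typically the unigram distribution).
    The classifier margin is Delta(w) = s(w) - log(k * q(w)).
    """
    # Observed word should be classified as "data" (label 1).
    loss = -log_sigmoid(target_score - math.log(k) - log_q_target)
    # Noise samples should be classified as "noise" (label 0).
    for s, lq in zip(noise_scores, log_q_noise):
        loss -= log_sigmoid(-(s - math.log(k) - lq))
    return loss
```

Because the scores are used unnormalized, NCE avoids the full softmax over the vocabulary at training time, which is the usual motivation for using it with large-vocabulary NNLMs.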