Chinese NER Using Lattice LSTM
2018-05-05 · ACL 2018
Yue Zhang, Jie Yang
Code
- github.com/jiesutd/LatticeLSTM — official (in paper), PyTorch, ★ 0
- github.com/Houlong66/lattice_lstm_with_pytorch — PyTorch, ★ 24
- github.com/LeeSureman/Batch_Parallel_LatticeLSTM — PyTorch, ★ 0
Abstract
We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon. Compared with character-based methods, our model explicitly leverages word and word sequence information. Compared with word-based methods, lattice LSTM does not suffer from segmentation errors. Gated recurrent cells allow our model to choose the most relevant characters and words from a sentence for better NER results. Experiments on various datasets show that lattice LSTM outperforms both word-based and character-based LSTM baselines, achieving the best results.
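The lattice described in the abstract is built by finding, for each character position, every lexicon word that starts there; these word spans become the extra paths the lattice LSTM gates over. A minimal sketch of that matching step, with a hypothetical `build_lattice` function and a toy lexicon (not taken from the paper's code), might look like:

```python
def build_lattice(chars, lexicon, max_word_len=4):
    """Return (start, end, word) spans for every lexicon word in the sentence.

    chars: list of characters; lexicon: a set of known words.
    This is an illustrative sketch, not the authors' implementation.
    """
    spans = []
    for i in range(len(chars)):
        # Try every candidate word starting at position i, up to max_word_len.
        for j in range(i + 1, min(i + max_word_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word in lexicon:
                spans.append((i, j, word))
    return spans

# Classic segmentation-ambiguity example: 南京市长江大桥
# ("Nanjing City Yangtze River Bridge").
sentence = list("南京市长江大桥")
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}
print(build_lattice(sentence, lexicon))
# Both 市长 ("mayor") and 长江 ("Yangtze") appear as overlapping spans;
# the lattice LSTM's gates, not a hard segmenter, decide between them.
```

Because overlapping spans such as 南京市 / 市长 / 长江 all survive into the lattice, the model avoids committing to a single (possibly wrong) segmentation, which is the failure mode of word-based methods noted in the abstract.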