Seq2seq Dependency Parsing
Zuchao Li, Jiaxun Cai, Shexia He, Hai Zhao
Code Available
- github.com/bcmi220/seq2seq_parser (official, in paper, PyTorch)
- github.com/2023-MindSpore-1/ms-code-210/tree/main/crnn_seq2seq_ocr (MindSpore)
- github.com/MindSpore-paper-code-3/code6/tree/main/crnn_seq2seq_ocr (MindSpore)
Abstract
This paper presents a sequence-to-sequence (seq2seq) dependency parser that directly predicts the relative position of the head for each given word, yielding a truly end-to-end seq2seq dependency parser for the first time. Exploiting the advantages of seq2seq modeling, we enrich the model with a series of embedding enhancements, including newly introduced subword and node2vec augmentation. In addition, we propose a beam-search decoder with a tree constraint and subroot decomposition over the sequence to further strengthen our seq2seq parser. Our parser is evaluated on benchmark treebanks and is on par with state-of-the-art parsers, achieving 94.11% UAS on PTB and 88.78% UAS on CTB, respectively.
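To illustrate the core idea of predicting each word's head as a relative position, the following is a minimal sketch of how a dependency tree (given as 1-based head indices, with 0 for the root) could be converted into a per-word target sequence of signed offsets. The function name, the `ROOT` label, and the exact offset encoding are assumptions for illustration; the paper's actual target vocabulary may differ.

```python
def heads_to_relative(heads):
    """Convert 1-based head indices into relative-position targets.

    heads[i] is the head of word i+1; 0 marks the syntactic root.
    Each target is the signed offset from the word to its head,
    e.g. "1" means the head is the next word, "-2" two words back.
    (Illustrative encoding; not necessarily the paper's exact scheme.)
    """
    targets = []
    for i, h in enumerate(heads, start=1):
        targets.append("ROOT" if h == 0 else str(h - i))
    return targets

# Example: "He saw a dog" with heads [2, 0, 4, 2]
# "He" -> "saw" (+1), "saw" -> ROOT, "a" -> "dog" (+1), "dog" -> "saw" (-2)
print(heads_to_relative([2, 0, 4, 2]))  # ['1', 'ROOT', '1', '-2']
```

A seq2seq model can then be trained to emit this target sequence token by token, which is what makes the parser end-to-end: decoding the offsets fully determines the unlabeled tree.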