
Modal Dependency Parsing via Language Model Priming

2022-07-01 · NAACL 2022 · Code Available

Jiarui Yao, Nianwen Xue, Bonan Min


Abstract

The task of modal dependency parsing aims to parse a text into its modal dependency structure, a representation of the factuality of the events in the text. We design a modal dependency parser based on priming pre-trained language models, and evaluate the parser on two data sets. Compared to baselines, we show an improvement of 2.6% in F-score for English and 4.6% for Chinese. To the best of our knowledge, this is also the first work on Chinese modal dependency parsing.
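The abstract mentions priming a pre-trained language model for parsing. A common way to prime an encoder for a per-event prediction task is to mark the candidate event mention with special tokens and repeat it after a separator, so the model's attention is drawn to that mention. The sketch below illustrates only this input-construction step; the marker tokens (`<e>`, `</e>`), the separator, and the function name are illustrative assumptions, not the authors' actual implementation.

```python
def build_primed_input(tokens, event_span,
                       marker_open="<e>", marker_close="</e>", sep="[SEP]"):
    """Build a primed input string for one candidate event.

    tokens:     list of word tokens for the sentence/document
    event_span: (start, end) token indices of the event mention,
                start inclusive, end exclusive

    The mention is wrapped in marker tokens in place, and the mention
    itself is appended after a separator as the "prime".
    NOTE: a hypothetical sketch of marker-based priming, not the
    paper's actual preprocessing code.
    """
    start, end = event_span
    marked = (tokens[:start] + [marker_open] + tokens[start:end]
              + [marker_close] + tokens[end:])
    prime = tokens[start:end]
    return " ".join(marked) + " " + sep + " " + " ".join(prime)


# Example: prime the parser toward the modalized event "might rain".
tokens = "She said it might rain".split()
primed = build_primed_input(tokens, (3, 5))
# -> "She said it <e> might rain </e> [SEP] might rain"
```

The resulting string would then be fed to the pre-trained encoder, with one primed copy of the text per candidate event, so each forward pass scores parent and modal-relation choices for that specific event.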
