Multi-task Attention-based Neural Networks for Implicit Discourse Relationship Representation and Identification

2017-09-01 · EMNLP 2017

Man Lan, Jianxiang Wang, Yuanbin Wu, Zheng-Yu Niu, Haifeng Wang

Abstract

We present a novel multi-task attention-based neural network model for implicit discourse relationship representation and identification, built on two types of representation learning: an attention-based neural network that learns a discourse relationship representation from the two arguments, and a multi-task framework that learns knowledge from both annotated and unannotated corpora. Extensive experiments were performed on two benchmark corpora (i.e., the PDTB and CoNLL-2016 datasets). The results show that our proposed model outperforms state-of-the-art systems on both benchmarks.
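To make the first component concrete, here is a minimal sketch of how an attention-based representation over two discourse arguments might look. This is not the authors' implementation: the cross-attention form, the mean pooling, and all names and dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(arg1, arg2):
    """Cross-attention: each token of arg1 attends over arg2, and vice versa.
    arg1: (len1, dim) token embeddings; arg2: (len2, dim)."""
    scores = arg1 @ arg2.T                        # (len1, len2) similarities
    arg1_ctx = softmax(scores, axis=1) @ arg2     # arg2-aware view of arg1
    arg2_ctx = softmax(scores.T, axis=1) @ arg1   # arg1-aware view of arg2
    return arg1_ctx, arg2_ctx

def relation_repr(arg1, arg2):
    """Pool the attended views into one fixed-size relation vector,
    which a downstream classifier would map to a discourse sense."""
    c1, c2 = attend(arg1, arg2)
    return np.concatenate([c1.mean(axis=0), c2.mean(axis=0)])

# Toy example: two arguments of different lengths, embedding dim 8.
rng = np.random.default_rng(0)
a1 = rng.standard_normal((5, 8))
a2 = rng.standard_normal((7, 8))
rep = relation_repr(a1, a2)
print(rep.shape)  # (16,)
```

In the multi-task setting described above, the same shared representation would feed several task-specific output layers (e.g., one per annotation scheme or corpus), so that signal from unannotated or differently annotated data regularizes the shared encoder.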
