
Interactively-Propagative Attention Learning for Implicit Discourse Relation Recognition

2020-12-01 · COLING 2020

Huibin Ruan, Yu Hong, Yang Xu, Zhen Huang, Guodong Zhou, Min Zhang


Abstract

We tackle implicit discourse relation recognition. Both self-attention and interactive-attention mechanisms have been applied to attention-aware representation learning, improving current discourse analysis models. To take advantage of the two attention mechanisms simultaneously, we develop a propagative attention learning model using a cross-coupled two-channel network. We experiment on the Penn Discourse Treebank. The test results demonstrate that our model yields substantial improvements over the baselines (BiLSTM and BERT).
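As a rough illustration of the idea in the abstract, the sketch below couples self-attention (within one discourse argument) and interactive attention (across the two arguments) in a two-channel update, one channel per argument. Everything here is an assumption for exposition: the function names, the single propagation step, and the fixed 0.5 mixing weights are illustrative and are not taken from the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # attend among tokens of the same argument; x: (n_tokens, dim)
    scores = x @ x.T / np.sqrt(x.shape[-1])
    return softmax(scores) @ x

def interactive_attention(x, y):
    # attend from tokens of one argument to tokens of the other
    scores = x @ y.T / np.sqrt(x.shape[-1])
    return softmax(scores) @ y

def coupled_step(arg1, arg2):
    # one cross-coupled propagation step: each channel mixes its own
    # self-attentive view with an interactive view of the other channel
    # (0.5/0.5 mixing is an arbitrary illustrative choice)
    new1 = 0.5 * self_attention(arg1) + 0.5 * interactive_attention(arg1, arg2)
    new2 = 0.5 * self_attention(arg2) + 0.5 * interactive_attention(arg2, arg1)
    return new1, new2
```

In this toy setup, iterating `coupled_step` would let information propagate back and forth between the two argument representations, which is the intuition behind combining the two attention mechanisms rather than using either alone.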
