Deep Attentive Sentence Ordering Network
2018-10-01 · EMNLP 2018
Baiyun Cui, Yingming Li, Ming Chen, Zhongfei Zhang
Abstract
In this paper, we propose a novel deep attentive sentence ordering network (referred to as ATTOrderNet) which integrates a self-attention mechanism with LSTMs in the encoding of input sentences. This design captures global dependencies among the sentences regardless of their input order and yields a reliable representation of the sentence set. From this representation, a pointer network generates an ordered sequence. The proposed model is evaluated on the Sentence Ordering and Order Discrimination tasks, and extensive experimental results demonstrate its effectiveness and superiority over state-of-the-art methods.
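The key property claimed in the abstract, that self-attention followed by pooling yields a sentence-set representation independent of input order, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation (which uses LSTM sentence encoders and multi-head attention); the function names and single-head scaled dot-product form here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable row-wise softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_pool(S):
    """Toy single-head self-attention over a set of sentence embeddings.

    S: (n, d) array, one row per sentence embedding.
    Returns a (d,) set representation that is invariant to row order:
    attention relates every sentence to every other, and mean pooling
    discards the ordering of the attended vectors.
    """
    d = S.shape[1]
    scores = S @ S.T / np.sqrt(d)      # pairwise sentence dependencies
    A = softmax(scores, axis=-1)       # attention weights over sentences
    H = A @ S                          # attended sentence vectors
    return H.mean(axis=0)              # order-invariant pooled representation

# Shuffling the sentences leaves the pooled representation unchanged,
# which is why the downstream pointer network can recover an order
# without being biased by the order the sentences arrived in.
rng = np.random.default_rng(0)
S = rng.normal(size=(5, 8))
perm = rng.permutation(5)
assert np.allclose(self_attention_pool(S), self_attention_pool(S[perm]))
```

In the full model, this pooled vector initializes a pointer-network decoder that selects sentences one at a time to produce the predicted order.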