
Multi-level Head-wise Match and Aggregation in Transformer for Textual Sequence Matching

2020-01-20

Shuohang Wang, Yunshi Lan, Yi Tay, Jing Jiang, Jingjing Liu

Abstract

The Transformer has been successfully applied to many natural language processing tasks. However, for textual sequence matching, simple matching between the representations of a pair of sequences may introduce unnecessary noise. In this paper, we propose a new approach to sequence pair matching with the Transformer, by learning head-wise matching representations on multiple levels. Experiments show that our proposed approach achieves new state-of-the-art performance on multiple tasks that rely only on pre-computed sequence vector representations, such as SNLI, MNLI-match, MNLI-mismatch, QQP, and SQuAD-binary.
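To make the idea concrete, here is a minimal NumPy sketch of head-wise matching with aggregation across levels. All names, shapes, and the specific matching and pooling functions (`head_wise_match`, element-wise product plus absolute difference, max-over-heads then mean-over-levels) are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def head_wise_match(h_a, h_b):
    """Match two sequences head by head.

    h_a, h_b: per-head pooled representations of the two sequences,
    shape (num_heads, d_head). (Assumption: one pooled vector per
    attention head for each sequence.)
    Returns a per-head match tensor of shape (num_heads, 2 * d_head),
    built from element-wise product and absolute difference -- a
    common, but here purely illustrative, matching function.
    """
    return np.concatenate([h_a * h_b, np.abs(h_a - h_b)], axis=-1)

def aggregate(matches):
    """Aggregate head-wise matches from multiple Transformer levels.

    matches: list of (num_heads, 2 * d_head) arrays, one per layer.
    Illustrative aggregation: max-pool over heads within each level,
    then average over levels, yielding a single match vector.
    """
    per_level = [m.max(axis=0) for m in matches]
    return np.mean(per_level, axis=0)
```

A classifier (e.g. a small feed-forward layer) would then map the aggregated match vector to an entailment or paraphrase label; the key point the sketch illustrates is that heads are matched individually before any pooling, rather than matching one monolithic sequence vector.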
