Sequence Classification with Human Attention
2018-10-01 · CoNLL 2018 · Code Available
Maria Barrett, Joachim Bingel, Nora Hollenstein, Marek Rei, Anders Søgaard
- Official code: github.com/coastalcph/Sequence_classification_with_human_attention (TensorFlow)
Abstract
Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior, and in this paper, we show that human attention really does provide a good inductive bias on many attention functions in NLP. Specifically, we use estimated human attention derived from eye-tracking corpora to regularize attention functions in recurrent neural networks. We show substantial improvements across a range of tasks, including sentiment analysis, grammatical error detection, and detection of abusive language.
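The core idea above, using estimated human attention as an inductive bias, can be illustrated with a small sketch: add a penalty to the task loss that pulls the model's attention distribution toward a human attention distribution (e.g. normalized fixation durations from an eye-tracking corpus). The function names, the `lam` weight, and the use of mean squared error as the divergence are illustrative assumptions, not the paper's exact formulation.

```python
import math

def softmax(scores):
    """Turn unnormalized attention scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_regularized_loss(task_loss, model_scores, human_attention, lam=0.1):
    """Task loss plus a penalty for diverging from estimated human attention.

    model_scores: unnormalized per-token attention scores from the model
    human_attention: estimated human attention over the same tokens (sums to 1)
    lam: weight of the regularization term (illustrative value)
    """
    model_attention = softmax(model_scores)
    # Mean squared error between the two distributions; a simple stand-in
    # for whatever divergence a specific model might minimize.
    penalty = sum((m - h) ** 2
                  for m, h in zip(model_attention, human_attention))
    penalty /= len(model_attention)
    return task_loss + lam * penalty
```

When the model's attention already matches the human distribution, the penalty vanishes and the loss reduces to the task loss alone; otherwise the extra term nudges training toward human-like attention, which is the regularization effect the abstract describes.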