SOTAVerified

Dynamic Self-Attention : Computing Attention over Words Dynamically for Sentence Embedding

2018-08-22 · Code Available

Deunsol Yoon, Dongbok Lee, SangKeun Lee


Abstract

In this paper, we propose Dynamic Self-Attention (DSA), a new self-attention mechanism for sentence embedding. We design DSA by adapting the dynamic routing of capsule networks (Sabour et al., 2017) to natural language processing. DSA attends to informative words with a dynamic weight vector. We achieve new state-of-the-art results among sentence encoding methods on the Stanford Natural Language Inference (SNLI) dataset with the fewest parameters, while showing competitive results on the Stanford Sentiment Treebank (SST) dataset.
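The abstract describes attention weights that are refined by a dynamic-routing-style iteration rather than computed in a single pass. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's implementation: following the routing scheme of Sabour et al. (2017), routing logits start at zero, attention weights are a softmax over them, a candidate sentence vector is formed, and each word's logit is increased by its agreement (dot product) with that vector. The function name, the `tanh` squashing, and the iteration count are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dynamic_self_attention(H, n_iter=3):
    """Dynamic-routing-style attention over word vectors (illustrative sketch).

    H: (n_words, dim) array of word representations.
    Returns a (dim,) sentence vector and the attention weights over words.
    """
    n, d = H.shape
    b = np.zeros(n)            # routing logits, initialized to zero
    z = np.zeros(d)
    a = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        a = softmax(b)         # dynamic attention weights over words
        z = np.tanh(a @ H)     # candidate sentence vector (squashed; assumed nonlinearity)
        b = b + H @ z          # agreement update, as in dynamic routing
    return z, a
```

With this update rule, words whose vectors agree with the emerging sentence vector accumulate larger logits, so the mechanism concentrates attention on informative words over the iterations.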

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| SNLI | 2400D Multiple-Dynamic Self-Attention Model | % Test Accuracy | 87.4 | — | Unverified |
| SNLI | 600D Dynamic Self-Attention Model | % Test Accuracy | 86.8 | — | Unverified |

Reproductions