
Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention

2016-05-30

Yang Liu, Chengjie Sun, Lei Lin, Xiaolong Wang

Abstract

In this paper, we propose a sentence encoding-based model for recognizing text entailment. In our approach, sentence encoding is a two-stage process. First, average pooling over word-level bidirectional LSTM (biLSTM) states generates a first-stage sentence representation. Second, an attention mechanism replaces average pooling on the same sentence to produce a better representation. Instead of using the target sentence to attend over words in the source sentence, we use the sentence's own first-stage representation to attend over its words, which we call "Inner-Attention". Experiments conducted on the Stanford Natural Language Inference (SNLI) Corpus demonstrate the effectiveness of the "Inner-Attention" mechanism. With fewer parameters, our model outperforms the existing best sentence encoding-based approach by a large margin.
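The abstract describes a two-stage encoder: mean pooling over biLSTM states, followed by attention in which the pooled vector attends over the sentence's own word-level states. The sketch below is one plausible reading of that description using standard additive attention; the module name, weight names, and dimensions are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InnerAttentionEncoder(nn.Module):
    """Two-stage sentence encoder: biLSTM + mean pooling, then
    attention over the biLSTM states guided by the pooled vector."""

    def __init__(self, embed_dim: int, hidden_dim: int):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        d = 2 * hidden_dim                      # biLSTM output size
        self.W_y = nn.Linear(d, d, bias=False)  # projects each hidden state
        self.W_h = nn.Linear(d, d, bias=False)  # projects the pooled summary
        self.w = nn.Linear(d, 1, bias=False)    # scores each position

    def forward(self, embedded):          # embedded: (batch, seq_len, embed_dim)
        Y, _ = self.bilstm(embedded)      # Y: (batch, seq_len, 2*hidden_dim)
        # Stage 1: average pooling over time -> first-stage representation.
        r_avg = Y.mean(dim=1)             # (batch, 2*hidden_dim)
        # Stage 2: "Inner-Attention" -- the sentence's own summary attends
        # over its word-level states instead of a target sentence.
        M = torch.tanh(self.W_y(Y) + self.W_h(r_avg).unsqueeze(1))
        alpha = F.softmax(self.w(M).squeeze(-1), dim=1)    # (batch, seq_len)
        r_att = torch.bmm(alpha.unsqueeze(1), Y).squeeze(1)
        return r_att                      # attention-weighted sentence vector

# Usage: encode premise and hypothesis separately, then combine the two
# vectors (e.g. concatenation) for three-way NLI classification.
encoder = InnerAttentionEncoder(embed_dim=300, hidden_dim=300)
tokens = torch.randn(8, 20, 300)          # a batch of embedded sentences
sentence_vec = encoder(tokens)            # shape: (8, 600)
```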

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---------|-------|--------|---------|----------|--------|
| SNLI | 600D (300+300) BiLSTM encoders with intra-attention and symbolic preproc. | % Test Accuracy | 85 | — | Unverified |
| SNLI | 600D (300+300) BiLSTM encoders with intra-attention | % Test Accuracy | 84.2 | — | Unverified |
| SNLI | 600D (300+300) BiLSTM encoders | % Test Accuracy | 83.3 | — | Unverified |
