How does Punctuation Affect Neural Models in Natural Language Inference

2020-06-01 · PaM 2020

Adam Ek, Jean-Philippe Bernardy, Stergios Chatzikyriakidis

Abstract

Natural Language Inference models have reached almost human-level performance, but their generalisation capabilities have not yet been fully characterized. In particular, sensitivity to small changes in the data is a current area of investigation. In this paper, we focus on the effect of punctuation on such models. Our findings can be broadly summarized as follows: (1) irrelevant changes in punctuation are correctly ignored by the recent transformer models (BERT), while older RNN-based models were sensitive to them; (2) all models, both transformers and RNN-based models, are incapable of taking into account small relevant changes in the punctuation.