Which *BERT? A Survey Organizing Contextualized Encoders

2020-10-02 · EMNLP 2020

Patrick Xia, Shijie Wu, Benjamin Van Durme

Abstract

Pretrained contextualized text encoders are now a staple of the NLP community. We present a survey on language representation learning with the aim of consolidating a series of shared lessons learned across a variety of recent efforts. While significant advancements continue at a rapid pace, we find that enough has now been discovered, in different directions, that we can begin to organize advances according to common themes. Through this organization, we highlight important considerations when interpreting recent contributions and choosing which model to use.