Dynamic Meta-Embeddings for Improved Sentence Representations
2018-04-21 · EMNLP 2018
Douwe Kiela, Changhan Wang, Kyunghyun Cho
Abstract
While one of the first steps in many NLP systems is selecting what pre-trained word embeddings to use, we argue that such a step is better left for neural networks to figure out by themselves. To that end, we introduce dynamic meta-embeddings, a simple yet effective method for the supervised learning of embedding ensembles, which leads to state-of-the-art performance within the same model class on a variety of tasks. We subsequently show how the technique can be used to shed new light on the usage of word embeddings in NLP systems.
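The core idea — projecting several pre-trained embedding sets into a common space and letting a learned attention mechanism mix them per token — can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the embedding sets, dimensions, and the random stand-ins for learned projection matrices and the attention vector are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Two hypothetical pre-trained embedding sets for a 5-token sentence,
# with different dimensionalities (shrunk here for illustration).
emb_a = rng.standard_normal((5, 8))   # set A: 5 tokens x 8 dims
emb_b = rng.standard_normal((5, 6))   # set B: 5 tokens x 6 dims

d = 4  # common projection dimension
# Learned linear projections into the common space (random stand-ins).
P_a = rng.standard_normal((8, d))
P_b = rng.standard_normal((6, d))
proj = np.stack([emb_a @ P_a, emb_b @ P_b])  # (2 sets, 5 tokens, d)

# A learned vector scores each projected embedding; a softmax over the
# ensemble dimension gives per-token mixture weights.
w = rng.standard_normal(d)
scores = proj @ w                     # (2, 5): one score per set per token
alpha = softmax(scores, axis=0)       # weights over the two sets

# Dynamic meta-embedding: per-token weighted sum of the projected sets.
dme = (alpha[..., None] * proj).sum(axis=0)  # (5, d)
assert dme.shape == (5, d)
assert np.allclose(alpha.sum(axis=0), 1.0)
```

In a full model, `P_a`, `P_b`, and `w` would be trained end-to-end with the downstream task, so the network itself learns how much to trust each embedding set for each token.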
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| SNLI | 512D Dynamic Meta-Embeddings | % Test Accuracy | 86.7 | — | Unverified |