
Scalable Cross-Lingual Transfer of Neural Sentence Embeddings

2019-04-11 · SEMEVAL 2019

Hanan Aldarmaki, Mona Diab


Abstract

We develop and investigate several cross-lingual alignment approaches for neural sentence embedding models, such as the supervised inference classifier InferSent and sequential encoder-decoder models. We evaluate three alignment frameworks applied to these models: joint modeling, representation transfer learning, and sentence mapping, using parallel text to guide the alignment. Our results support representation transfer as a scalable approach for modular cross-lingual alignment of neural sentence embeddings: it outperforms joint models in both intrinsic and extrinsic evaluations, particularly with smaller sets of parallel data.
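To make the sentence-mapping framework concrete, the sketch below learns a linear map from source-language to target-language sentence embeddings via least squares over parallel pairs, then applies it to a new sentence. This is a minimal illustration of the general idea, not the paper's implementation: the embeddings here are synthetic toy vectors (in practice they would come from a pretrained encoder such as InferSent), and the exact linear relation is an assumption made so the example is self-checking.

```python
import numpy as np

# Toy setup: paired sentence embeddings from parallel text.
# In practice these come from a pretrained sentence encoder.
rng = np.random.default_rng(0)
d = 4                                 # embedding dimension (toy)
n = 100                               # number of parallel sentence pairs
X_src = rng.normal(size=(n, d))       # source-language embeddings
W_true = rng.normal(size=(d, d))      # hypothetical ground-truth map
X_tgt = X_src @ W_true                # target embeddings (toy: exact linear relation)

# Sentence mapping: least-squares linear map from source to target space.
W, *_ = np.linalg.lstsq(X_src, X_tgt, rcond=None)

# Apply the learned map to a new source-language sentence embedding.
new_src = rng.normal(size=(1, d))
mapped = new_src @ W
```

With more parallel pairs than dimensions and a noiseless linear relation, the recovered map coincides with the true one up to numerical precision; with real encoder outputs the fit is only approximate, which is why the paper compares mapping against joint and transfer-based alignment.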
