Language Embeddings for Typology and Cross-lingual Transfer Learning

2021-06-03 · ACL 2021 · Code Available

Dian Yu, Taiqi He, Kenji Sagae


Abstract

Cross-lingual language tasks typically require a substantial amount of annotated data or parallel translation data. We explore whether language representations that capture relationships among languages can be learned and subsequently leveraged in cross-lingual tasks without the use of parallel data. We generate dense embeddings for 29 languages using a denoising autoencoder, and evaluate the embeddings using the World Atlas of Language Structures (WALS) and two extrinsic tasks in a zero-shot setting: cross-lingual dependency parsing and cross-lingual natural language inference.
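The abstract describes learning dense embeddings for 29 languages with a denoising autoencoder, but gives no architectural details. As a rough illustration of the general technique (not the paper's actual model), the sketch below trains a tiny denoising autoencoder in NumPy: each language is represented by a hypothetical binary feature vector (a stand-in for whatever per-language input the model consumes), random features are masked out, and the network learns to reconstruct the clean vector; the bottleneck activations serve as dense language embeddings. All dimensions and the feature matrix are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: one sparse binary feature vector per language
# (stand-in for corpus-derived statistics; not from the paper).
n_langs, n_feats, emb_dim = 29, 100, 8
X = (rng.random((n_langs, n_feats)) < 0.3).astype(float)

# One-layer encoder/decoder with a tanh bottleneck.
W_enc = rng.normal(0, 0.1, (n_feats, emb_dim))
W_dec = rng.normal(0, 0.1, (emb_dim, n_feats))

def forward(x):
    z = np.tanh(x @ W_enc)   # dense language embedding
    x_hat = z @ W_dec        # reconstruction of the clean features
    return z, x_hat

lr, mask_p = 0.1, 0.2
losses = []
for step in range(500):
    # Denoising: randomly drop input features, reconstruct the clean X.
    mask = (rng.random(X.shape) > mask_p).astype(float)
    Xn = X * mask
    z, X_hat = forward(Xn)
    err = X_hat - X
    losses.append(float((err ** 2).mean()))
    # Manual backprop through decoder, tanh, and encoder.
    dW_dec = z.T @ err / n_langs
    dz = (err @ W_dec.T) * (1 - z ** 2)
    dW_enc = Xn.T @ dz / n_langs
    W_dec -= lr * dW_dec
    W_enc -= lr * dW_enc

# Final embeddings: one dense row per language, usable as a
# language representation for downstream cross-lingual transfer.
embeddings, _ = forward(X)
```

In this setup, languages with similar feature vectors end up with nearby embeddings, which is the property the paper probes intrinsically against WALS and extrinsically in zero-shot dependency parsing and NLI.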
