Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation

2019-11-01 · IJCNLP 2019

Yogarshi Vyas, Marine Carpuat

Abstract

Words in different languages rarely cover the exact same semantic space. This work characterizes differences in meaning between words across languages using semantic relations that have been used to relate the meaning of English words. However, because of translation ambiguity, semantic relations are not always preserved by translation. We introduce a cross-lingual relation classifier trained only with English examples and a bilingual dictionary. Our classifier relies on a novel attention-based distillation approach to account for translation ambiguity when transferring knowledge from English to cross-lingual settings. On new English-Chinese and English-Hindi test sets, the resulting models largely outperform baselines that more naively rely on bilingual embeddings or dictionaries for cross-lingual transfer, and approach the performance of fully supervised systems on English tasks.
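The abstract describes an attention-based distillation scheme: a teacher trained on English examples produces soft relation labels for each dictionary translation of a foreign word, and attention weights over those translations combine the teacher outputs into a single soft target for the cross-lingual student. The sketch below is one minimal way such a loss could look; the function name, shapes, and scoring are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_distillation_loss(teacher_logits, student_logits, attn_scores):
    """Cross-entropy between an attention-weighted mixture of teacher
    predictions and the student's cross-lingual prediction.

    teacher_logits: (T, C) teacher outputs, one row per candidate English
        translation of the foreign word (hypothetical shapes).
    student_logits: (C,) student's prediction for the cross-lingual pair.
    attn_scores: (T,) unnormalized attention scores over the translations,
        meant to down-weight spurious translations of an ambiguous word.
    """
    attn = softmax(attn_scores)                   # weights over translations
    teacher_probs = softmax(teacher_logits, -1)   # per-translation soft labels
    target = attn @ teacher_probs                 # (C,) mixed soft target
    log_student = student_logits - np.log(np.exp(student_logits).sum())
    return float(-(target * log_student).sum())
```

With uniform attention the target is just the average of the teacher distributions; a learned scorer would instead concentrate weight on translations that fit the context, which is how distillation can account for translation ambiguity.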
