Conception: Multilingually-Enhanced, Human-Readable Concept Vector Representations
Simone Conia, Roberto Navigli
Abstract
To date, the most successful word, word sense, and concept modelling techniques have used large corpora and knowledge resources to produce dense vector representations that capture semantic similarities in a relatively low-dimensional space. Most current approaches, however, suffer from a monolingual bias, with their strength depending on the amount of data available across languages. In this paper we address this issue and propose Conception, a novel technique for building language-independent vector representations of concepts which places multilinguality at its core while retaining explicit relationships between concepts. Our approach results in high-coverage representations that outperform the state of the art in multilingual and cross-lingual Semantic Word Similarity and Word Sense Disambiguation, proving particularly robust on low-resource languages. Conception (its software and the complete set of representations) is available at https://github.com/SapienzaNLP/conception.
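To illustrate how dense concept representations capture semantic similarity, the sketch below compares hypothetical concept vectors with cosine similarity, the standard measure used in Semantic Word Similarity evaluations. The vectors and concept names here are invented for illustration and are not taken from Conception itself.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical low-dimensional vectors for three concepts;
# real concept vectors would have hundreds of dimensions.
car = [0.8, 0.1, 0.3, 0.5]
automobile = [0.7, 0.2, 0.4, 0.5]
bird = [0.1, 0.9, 0.2, 0.1]

# Related concepts score higher than unrelated ones.
print(cosine_similarity(car, automobile))  # high (close to 1)
print(cosine_similarity(car, bird))        # lower
```

Because the representation is language-independent, the same comparison would apply unchanged whether the concepts were lexicalized in English, Italian, or any other language covered by the resource.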