Denoising Word Embeddings by Averaging in a Shared Space
2021-06-05 · Joint Conference on Lexical and Computational Semantics
Avi Caciularu, Ido Dagan, Jacob Goldberger
Abstract
We introduce a new approach for smoothing and improving the quality of word embeddings. We consider a method of fusing word embeddings that were trained on the same corpus but with different initializations. We project all the models to a shared vector space using an efficient implementation of the Generalized Procrustes Analysis (GPA) procedure, previously used in multilingual word translation. Our word representation demonstrates consistent improvements over the raw models as well as their simplistic average, on a range of tasks. As the new representations are more stable and reliable, there is a noticeable improvement in rare word evaluations.
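The abstract describes projecting several independently trained embedding models into a shared space with Generalized Procrustes Analysis (GPA) and then averaging them. A minimal sketch of that idea is below; it is not the authors' implementation, just the classic GPA recipe (iteratively align each model to the current mean via orthogonal Procrustes, then re-average), with assumed toy matrices standing in for real word embeddings:

```python
import numpy as np

def orthogonal_procrustes(X, Y):
    """Rotation W minimizing ||X W - Y||_F, via SVD of X^T Y."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def gpa_average(embeddings, n_iters=10):
    """Fuse several (vocab, dim) embedding matrices trained with
    different initializations: repeatedly rotate each one into the
    space of the current mean, then update the mean (GPA)."""
    mean = embeddings[0].copy()
    for _ in range(n_iters):
        aligned = [E @ orthogonal_procrustes(E, mean) for E in embeddings]
        mean = np.mean(aligned, axis=0)
    return mean

# Toy check (hypothetical data): two "models" that differ only by an
# orthogonal rotation should fuse into a representation both align to.
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 8))
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random rotation
fused = gpa_average([base, base @ Q], n_iters=5)
```

Averaging only after alignment is the point: because each training run lands in an arbitrarily rotated space, a naive element-wise average of unaligned models would mix incompatible coordinate systems.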