
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
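As a concrete illustration of the word-to-vector mapping described above, the sketch below trains a Word2Vec model on a toy corpus using gensim (assuming gensim 4.x). The corpus, vector size, and all other hyperparameter values are illustrative choices, not drawn from any paper listed on this page.

```python
# A minimal sketch of learning word embeddings with Word2Vec,
# assuming gensim 4.x is installed (pip install gensim).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens. A real corpus
# would be far larger (e.g., a Wikipedia dump).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram model: each vocabulary word is mapped to a
# 50-dimensional real-valued vector based on the contexts it appears in.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy example
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # many epochs since the corpus is tiny
)

# Look up the learned vector for a word and query nearest neighbours.
vector = model.wv["embeddings"]  # a 50-dimensional numpy array
print(vector.shape)              # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

The skip-gram setting (`sg=1`) trains each word's vector to predict the words around it; swapping to CBOW (`sg=0`) instead predicts a word from its averaged context, which is typically faster on large corpora.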

Papers

Showing 2741–2750 of 4002 papers

Title | Status | Hype
Learning Word Representations from Scarce and Noisy Data with Embedding Subspaces | | 0
Learning Word Representations with Regularization from Prior Knowledge | | 0
Learning Word Sense Embeddings from Word Sense Definitions | | 0
Learn Interpretable Word Embeddings Efficiently with von Mises-Fisher Distribution | | 0
Learnt Contrastive Concept Embeddings for Sign Recognition | | 0
Legal Document Classification: An Application to Law Area Prediction of Petitions to Public Prosecution Service | | 0
Legal-ES: A Set of Large Scale Resources for Spanish Legal Text Processing | | 0
Lego: Learning to Disentangle and Invert Personalized Concepts Beyond Object Appearance in Text-to-Image Diffusion Models | | 0
Lessons in Reproducibility: Insights from NLP Studies in Materials Science | | 0
Lessons Learned from Applying off-the-shelf BERT: There is no Silver Bullet | | 0
Page 275 of 401

No leaderboard results yet.