
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
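As a minimal sketch of how such embeddings can be trained, the snippet below fits a small skip-gram Word2Vec model with gensim (assuming gensim 4.x; the toy corpus and every hyperparameter value here are illustrative placeholders, not taken from this page):

```python
# Minimal sketch: skip-gram Word2Vec with gensim 4.x.
# The toy corpus and hyperparameters are illustrative placeholders.
from gensim.models import Word2Vec

# Each "sentence" is a pre-tokenized list of words; real corpora are far larger.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sleeps", "on", "the", "mat"],
    ["the", "dog", "sleeps", "on", "the", "rug"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued word vectors
    window=2,         # context window size on each side of the target word
    min_count=1,      # keep every word, since the toy corpus is tiny
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=100,       # many passes so the tiny corpus converges at all
    seed=42,
)

# Each vocabulary word is now mapped to a vector of real numbers.
vec = model.wv["king"]
print(vec.shape)                      # (50,)

# Cosine similarity in the learned space reflects distributional similarity.
print(model.wv.most_similar("king", topn=3))
```

Pretrained vectors such as GloVe can be queried the same way once loaded into a gensim KeyedVectors object, so downstream code does not depend on which training technique produced the embeddings.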

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2251–2260 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Question Embeddings Based on Shannon Entropy: Solving intent classification task in goal-oriented dialogue system | Code | 0 |
| On Measuring Social Biases in Sentence Encoders | Code | 0 |
| Expanding the Text Classification Toolbox with Cross-Lingual Embeddings |  | 0 |
| LINSPECTOR: Multilingual Probing Tasks for Word Representations | Code | 0 |
| Learning Entity Representations for Few-Shot Reconstruction of Wikipedia Categories |  | 0 |
| Personalized Neural Embeddings for Collaborative Filtering with Text |  | 0 |
| ETNLP: a visual-aided systematic approach to select pre-trained embeddings for a downstream task | Code | 0 |
| Lipstick on a Pig: Debiasing Methods Cover up Systematic Gender Biases in Word Embeddings But do not Remove Them | Code | 0 |
| Context-Aware Cross-Lingual Mapping | Code | 0 |
| Creation and Evaluation of Datasets for Distributional Semantics Tasks in the Digital Humanities Domain |  | 0 |

No leaderboard results yet.