
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network approaches that are trained on an NLP task such as language modeling or document classification.
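To make the mapping concrete, the sketch below trains a skip-gram Word2Vec model on a three-sentence toy corpus and reads back one word's vector. It assumes gensim 4.x; the corpus and hyperparameter values are illustrative placeholders, not settings taken from any of the papers listed here.

```python
# A minimal sketch of learning word embeddings with Word2Vec,
# using the gensim library (assumes gensim 4.x; toy corpus and
# parameters are illustrative only).
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Each vocabulary word is now mapped to a dense vector of real numbers.
vec = model.wv["embeddings"]   # a 50-dimensional NumPy array
print(vec.shape)               # (50,)

# Words used in similar contexts end up with nearby vectors.
print(model.wv.most_similar("embeddings", topn=3))
```

On a corpus this small the neighbors are essentially noise; the point is only the shape of the workflow: tokenized text in, one real-valued vector per vocabulary word out.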

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 71–80 of 4,002 papers

Title | Status | Hype
"This is my unicorn, Fluffy": Personalizing frozen vision-language representations | Code | 1
VGSE: Visually-Grounded Semantic Embeddings for Zero-Shot Learning | Code | 1
Improving Word Translation via Two-Stage Contrastive Learning | Code | 1
Imputing Out-of-Vocabulary Embeddings with LOVE Makes Language Models Robust with Little Cost | Code | 1
Semi-constraint Optimal Transport for Entity Alignment with Dangling Cases | Code | 1
Sentiment Word Aware Multimodal Refinement for Multimodal Sentiment Analysis with ASR Errors | Code | 1
Word Embeddings for Automatic Equalization in Audio Mixing | Code | 1
HuSpaCy: an industrial-strength Hungarian natural language processing toolkit | Code | 1
Simple, Interpretable and Stable Method for Detecting Words with Usage Change across Corpora | Code | 1
WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models | Code | 1
Page 8 of 401

Leaderboards

No leaderboard results yet.