
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
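For illustration, here is a minimal sketch of the idea (the 3-dimensional vectors and the cosine_similarity helper below are made up for illustration, not taken from any trained model): an embedding is simply a lookup table from words to dense real-valued vectors, and geometric measures such as cosine similarity then act as a proxy for semantic similarity.

    import numpy as np

    # A toy embedding table: each word maps to a dense real-valued vector.
    # These 3-d vectors are invented for illustration; real embeddings are
    # typically 50-1000 dimensions and learned from large corpora.
    embeddings = {
        "king":  np.array([0.8, 0.3, 0.1]),
        "queen": np.array([0.7, 0.4, 0.2]),
        "apple": np.array([0.1, 0.9, 0.6]),
    }

    def cosine_similarity(u, v):
        """Cosine of the angle between two vectors, in [-1, 1]."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Semantically related words should end up closer in the vector space.
    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.98, high
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.44, lower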

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, which factorises word co-occurrence statistics, and neural networks trained on an NLP task such as language modeling or document classification.
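As a concrete example, the sketch below trains a small Word2Vec model with the gensim library (assuming gensim 4.x; the toy corpus and parameter values are purely illustrative, not a recommended configuration):

    from gensim.models import Word2Vec

    # Toy corpus: a list of tokenised sentences (illustrative only; real
    # models are trained on corpora with millions of sentences).
    sentences = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "context"],
        ["similar", "words", "get", "similar", "vectors"],
    ]

    # vector_size is the embedding dimension; window is the context size.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=20)

    # Look up the learned vector for a word and query nearest neighbours.
    vec = model.wv["embeddings"]                 # a 50-dimensional numpy array
    print(model.wv.most_similar("words", topn=3))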

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 231-240 of 4,002 papers

Title | Status | Hype
UiO-UvA at SemEval-2020 Task 1: Contextualised Embeddings for Lexical Semantic Change Detection | Code | 1
UmlsBERT: Clinical Domain Knowledge Augmentation of Contextual Embeddings Using the Unified Medical Language System Metathesaurus | Code | 1
Combining Static Word Embeddings and Contextual Representations for Bilingual Lexicon Induction | Code | 1
Combining Self-Training and Self-Supervised Learning for Unsupervised Disfluency Detection | Code | 1
Compass-aligned Distributional Embeddings for Studying Semantic Differences across Corpora | Code | 1
Universal Sentence Encoder | Code | 1
Unmasking Contextual Stereotypes: Measuring and Mitigating BERT’s Gender Bias | Code | 1
Comparative Evaluation of Pretrained Transfer Learning Models on Automatic Short Answer Grading | Code | 1
FAME: Feature-Based Adversarial Meta-Embeddings for Robust Input Representations | Code | 1
Multilingual Music Genre Embeddings for Effective Cross-Lingual Music Item Annotation | Code | 1
Page 24 of 401

Leaderboard

No leaderboard results yet.