
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
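As a minimal sketch of that mapping (the vocabulary, dimensionality, and random initialization below are illustrative assumptions, not taken from any published model), an embedding is just a row lookup in a real-valued matrix:

```python
# Minimal sketch: each vocabulary word indexes a row of a real-valued matrix.
# The vocabulary, dimensionality, and random values are illustrative only.
import numpy as np

vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 4  # toy dimensionality; trained models typically use 50-300+
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))  # one row per word

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional real-valued vector
```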

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train embeddings on an NLP task such as language modeling or document classification.
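For example, a small skip-gram Word2Vec training run with the gensim library might look like the following (a hedged sketch assuming gensim 4.x is installed; the toy corpus and hyperparameters are illustrative):

```python
# Sketch of training skip-gram Word2Vec with gensim 4.x.
# The corpus and hyperparameters are toy values for illustration.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks"],
    ["a", "woman", "walks"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep all words in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["king"]             # the learned embedding vector
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```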

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1251-1260 of 4002 (entries with code available are marked [Code]):

- Word associations and the distance properties of context-aware word embeddings
- Understanding the Source of Semantic Regularities in Word Embeddings
- When is a bishop not like a rook? When it's like a rabbi! Multi-prototype BERT embeddings for estimating semantic relationships [Code]
- Neutralizing Gender Bias in Word Embeddings with Latent Disentanglement and Counterfactual Generation
- ESTeR: Combining Word Co-occurrences and Word Associations for Unsupervised Emotion Detection [Code]
- Revisiting Representation Degeneration Problem in Language Modeling
- Task-oriented Domain-specific Meta-Embedding for Text Classification
- Alignment-free Cross-lingual Semantic Role Labeling
- Span-based discontinuous constituency parsing: a family of exact chart-based algorithms with time complexities from O(n^6) down to O(n^3)
- From Zero to Hero: On the Limitations of Zero-Shot Language Transfer with Multilingual Transformers

No leaderboard results yet.