
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch follows below.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
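To make the definition concrete, the following is a minimal sketch that trains skip-gram Word2Vec embeddings on a toy corpus using the gensim library (gensim >= 4.0 assumed; the sentences and all hyperparameter values are invented for illustration, not taken from any paper listed below):

# A minimal sketch: skip-gram Word2Vec on a toy corpus with gensim
# (>= 4.0 assumed). The corpus is invented purely for illustration;
# real training uses a large tokenized text collection.
import numpy as np
from gensim.models import Word2Vec

# Tokenized toy corpus: each sentence is a list of word tokens.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "dog"],
    ["the", "dog", "chased", "the", "cat"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimensionality, i.e. each word maps to a vector of real numbers.
model = Word2Vec(
    corpus,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=200,
    seed=42,
)

# Every vocabulary word is now a dense real-valued vector.
cat = model.wv["cat"]
dog = model.wv["dog"]
print(cat.shape)  # (50,)

# Cosine similarity measures closeness in the embedding space.
def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(cat, dog))                      # words used in similar contexts score higher
print(model.wv.most_similar("cat", topn=2))  # nearest neighbours by cosine similarity

On a toy corpus this size the vectors are noisy; the point is only the workflow: tokenized text in, one real-valued vector per vocabulary word out, with similarity queries over the learned space.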

Papers

Showing 2161–2170 of 4002 papers

Title | Status | Hype
An Empirical Study on Post-processing Methods for Word Embeddings | - | 0
Sherlock: A Deep Learning Approach to Semantic Data Type Detection | Code | 0
Self-supervised audio representation learning for mobile devices | - | 0
Debiasing Word Embeddings Improves Multimodal Machine Translation | - | 0
Subspace Detours: Building Transport Plans that are Optimal on Subspace Projections | Code | 0
Fair is Better than Sensational: Man is to Doctor as Woman is to Doctor | Code | 0
Misspelling Oblivious Word Embeddings | Code | 0
GWU NLP Lab at SemEval-2019 Task 3: EmoContext: Effective Contextual Information in Models for Emotion Detection in Sentence-level in a Multigenre Corpus | - | 0
Action Assembly: Sparse Imitation Learning for Text Based Games with Combinatorial Action Spaces | - | 0
Augmenting Data with Mixup for Sentence Classification: An Empirical Study | Code | 0
Page 217 of 401

No leaderboard results yet.