
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification; a minimal training sketch is given below.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
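To make the word-to-vector mapping concrete, here is a minimal sketch of training Word2Vec embeddings. The use of the gensim library, the toy corpus, and all hyperparameter values are illustrative assumptions, not something taken from this page.

```python
# Minimal sketch of learning word embeddings with Word2Vec.
# Assumes the gensim library (an assumption for illustration;
# the page itself names only the Word2Vec technique).
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (hypothetical data).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a skip-gram model (sg=1); every vocabulary word is mapped
# to a 100-dimensional vector of real numbers.
model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the embedding space
    window=5,         # context window size
    min_count=1,      # keep all words in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
    seed=42,
)

# Look up the learned vector for a word...
vec = model.wv["embeddings"]
print(vec.shape)  # (100,)

# ...and query its nearest neighbours in the embedding space.
print(model.wv.most_similar("embeddings", topn=3))
```

Setting sg=1 selects the skip-gram objective; sg=0 would instead use continuous bag-of-words (CBOW). In practice the corpus would be far larger, and a min_count above 1 would filter out rare words.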

Papers

Showing 1241-1250 of 4002 papers (page 125 of 401)

Title | Status | Hype
Boosting Named Entity Recognition with Neural Character Embeddings | | 0
Equalizing Gender Bias in Neural Machine Translation with Word Embeddings Techniques | | 0
Acoustic Word Embedding System for Code-Switching Query-by-example Spoken Term Detection | | 0
Dyr Bul Shchyl. Proxying Sound Symbolism With Word Embeddings | | 0
Bootstrap Domain-Specific Sentiment Classifiers from Unlabeled Corpora | | 0
Early Discovery of Disappearing Entities in Microblogs | | 0
Earth Mover's Distance Minimization for Unsupervised Bilingual Lexicon Induction | | 0
Easy-First Dependency Parsing with Hierarchical Tree LSTMs | | 0
Bootstrapping NLU Models with Multi-task Learning | | 0
A Framework for Understanding the Role of Morphology in Universal Dependency Parsing | | 0

No leaderboard results yet.