
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train the embeddings as part of an NLP task such as language modeling or document classification.
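
A minimal sketch of learning and querying word vectors on a toy corpus, assuming the gensim library (4.x API); the corpus and hyperparameters are illustrative only, not taken from any of the papers listed below.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["dogs", "and", "cats", "are", "pets"],
]

# Train a skip-gram Word2Vec model; each word is mapped to a
# 50-dimensional vector of real numbers.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the embedding space
    window=2,         # context window size
    min_count=1,      # keep all words in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Look up the vector for a word and its nearest neighbors.
cat_vector = model.wv["cat"]                 # numpy array, shape (50,)
print(model.wv.most_similar("cat", topn=3))  # semantically closest words
```

With a realistically sized corpus, nearby vectors tend to correspond to semantically or syntactically related words, which is what makes these embeddings useful as features for downstream tasks.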

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3401–3410 of 4002 papers

Title | Status | Hype
Don't Settle for Average, Go for the Max: Fuzzy Sets and Max-Pooled Word Vectors | Code | 0
Tracing Antisemitic Language Through Diachronic Embedding Projections: France 1789-1914 | Code | 0
Beyond Word2Vec: Embedding Words and Phrases in Same Vector Space | Code | 0
Collocation Classification with Unsupervised Relation Vectors | Code | 0
Nonparametric Spherical Topic Modeling with Word Embeddings | Code | 0
Do We Really Need All Those Rich Linguistic Features? A Neural Network-Based Approach to Implicit Sense Labeling | Code | 0
Self-Governing Neural Networks for On-Device Short Text Classification | Code | 0
Do Word Embeddings Capture Spelling Variation? | Code | 0
GNTeam at 2018 n2c2: Feature-augmented BiLSTM-CRF for drug-related entity recognition in hospital discharge summaries | Code | 0
Beyond Weight Tying: Learning Joint Input-Output Embeddings for Neural Machine Translation | Code | 0

No leaderboard results yet.