
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
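
As a rough illustration of the Word2Vec approach mentioned above, the sketch below trains skip-gram embeddings on a toy corpus. It assumes the Gensim library (version 4.x) is available; the corpus, the vector_size=50 setting, and the similarity query are illustrative choices, not details taken from any paper listed on this page.

```python
# Minimal sketch (assumed setup: Gensim 4.x) of how skip-gram Word2Vec
# maps words in a vocabulary to vectors of real numbers.
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "word", "vectors"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# Train skip-gram embeddings (sg=1); vector_size and window are illustrative.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window around each target word
    min_count=1,      # keep every word in this tiny vocabulary
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
    seed=0,
)

# Each vocabulary word is now a point in R^50.
print(model.wv["vectors"].shape)           # (50,)
print(model.wv.most_similar("word", topn=3))
```

On a realistic corpus the same call scales to millions of tokens, and the resulting model.wv lookup table is exactly the word-to-vector mapping described above.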

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2191-2200 of 4002 papers

Title | Status | Hype
Distribution is not enough: going Firther | | 0
Nested Variational Autoencoder for Topic Modeling on Microtexts with Word Vectors | Code | 0
Wasserstein Barycenter Model Ensembling | | 0
Zero-training Sentence Embedding via Orthogonal Basis | Code | 0
RelWalk -- A Latent Variable Model Approach to Knowledge Graph Embedding | | 0
The Effectiveness of Pre-Trained Code Embeddings | | 0
Unsupervised Hyper-alignment for Multilingual Word Embeddings | | 0
Encoding Category Trees Into Word-Embeddings Using Geometric Approach | Code | 0
Learning Mixed-Curvature Representations in Product Spaces | | 0
Poincare Glove: Hyperbolic Word Embeddings | Code | 0
Page 220 of 401

No leaderboard results yet.