
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
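To make the mapping concrete, here is a minimal sketch in plain NumPy. The vocabulary and vector values are made up for illustration; they are not taken from any trained model:

```python
import numpy as np

# Toy embedding table: each word in the vocabulary maps to a
# low-dimensional vector of real numbers (values are illustrative).
embeddings = {
    "king":  np.array([0.80, 0.45, 0.10]),
    "queen": np.array([0.78, 0.50, 0.12]),
    "apple": np.array([0.05, 0.90, 0.70]),
}

def cosine_similarity(u, v):
    """Standard similarity measure between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.998, high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.49, lower
```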

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches, which are trained on an NLP task such as language modeling or document classification.
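As an illustration, training a small Word2Vec model might look like the following sketch using the gensim library (assuming the gensim 4.x API; the corpus here is a toy stand-in and the hyperparameters are arbitrary):

```python
from gensim.models import Word2Vec

# Toy corpus: a real model would train on millions of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
]

# vector_size is the embedding dimensionality (gensim >= 4.0 naming);
# window is the context size; min_count=1 keeps every word in this tiny corpus.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=50, seed=0)

vec = model.wv["embeddings"]          # the learned vector for one word
print(vec.shape)                      # (50,)
print(model.wv.most_similar("word"))  # nearest neighbors in embedding space
```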

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2371–2380 of 4002 papers

Title | Status | Hype
Subword Encoding in Lattice LSTM for Chinese Word Segmentation | Code | 0
Magnitude: A Fast, Efficient Universal Vector Embedding Utility Package | Code | 0
Learning Emotion from 100 Observations: Unexpected Robustness of Deep Learning under Strong Data Limitations | - | 0
Local Homology of Word Embeddings | - | 0
Word Sense Induction using Knowledge Embeddings | - | 0
Time-Aware and Corpus-Specific Entity Relatedness | - | 0
Exponential Family Word Embeddings: An Iterative Approach for Learning Word Vectors | - | 0
Learned in Speech Recognition: Contextual Acoustic Word Embeddings | - | 0
Interpreting Word Embeddings with Eigenvector Analysis | Code | 0
BioSentVec: creating sentence embeddings for biomedical texts | Code | 0
Page 238 of 401
