
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
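
As a minimal sketch of this idea, an embedding is a lookup table from word indices into a dense real-valued matrix. The toy vocabulary, dimension, and random vectors below are illustrative placeholders; in practice the vectors are learned from data:

```python
import numpy as np

# Toy vocabulary and embedding dimension (hypothetical values for illustration).
vocab = {"king": 0, "queen": 1, "apple": 2}
dim = 4

# In a trained model these vectors are learned; here they are random placeholders.
rng = np.random.default_rng(seed=0)
embeddings = rng.standard_normal((len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector by table lookup."""
    return embeddings[vocab[word]]

print(embed("king"))  # a length-4 vector of real numbers
```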

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
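
As a hedged sketch of one such technique, assuming the gensim library is installed, a skip-gram Word2Vec model can be trained on tokenized sentences as follows. The corpus and hyperparameters are illustrative placeholders, not taken from this page:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences (placeholder data).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# After training, each word maps to a real-valued vector...
vec = model.wv["king"]
print(vec.shape)  # (50,)

# ...and nearest neighbours in the vector space reflect distributional similarity.
print(model.wv.most_similar("king", topn=2))
```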

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2941–2950 of 4002 papers

Title | Status | Hype
Intelligent Word Embeddings of Free-Text Radiology Reports | Code | 0
Acquiring Common Sense Spatial Knowledge through Implicit Spatial Templates | Code | 0
Unsupervised Morphological Expansion of Small Datasets for Improving Word Embeddings | — | 0
An Unsupervised Approach for Mapping between Vector Spaces | — | 0
Attention Focusing for Neural Machine Translation by Bridging Source and Target Embeddings | — | 0
Convolutional Neural Network with Word Embeddings for Chinese Word Segmentation | Code | 0
Bayesian Paragraph Vectors | — | 0
Breaking the Softmax Bottleneck: A High-Rank RNN Language Model | Code | 0
Learning Multi-Modal Word Representation Grounded in Visual Context | — | 0
The Lifted Matrix-Space Model for Semantic Composition | Code | 0
Page 295 of 401

No leaderboard results yet.