
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
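As an illustration of this word-to-vector mapping, the sketch below looks up a few words in a toy embedding table and compares them with cosine similarity. The three-dimensional vectors are made-up values for demonstration only, not the output of any trained model.

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These 3-d values are illustrative; real embeddings are typically
# 50-300 dimensions and learned from large corpora.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should have similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```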

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train embeddings on an NLP task such as language modeling or document classification.
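For instance, a small Word2Vec model can be trained in a few lines with the gensim library. The tokenized corpus and hyperparameters below are illustrative assumptions (gensim 4.x parameter names), a minimal sketch rather than a recommended configuration.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
# A real model would be trained on millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Train a skip-gram Word2Vec model (gensim 4.x API).
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned embeddings
    window=2,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

# Each vocabulary word is now mapped to a 50-dimensional real vector.
print(model.wv["king"].shape)         # (50,)
print(model.wv.most_similar("king"))  # nearest words by cosine similarity
```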

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2091–2100 of 4002

Residual Stacking of RNNs for Neural Machine Translation
Resolving Out-of-Vocabulary Words with Bilingual Embeddings in Machine Translation
Resources to Examine the Quality of Word Embedding Models Trained on n-Gram Data
Rethinking Topic Modelling: From Document-Space to Term-Space
Retrieving Multi-Entity Associations: An Evaluation of Combination Modes for Word Embeddings
Retrofitting Contextualized Word Embeddings with Paraphrases
Retrofitting of Pre-trained Emotion Words with VAD-dimensions and the Plutchik Emotions
Retrofitting Word Representations for Unsupervised Sense Aware Word Similarities
RETRO: Relation Retrofitting For In-Database Machine Learning on Textual Data
RETUYT in TASS 2017: Sentiment Analysis for Spanish Tweets using SVM and CNN
