
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
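To make the definition above concrete, here is a minimal sketch of a count-based embedding: each word is mapped to a vector of real numbers, namely its row in a word-by-word co-occurrence matrix. This is a toy illustration in pure Python (not how Word2Vec or GloVe are actually trained); the function and corpus below are invented for the example.

```python
from collections import defaultdict
from math import sqrt

def cooccurrence_embeddings(sentences, window=2):
    """Map each word to a vector of real numbers: its row in a
    word-by-word co-occurrence count matrix (a simple count-based
    alternative to neural approaches)."""
    vocab = sorted({w for s in sentences for w in s})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = defaultdict(lambda: [0.0] * len(vocab))
    for s in sentences:
        for i, w in enumerate(s):
            # Count every word within `window` positions of w.
            for j in range(max(0, i - window), min(len(s), i + window + 1)):
                if j != i:
                    vectors[w][index[s[j]]] += 1.0
    return dict(vectors)

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]
emb = cooccurrence_embeddings(corpus)
# Words used in similar contexts get similar vectors:
print(cosine(emb["cat"], emb["dog"]) > cosine(emb["cat"], emb["on"]))  # → True
```

Neural methods such as Word2Vec learn much lower-dimensional dense vectors with the same goal: words that appear in similar contexts end up close together in the vector space.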

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2076-2100 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Rehabilitation of Count-based Models for Word Vector Representations | | 0 |
| Reinforcing the Topic of Embeddings with Theta Pure Dependence for Text Classification | | 0 |
| Relating Word Embedding Gender Biases to Gender Gaps: A Cross-Cultural Analysis | | 0 |
| Relation Extraction Datasets in the Digital Humanities Domain and their Evaluation with Word Embeddings | | 0 |
| Relation Extraction: Perspective from Convolutional Neural Networks | | 0 |
| Relation Induction in Word Embeddings Revisited | | 0 |
| RelWalk -- A Latent Variable Model Approach to Knowledge Graph Embedding | | 0 |
| RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding | | 0 |
| Répliquer et étendre pour l'alsacien « Étiquetage en parties du discours de langues peu dotées par spécialisation des plongements lexicaux » (Replicating and extending for Alsatian: "POS tagging for low-resource languages by adapting word embeddings") | | 0 |
| Representation Learning for Unseen Words by Bridging Subwords to Semantic Networks | | 0 |
| Representations of Time Expressions for Temporal Relation Extraction with Convolutional Neural Networks | | 0 |
| Representing Affect Information in Word Embeddings | | 0 |
| Representing Support Verbs in FrameNet | | 0 |
| Reranking Translation Candidates Produced by Several Bilingual Word Similarity Sources | | 0 |
| Research on Multilingual News Clustering Based on Cross-Language Word Embeddings | | 0 |
| Residual Stacking of RNNs for Neural Machine Translation | | 0 |
| Resolving Out-of-Vocabulary Words with Bilingual Embeddings in Machine Translation | | 0 |
| Resources to Examine the Quality of Word Embedding Models Trained on n-Gram Data | | 0 |
| Rethinking Topic Modelling: From Document-Space to Term-Space | | 0 |
| Retrieving Multi-Entity Associations: An Evaluation of Combination Modes for Word Embeddings | | 0 |
| Retrofitting Contextualized Word Embeddings with Paraphrases | | 0 |
| Retrofitting of Pre-trained Emotion Words with VAD-dimensions and the Plutchik Emotions | | 0 |
| Retrofitting Word Representations for Unsupervised Sense Aware Word Similarities | | 0 |
| RETRO: Relation Retrofitting For In-Database Machine Learning on Textual Data | | 0 |
| RETUYT in TASS 2017: Sentiment Analysis for Spanish Tweets using SVM and CNN | | 0 |
Page 84 of 161

No leaderboard results yet.