
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
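As a minimal sketch of how such embeddings are trained and queried, the snippet below runs gensim's Word2Vec on a toy corpus. The corpus and all hyperparameter values are illustrative assumptions, not taken from this page, and the code assumes gensim 4.x.

```python
# Minimal Word2Vec sketch (assumes gensim 4.x; the toy corpus and
# hyperparameters below are illustrative, not from this page).
from gensim.models import Word2Vec

# Each "sentence" is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["vectors", "capture", "semantic", "similarity"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # many epochs to compensate for the tiny corpus
)

vec = model.wv["words"]                        # a 50-dim numpy array
print(model.wv.most_similar("words", topn=3))  # nearest neighbors by cosine similarity
```

On a realistic corpus the same two calls apply: index `model.wv` by a word to get its vector, and use `most_similar` to find its nearest neighbors in the embedding space.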

Papers

Showing 3321–3330 of 4002 papers

Title | Status | Hype
NILC-USP at SemEval-2017 Task 4: A Multi-view Ensemble for Twitter Sentiment Analysis | Code | 0
The Interplay of Semantics and Morphology in Word Embeddings | Code | 0
Linear Ensembles of Word Embedding Models | Code | 0
Fortia-FBK at SemEval-2017 Task 5: Bullish or Bearish? Inferring Sentiment towards Brands from Financial News Headlines | | 0
Word-Alignment-Based Segment-Level Machine Translation Evaluation using Word Embeddings | | 0
Word Sense Filtering Improves Embedding-Based Lexical Substitution | | 0
Clustering of Russian Adjective-Noun Constructions using Word Embeddings | | 0
An RNN-based Binary Classifier for the Story Cloze Test | | 0
Improving Verb Metaphor Detection by Propagating Abstractness to Words, Phrases and Individual Senses | | 0
Using bilingual word-embeddings for multilingual collocation extraction | | 0

Leaderboard

No leaderboard results yet.