
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
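
To make the definition concrete, the following is a minimal sketch (not taken from any paper listed here) using made-up 3-dimensional vectors; real embeddings typically have 50 to 300 dimensions, and cosine similarity is a common way to compare them.

```python
import numpy as np

# Toy 3-dimensional embeddings with made-up values, for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.7]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.98)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low  (~0.43)
```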

Techniques for learning word embeddings include neural network-based predictive models such as Word2Vec, count-based models such as GloVe, which factorizes a global word co-occurrence matrix, and approaches that learn embeddings while training on an NLP task such as language modeling or document classification.
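
As an illustration of one such technique, the sketch below trains skip-gram Word2Vec embeddings with the gensim library; the toy corpus and hyperparameter values are assumptions chosen for demonstration, not settings taken from any of the papers below.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["skip", "gram", "predicts", "context", "words", "from", "a", "target", "word"],
    ["glove", "factorizes", "a", "global", "co-occurrence", "matrix"],
]

# sg=1 selects the skip-gram objective (sg=0 would select CBOW).
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,
    epochs=50,
)

vec = model.wv["embeddings"]          # the learned vector for a word
print(model.wv.most_similar("word"))  # nearest neighbors by cosine similarity
```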

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2721–2730 of 4002 papers

Title | Status | Hype
NEUROSENT-PDI at SemEval-2018 Task 7: Discovering Textual Relations With a Neural Network Model | | 0
NEUROSENT-PDI at SemEval-2018 Task 3: Understanding Irony in Social Networks Through a Multi-Domain Sentiment Model | | 0
TakeLab at SemEval-2018 Task 12: Argument Reasoning Comprehension with Skip-Thought Vectors | | 0
ADAPT at SemEval-2018 Task 9: Skip-Gram Word Embeddings for Unsupervised Hypernym Discovery in Specialised Corpora | | 0
Random Decision Syntax Trees at SemEval-2018 Task 3: LSTMs and Sentiment Scores for Irony Detection | | 0
Towards Qualitative Word Embeddings Evaluation: Measuring Neighbors Variation | | 0
attr2vec: Jointly Learning Word and Contextual Attribute Embeddings with Factorization Machines | | 0
Evaluating bilingual word embeddings on the long tail | Code | 0
Tübingen-Oslo at SemEval-2018 Task 2: SVMs perform better than RNNs in Emoji Prediction | | 0
Solving Data Sparsity for Aspect Based Sentiment Analysis Using Cross-Linguality and Multi-Linguality | | 0
Page 273 of 401

No leaderboard results yet.