
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.
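To make the idea concrete, the following is a minimal, illustrative sketch of skip-gram training with negative sampling (the objective behind one Word2Vec variant), written in plain NumPy. The toy corpus, window size, embedding dimension, and learning rate are all assumptions chosen for readability, not values from any particular paper; a real implementation (e.g. gensim's Word2Vec) adds subsampling, a unigram noise distribution, and far larger corpora.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative values, not from the source page).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (target word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context word) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3  # learning rate, context window, negatives per pair
for epoch in range(200):
    for pos, word in enumerate(corpus):
        t = word2id[word]
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            c = word2id[corpus[ctx_pos]]
            # Positive pair: pull target and observed context together.
            v_t, v_c = W_in[t].copy(), W_out[c].copy()
            g = sigmoid(v_t @ v_c) - 1.0
            W_in[t] -= lr * g * v_c
            W_out[c] -= lr * g * v_t
            # Negative samples: push target away from random words
            # (simplified: uniform sampling, no collision check).
            for n in rng.integers(0, V, size=k):
                v_t, v_n = W_in[t].copy(), W_out[n].copy()
                g = sigmoid(v_t @ v_n)
                W_in[t] -= lr * g * v_n
                W_out[n] -= lr * g * v_t

# Each row of W_in is now a D-dimensional vector for one vocabulary word.
print(W_in.shape)  # prints (8, 8)
```

After training, similarity between words can be read off as the cosine of the angle between their rows in `W_in`; GloVe arrives at comparable vectors from a different route, factorizing global co-occurrence statistics rather than sliding a local window.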

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2131–2140 of 4002 papers

Title | Status | Hype
DS at SemEval-2019 Task 9: From Suggestion Mining with neural networks to adversarial cross-domain classification | - | 0
SWAP at SemEval-2019 Task 3: Emotion detection in conversations through Tweets, CNN and LSTM deep neural networks | Code | 0
GL at SemEval-2019 Task 5: Identifying hateful tweets with a deep learning approach. | - | 0
Pre-trained Contextualized Character Embeddings Lead to Major Improvements in Time Normalization: a Detailed Analysis | - | 0
SINAI-DL at SemEval-2019 Task 5: Recurrent networks and data augmentation by paraphrasing | - | 0
Combining Discourse Markers and Cross-lingual Embeddings for Synonym--Antonym Classification | - | 0
UPV-28-UNITO at SemEval-2019 Task 7: Exploiting Post's Nesting and Syntax Information for Rumor Stance Classification | - | 0
USF at SemEval-2019 Task 6: Offensive Language Detection Using LSTM With Word Embeddings | - | 0
Subword-based Compact Reconstruction of Word Embeddings | Code | 0
STUFIIT at SemEval-2019 Task 5: Multilingual Hate Speech Detection on Twitter with MUSE and ELMo Embeddings | - | 0
Page 214 of 401

No leaderboard results yet.