
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
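To make the definition concrete, here is a toy sketch in Python (the vocabulary, vector values, and 3-dimensional size are invented for illustration; real embeddings are learned from data and typically have 50–300 dimensions):

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# The values here are invented; real embeddings are learned from a corpus.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59]),
    "queen": np.array([0.54, 0.71, -0.55]),
    "apple": np.array([-0.82, 0.05, 0.31]),
}

def cosine(u, v):
    """Cosine similarity; semantically related words should score higher."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # close to 1.0
print(cosine(embeddings["king"], embeddings["apple"]))  # much lower
```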

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
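As a minimal sketch of the Word2Vec route, assuming the gensim library (4.x API) is installed; the tiny corpus of tokenized sentences below is a stand-in, since real models are trained on large text collections:

```python
from gensim.models import Word2Vec

# Stand-in corpus: a list of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# vector_size: embedding dimensionality; window: context size;
# min_count=1 keeps every word in this tiny toy corpus.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=20)

vec = model.wv["embeddings"]          # 50-dimensional numpy vector
print(model.wv.most_similar("word"))  # nearest neighbours by cosine similarity
```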

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2691–2700 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Meaning_space at SemEval-2018 Task 10: Combining explicitly encoded knowledge with information extracted from word embeddings | | 0 |
| ELiRF-UPV at SemEval-2018 Task 11: Machine Comprehension using Commonsense Knowledge | | 0 |
| EICA Team at SemEval-2018 Task 2: Semantic and Metadata-based Features for Multilingual Emoji Prediction | | 0 |
| UNBNLP at SemEval-2018 Task 10: Evaluating unsupervised approaches to capturing discriminative attributes | | 0 |
| UMD at SemEval-2018 Task 10: Can Word Embeddings Capture Discriminative Attributes? | | 0 |
| How Gender and Skin Tone Modifiers Affect Emoji Semantics in Twitter | Code | 0 |
| CSReader at SemEval-2018 Task 11: Multiple Choice Question Answering as Textual Entailment | | 0 |
| 300-sparsans at SemEval-2018 Task 9: Hypernymy as interaction of sparse attributes | | 0 |
| UMDuluth-CS8761 at SemEval-2018 Task 9: Hypernym Discovery using Hearst Patterns, Co-occurrence frequencies and Word Embeddings | | 0 |
| BLCU_NLP at SemEval-2018 Task 12: An Ensemble Model for Argument Reasoning Based on Hierarchical Attention | | 0 |
Page 270 of 401

No leaderboard results yet.