
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
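As a minimal illustration of this mapping, the sketch below stores a few vectors in a lookup table and compares two of them with cosine similarity. The words, dimensionality, and vector values are illustrative placeholders, not taken from any trained model:

```python
import numpy as np

# Toy embedding lookup table: each word maps to a real-valued vector.
# Values are random placeholders, not from a trained model.
rng = np.random.default_rng(0)
vocab = ["king", "queen", "apple"]
dim = 4  # illustrative; real embeddings typically use 50-300+ dimensions
embeddings = {w: rng.normal(size=dim) for w in vocab}

def cosine(u, v):
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim = cosine(embeddings["king"], embeddings["queen"])
```

With trained embeddings, this similarity score is what makes semantically related words (e.g. "king" and "queen") measurably closer than unrelated ones.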

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
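For concreteness, here is a toy sketch of the skip-gram objective that underlies Word2Vec: predict each context word from its center word, and treat the learned input weights as the embeddings. The corpus, dimensionality, and plain-softmax training loop are simplifications for illustration; production implementations use negative sampling or hierarchical softmax for efficiency:

```python
import numpy as np

# Tiny hypothetical corpus and vocabulary.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimensionality

# (center, context) index pairs within a window of 1.
pairs = [(w2i[corpus[i]], w2i[corpus[j]])
         for i in range(len(corpus))
         for j in (i - 1, i + 1) if 0 <= j < len(corpus)]

rng = np.random.default_rng(1)
W_in = rng.normal(scale=0.1, size=(V, D))   # input vectors = the word embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output (context-prediction) weights
lr = 0.1

for _ in range(200):
    for c, o in pairs:
        h = W_in[c]                       # hidden layer = center word's embedding
        scores = h @ W_out
        p = np.exp(scores - scores.max())
        p /= p.sum()                      # softmax over the vocabulary
        grad = p.copy()
        grad[o] -= 1.0                    # gradient of cross-entropy w.r.t. scores
        dh = W_out @ grad                 # gradient w.r.t. the embedding
        W_out -= lr * np.outer(h, grad)
        W_in[c] -= lr * dh
```

After training, rows of `W_in` serve as the embedding vectors; GloVe reaches a similar representation by factorizing global co-occurrence statistics instead of sliding a prediction window.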

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2711-2720 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| UniMelb at SemEval-2018 Task 12: Generative Implication using LSTMs, Siamese Networks and Semantic Representations with Synonym Fuzzing | | 0 |
| TAJJEB at SemEval-2018 Task 2: Traditional Approaches Just Do the Job with Emoji Prediction | | 0 |
| THU_NGN at SemEval-2018 Task 10: Capturing Discriminative Attributes with MLP-CNN model | | 0 |
| ABDN at SemEval-2018 Task 10: Recognising Discriminative Attributes using Context Embeddings and WordNet | | 0 |
| LIS at SemEval-2018 Task 2: Mixing Word Embeddings and Bag of Features for Multilingual Emoji Prediction | | 0 |
| LightRel at SemEval-2018 Task 7: Lightweight and Fast Relation Classification | | 0 |
| GHH at SemEval-2018 Task 10: Discovering Discriminative Attributes in Distributional Semantics | | 0 |
| TeamDL at SemEval-2018 Task 8: Cybersecurity Text Analysis using Convolutional Neural Network and Conditional Random Fields | | 0 |
| Talla at SemEval-2018 Task 7: Hybrid Loss Optimization for Relation Classification using Convolutional Neural Networks | | 0 |
| Tweety at SemEval-2018 Task 2: Predicting Emojis using Hierarchical Attention Neural Networks and Support Vector Machine | | 0 |
Page 272 of 401

No leaderboard results yet.