
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
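
As an illustration of the idea, the snippet below is a minimal sketch of training skip-gram Word2Vec embeddings with the gensim library; the toy corpus, hyperparameter values, and the gensim >= 4.0 API style are assumptions for demonstration, not taken from this page:

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["man", "walks", "in", "the", "city"],
    ["woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

vec = model.wv["king"]                # a 50-dimensional real-valued vector
print(vec.shape)                      # (50,)
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

Each word is mapped to a single dense vector, and semantically related words end up close together in the vector space, which is what downstream tasks exploit.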

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2521–2530 of 4002 papers

Title | Status | Hype
Tweester at SemEval-2016 Task 4: Sentiment Analysis in Twitter Using Semantic-Affective Model Adaptation | | 0
Tweester at SemEval-2017 Task 4: Fusion of Semantic-Affective and pairwise classification models for sentiment analysis in Twitter | | 0
Tweets Sentiment Analysis via Word Embeddings and Machine Learning Techniques | | 0
Tweety at SemEval-2018 Task 2: Predicting Emojis using Hierarchical Attention Neural Networks and Support Vector Machine | | 0
TwiSe at SemEval-2017 Task 4: Five-point Twitter Sentiment Classification and Quantification | | 0
Twitter Bot Detection Using Bidirectional Long Short-term Memory Neural Networks and Word Embeddings | | 0
Two Stages Approach for Tweet Engagement Prediction | | 0
UAlberta at SemEval-2020 Task 2: Using Translations to Predict Cross-Lingual Entailment | | 0
UDPipe 2.0 Prototype at CoNLL 2018 UD Shared Task | | 0
UINSUSKA-TiTech at SemEval-2017 Task 3: Exploiting Word Importance Levels for Similarity Features for CQA | | 0

No leaderboard results yet.