
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
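The mapping described above can be sketched with a toy example. The vocabulary, 4-dimensional vectors, and their values below are purely illustrative assumptions (real systems such as Word2Vec or GloVe learn 100- to 300-dimensional vectors from large corpora); the point is that semantically related words end up with higher cosine similarity than unrelated ones.

```python
import numpy as np

# Hypothetical embeddings for a toy vocabulary (values are illustrative only).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words should be closer in the vector space than unrelated ones.
sim_related = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])
assert sim_related > sim_unrelated
```

In a learned embedding space this geometric closeness is what downstream tasks (classification, similarity estimation, emotion detection) exploit.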

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2111-2120 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Modeling Personal Biases in Language Use by Inducing Personalized Word Embeddings | | 0 |
| Incorporating Emoji Descriptions Improves Tweet Classification | | 0 |
| Text Similarity Estimation Based on Word Embeddings and Matrix Norms for Targeted Marketing | | 0 |
| Know-Center at SemEval-2019 Task 5: Multilingual Hate Speech Detection on Twitter using CNNs | | 0 |
| EmoDet at SemEval-2019 Task 3: Emotion Detection in Text using Deep Learning | | 0 |
| EMOMINER at SemEval-2019 Task 3: A Stacked BiLSTM Architecture for Contextual Emotion Detection in Text | | 0 |
| Learning Bilingual Sentiment-Specific Word Embeddings without Cross-lingual Supervision | | 0 |
| EmoSense at SemEval-2019 Task 3: Bidirectional LSTM Network for Contextual Emotion Detection in Textual Conversations | Code | 0 |
| JU_ETCE_17_21 at SemEval-2019 Task 6: Efficient Machine Learning and Neural Network Approaches for Identifying and Categorizing Offensive Language in Tweets | Code | 0 |
| CLaC Lab at SemEval-2019 Task 3: Contextual Emotion Detection Using a Combination of Neural Networks and SVM | | 0 |
Page 212 of 401

No leaderboard results yet.