
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
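As a concrete illustration of the training step described above, here is a minimal sketch using the gensim library's Word2Vec implementation (assuming gensim >= 4.0 is available; the toy corpus and all hyperparameter values are illustrative choices, not taken from any paper listed below):

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# Assumes gensim >= 4.0; the corpus and hyperparameters are toy values.
from gensim.models import Word2Vec

# Tiny pre-tokenized corpus; real training uses far more text.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "receive", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of each word vector
    window=2,        # context window on either side of the target word
    min_count=1,     # keep every word, since the corpus is tiny
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=200,      # many passes to compensate for the tiny corpus
    seed=42,
    workers=1,       # single worker for reproducibility
)

vec = model.wv["embeddings"]            # the learned 50-dimensional vector
print(vec.shape)                        # -> (50,)
print(model.wv.most_similar("words"))   # nearest neighbours in vector space
```

On a realistic corpus (millions of sentences rather than three), the learned vectors place semantically related words close together, which is what the most_similar query exploits.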

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1301–1310 of 4002 papers

Title | Status | Hype
Detecting weak and strong Islamophobic hate speech on social media | - | 0
Embodying Pre-Trained Word Embeddings Through Robot Actions | - | 0
Embracing Non-Traditional Linguistic Resources for Low-resource Language Name Tagging | - | 0
Emerging Cross-lingual Structure in Pretrained Language Models | - | 0
EmoDet at SemEval-2019 Task 3: Emotion Detection in Text using Deep Learning | - | 0
Building Semantic Grams of Human Knowledge | - | 0
EMOMINER at SemEval-2019 Task 3: A Stacked BiLSTM Architecture for Contextual Emotion Detection in Text | - | 0
EmoNLP at IEST 2018: An Ensemble of Deep Learning Models and Gradient Boosting Regression Tree for Implicit Emotion Prediction in Tweets | - | 0
Applying Multi-Sense Embeddings for German Verbs to Determine Semantic Relatedness and to Detect Non-Literal Language | - | 0
Better Word Representations with Recursive Neural Networks for Morphology | - | 0
Page 131 of 401

No leaderboard results yet.