
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
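
As a minimal illustration of the idea, the sketch below maps a few words to real-valued vectors and compares them with cosine similarity. It uses plain NumPy, and the toy 4-dimensional vectors are invented for illustration; real embeddings are learned from data.

    import numpy as np

    # Toy embedding table: each word maps to a 4-dimensional real vector.
    # The values here are made up; learned embeddings are typically 50-300 dimensions.
    embeddings = {
        "king":  np.array([0.80, 0.65, 0.10, 0.05]),
        "queen": np.array([0.78, 0.70, 0.12, 0.06]),
        "apple": np.array([0.05, 0.10, 0.90, 0.70]),
    }

    def cosine(u, v):
        # Cosine similarity: close to 1.0 for similar directions, near 0 for unrelated ones.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(embeddings["king"], embeddings["queen"]))  # high: semantically similar words
    print(cosine(embeddings["king"], embeddings["apple"]))  # low: dissimilar words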

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an auxiliary NLP task such as language modeling or document classification.
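
For example, a Word2Vec model can be trained in a few lines. The sketch below uses the gensim library (an assumption; it is not mentioned on this page) on a tiny toy corpus:

    from gensim.models import Word2Vec

    # Tiny toy corpus: in practice Word2Vec is trained on millions of sentences.
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["i", "ate", "a", "fresh", "apple"],
    ]

    # sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

    vector = model.wv["king"]             # 50-dimensional embedding for "king"
    print(model.wv.most_similar("king"))  # nearest neighbours in embedding space

GloVe differs in that it factorizes a global word co-occurrence matrix rather than sliding a predictive window over the corpus, but the result is the same kind of dense word-to-vector lookup table.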

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1981-1990 of 4002 papers

Title | Status | Hype
Give It a Shot: Few-shot Learning to Normalize ADR Mentions in Social Media Posts | - | 0
Composing Noun Phrase Vector Representations | - | 0
Exploring the Use of Lexicons to aid Deep Learning towards the Detection of Abusive Language | - | 0
Measuring Gender Bias in Word Embeddings across Domains and Discovering New Gender Bias Word Categories | Code | 0
Relating Word Embedding Gender Biases to Gender Gaps: A Cross-Cultural Analysis | - | 0
Equalizing Gender Bias in Neural Machine Translation with Word Embeddings Techniques | - | 0
Learning Word Embeddings without Context Vectors | - | 0
A Platform Agnostic Dual-Strand Hate Speech Detector | - | 0
Effective Dimensionality Reduction for Word Embeddings | Code | 0
NLP@UNED at SMM4H 2019: Neural Networks Applied to Automatic Classifications of Adverse Effects Mentions in Tweets | - | 0

Leaderboards

No leaderboard results yet.