Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
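To make the mapping concrete, here is a minimal sketch of an embedding lookup in Python; the vocabulary, dimension, and random vectors are hypothetical stand-ins for learned embeddings:

```python
import numpy as np

# Hypothetical 4-word vocabulary; in practice this comes from a corpus.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 4  # embedding dimension (real systems typically use 50-300+)

# Embedding matrix: one real-valued row vector per word.
# Random here for illustration; a trained model learns these values.
rng = np.random.default_rng(0)
E = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by indexing the embedding matrix."""
    return E[vocab[word]]

print(embed("queen"))  # a vector of 4 real numbers
```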

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
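For instance, a minimal sketch of training Word2Vec embeddings with the gensim library (the toy corpus and hyperparameter values below are arbitrary choices for illustration, not a recommended setup):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

# Skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["words"]  # numpy array of shape (50,)
print(model.wv.most_similar("words", topn=3))  # nearest neighbors by cosine similarity
```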

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3831–3840 of 4002 (each listed with code available):

Generative Question Refinement with Deep Reinforcement Learning in Retrieval-based QA System
Measuring Gender Bias in Word Embeddings across Domains and Discovering New Gender Bias Word Categories
Measuring Gender Bias in Word Embeddings of Gendered Languages Requires Disentangling Grammatical Gender Signals
Measuring Intersectional Biases in Historical Documents
Social Emotion Mining Techniques for Facebook Posts Reaction Prediction
Measuring Semantic Similarity of Words Using Concept Networks
Geological Inference from Textual Data using Word Embeddings
Uncovering divergent linguistic information in word embeddings with lessons for intrinsic and extrinsic evaluation
Geometry of Compositionality
Measuring Social Biases in Grounded Vision and Language Embeddings
