
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
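Concretely, an embedding is a lookup table from words to dense real-valued vectors, and semantic relatedness is commonly measured as the cosine similarity between those vectors. The sketch below uses made-up 4-dimensional vectors purely for illustration; real embeddings are learned from data and typically have tens to hundreds of dimensions.

```python
import numpy as np

# Toy embedding table: each word maps to a dense real-valued vector.
# These 4-dimensional vectors are invented for illustration only.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.10]),
    "queen": np.array([0.54, 0.70, -0.55, 0.60]),
    "apple": np.array([-0.30, 0.10, 0.90, 0.05]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors, in [-1, 1]."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.91, high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~-0.61, low
```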

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
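As a minimal training sketch, the snippet below fits a skip-gram Word2Vec model with the gensim library (the 4.x API is assumed; the toy corpus and hyperparameter values are illustrative, not recommendations):

```python
from gensim.models import Word2Vec  # pip install gensim (4.x assumed)

# A tiny toy corpus of pre-tokenized sentences; in practice the
# training data would be millions of sentences of raw text.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size on each side of a word
    min_count=1,      # keep every word in this toy corpus
    sg=1,
)

vec = model.wv["embeddings"]                       # the learned 50-d vector
print(model.wv.most_similar("embeddings", topn=3)) # nearest neighbors
```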

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2341–2350 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Gender and Racial Stereotype Detection in Legal Opinion Word Embeddings |  | 0 |
| Gender bias Evaluation in Luganda-English Machine Translation |  | 0 |
| Gender Bias Hidden Behind Chinese Word Embeddings: The Case of Chinese Adjectives |  | 0 |
| Gender Bias in Word Embeddings: A Comprehensive Analysis of Frequency, Syntax, and Semantics |  | 0 |
| Gender Prediction for Chinese Social Media Data |  | 0 |
| Gender Roles from Word Embeddings in a Century of Children’s Books |  | 0 |
| Generating Adequate Distractors for Multiple-Choice Questions |  | 0 |
| Generating Varied Training Corpora in Runyankore Using a Combined Semantic and Syntactic, Pattern-Grammar-based Approach |  | 0 |
| Generic and Specialized Word Embeddings for Multi-Domain Machine Translation |  | 0 |
| Generic Embedding-Based Lexicons for Transparent and Reproducible Text Scoring |  | 0 |

Leaderboard

No leaderboard results yet.