
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
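As a concrete illustration, here is a minimal sketch of training Word2Vec with the gensim library; the toy corpus, vector dimensionality, and query words are illustrative assumptions, not drawn from any paper listed below.

    # Minimal Word2Vec sketch (assumes gensim is installed: pip install gensim).
    from gensim.models import Word2Vec

    # Toy corpus: a list of tokenized sentences (illustrative only).
    corpus = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "raw", "text"],
        ["similar", "words", "get", "similar", "vectors"],
    ]

    # Train a skip-gram model (sg=1); vector_size sets the dimensionality
    # of the real-valued vector each vocabulary word is mapped to.
    model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1)

    # Each word in the vocabulary is now a dense vector of real numbers.
    vec = model.wv["embeddings"]   # numpy array of shape (50,)
    print(vec[:5])

    # Nearest neighbours in the embedding space by cosine similarity.
    # On this toy corpus the neighbours are essentially noise; with a
    # real corpus they reflect distributional similarity between words.
    print(model.wv.most_similar("words", topn=3))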

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 691–700 of 4,002 papers

Title | Status | Hype
Gender Bias in Word Embeddings: A Comprehensive Analysis of Frequency, Syntax, and Semantics | - | 0
Comparing Performance of Different Linguistically-Backed Word Embeddings for Cyberbullying Detection | - | 0
Measuring Gender Bias in Word Embeddings of Gendered Languages Requires Disentangling Grammatical Gender Signals | Code | 0
Tracking Changes in ESG Representation: Initial Investigations in UK Annual Reports | - | 0
Automating Idea Unit Segmentation and Alignment for Assessing Reading Comprehension via Summary Protocol Analysis | - | 0
HECTOR: A Hybrid TExt SimplifiCation TOol for Raw Texts in French | - | 0
Sentence Selection Strategies for Distilling Word Embeddings from BERT | - | 0
Metaphor Detection for Low Resource Languages: From Zero-Shot to Few-Shot Learning in Middle High German | Code | 0
Dialects Identification of Armenian Language | - | 0
Query Obfuscation by Semantic Decomposition | - | 0
Page 70 of 401

No leaderboard results yet.