
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an auxiliary NLP task such as language modeling or document classification.
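As a minimal sketch of the idea, the snippet below trains skip-gram Word2Vec embeddings with the gensim library (assumed installed, e.g. via pip install gensim); the toy corpus and hyperparameters are illustrative only, not taken from any paper listed on this page.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
from gensim.models import Word2Vec

# Each document is a list of tokens; a real corpus would be far larger.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["dogs", "and", "cats", "are", "animals"],
]

# Train skip-gram embeddings (sg=1); vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=200)

# Each vocabulary word now maps to a dense vector of real numbers.
vec = model.wv["king"]          # numpy array of shape (50,)
print(vec.shape)

# Cosine similarity in the learned space reflects distributional similarity.
print(model.wv.similarity("king", "queen"))
```

With more data, nearby vectors in this space tend to correspond to semantically or syntactically related words, which is what makes the embeddings useful as features for downstream NLP tasks.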

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 521–530 of 4,002 papers

| Title | Status | Hype |
| --- | --- | --- |
| ChatGPT-EDSS: Empathetic Dialogue Speech Synthesis Trained from ChatGPT-derived Context Word Embeddings | | 0 |
| Beyond Shared Vocabulary: Increasing Representational Word Similarities across Languages for Multilingual Machine Translation | Code | 0 |
| Probing Brain Context-Sensitivity with Masked-Attention Generation | | 0 |
| Detecting and Mitigating Indirect Stereotypes in Word Embeddings | | 0 |
| Measuring Intersectional Biases in Historical Documents | Code | 0 |
| Controllable Speaking Styles Using a Large Language Model | | 0 |
| A quantitative study of NLP approaches to question difficulty estimation | Code | 0 |
| CWTM: Leveraging Contextualized Word Embeddings from BERT for Neural Topic Modeling | Code | 0 |
| Distilling Semantic Concept Embeddings from Contrastively Fine-Tuned Language Models | Code | 0 |
| Frequency-aware Dimension Selection for Static Word Embedding by Mixed Product Distance | | 0 |
