
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
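For intuition, here is a minimal sketch of what such a mapping looks like: a lookup table from words to dense real-valued vectors, compared with cosine similarity. The words and vector values below are hypothetical toy examples, not learned embeddings.

```python
import numpy as np

# Toy lookup table: each word maps to a dense real-valued vector.
# These vectors are illustrative placeholders, not learned from data.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    # Cosine similarity: higher values mean more similar vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```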

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
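As an illustration, the sketch below trains a small Word2Vec model. It assumes the gensim library and a toy three-sentence corpus; the hyperparameters are placeholders, not recommended settings.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

# Train a skip-gram Word2Vec model (sg=1); min_count=1 keeps every
# word, which is only sensible for a corpus this tiny.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

print(model.wv["king"].shape)              # each word maps to a 50-dim vector
print(model.wv.most_similar("king", topn=2))  # nearest neighbors in vector space
```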

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 971–980 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Assessing the Reliability of Word Embedding Gender Bias Measures | Code | 0 |
| Rare Tokens Degenerate All Tokens: Improving Neural Text Generation via Adaptive Gradient Gating for Rare Token Embeddings | - | 0 |
| ArGoT: A Glossary of Terms extracted from the arXiv | - | 0 |
| Self-Supervised Detection of Contextual Synonyms in a Multi-Class Setting: Phenotype Annotation Use Case | - | 0 |
| An Exploratory Study on Utilising the Web of Linked Data for Product Data Mining | - | 0 |
| An Empirical Study on Leveraging Position Embeddings for Target-oriented Opinion Words Extraction | Code | 0 |
| Automatic Transformation of Clinical Narratives into Structured Format | - | 0 |
| The Impact of Word Embeddings on Neural Dependency Parsing | - | 0 |
| Word Discriminations for Vocabulary Inventory Prediction | Code | 0 |
| Position Masking for Improved Layout-Aware Document Understanding | - | 0 |

No leaderboard results yet.