Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
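As a minimal sketch of the idea, the toy vectors below (made up purely for illustration, not taken from any trained model) show how geometric closeness between word vectors can stand in for semantic similarity:

```python
# Each word is a point in R^d; here d = 4 with hand-picked toy values.
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.8]),
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: 1.0 = identical direction.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```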

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
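For instance, a small Word2Vec model can be trained in a few lines. The sketch below uses the gensim library and a toy corpus; both are illustrative assumptions, not something prescribed by the papers listed here:

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings"],
    ["language", "modeling", "is", "an", "nlp", "task"],
]

# vector_size: dimensionality of the embedding space;
# window: context size; sg=1 selects the skip-gram objective;
# min_count=1 keeps rare words, which a toy corpus needs.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

# Each vocabulary word is now a 50-dimensional real-valued vector.
vector = model.wv["embeddings"]
print(vector.shape)                    # (50,)
print(model.wv.most_similar("word"))   # nearest neighbours by cosine similarity
```

On a real corpus the defaults (larger min_count, window of 5, 100+ dimensions) are more typical; the tiny values here just keep the example self-contained.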

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 801–810 of 4002 papers

Title | Status | Hype
Contrastive Word Embedding Learning for Neural Machine Translation | | 0
Gender Roles from Word Embeddings in a Century of Children’s Books | | 0
Task-adaptive Pre-training of Language Models with Word Embedding Regularization | | 0
Revisiting Tri-training of Dependency Parsers | Code | 0
Comparing Feature-Engineering and Feature-Learning Approaches for Multilingual Translationese Classification | | 0
Evaluating Biomedical BERT Models for Vocabulary Alignment at Scale in the UMLS Metathesaurus | | 0
InceptionXML: A Lightweight Framework with Synchronized Negative Sampling for Short Text Extreme Classification | Code | 0
Assessing the Reliability of Word Embedding Gender Bias Measures | Code | 0
Cross-lingual Transfer for Text Classification with Dictionary-based Heterogeneous Graph | Code | 1
ArGoT: A Glossary of Terms extracted from the arXiv | | 0
Page 81 of 401

No leaderboard results yet.