
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
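As a concrete illustration of the word-to-vector mapping described above, here is a minimal sketch using the gensim library (one common Word2Vec implementation; the choice of library, the toy corpus, and all parameter values are assumptions for illustration, not part of the papers listed below):

```python
# Minimal sketch: learning word embeddings with Word2Vec (assumes gensim 4.x).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a 50-dimensional real-valued vector.
vec = model.wv["embeddings"]  # numpy array of shape (50,)

# Words that appear in similar contexts end up close together in vector space.
print(model.wv.most_similar("words", topn=3))
```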

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 231–240 of 4,002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages | Code | 1 |
| Towards Debiasing Sentence Representations | Code | 1 |
| Tracing Origins: Coreference-aware Machine Reading Comprehension | Code | 1 |
| Embed2Detect: Temporally Clustered Embedded Words for Event Detection in Social Media | Code | 1 |
| TU Wien @ TREC Deep Learning '19 -- Simple Contextualization for Re-ranking | Code | 1 |
| Two-Level Transformer and Auxiliary Coherence Modeling for Improved Text Segmentation | Code | 1 |
| UmlsBERT: Clinical Domain Knowledge Augmentation of Contextual Embeddings Using the Unified Medical Language System Metathesaurus | Code | 1 |
| Understanding and Improving Encoder Layer Fusion in Sequence-to-Sequence Learning | Code | 1 |
| FAME: Feature-Based Adversarial Meta-Embeddings for Robust Input Representations | Code | 1 |
| Learning principled bilingual mappings of word embeddings while preserving monolingual invariance | Code | 1 |
Page 24 of 401

No leaderboard results yet.