
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
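As a minimal sketch of what "mapped to vectors of real numbers" means in practice, the Python snippet below builds a toy embedding table and compares words by cosine similarity. Everything here is illustrative: the vocabulary, the 50-dimensional size, and the vectors themselves, which are random rather than learned.

```python
import numpy as np

# Toy vocabulary; in a real model this would be thousands of words.
vocab = ["king", "queen", "apple"]
word_to_id = {w: i for i, w in enumerate(vocab)}

# One 50-dimensional real-valued vector per word. A trained model
# would learn these; here they are just random placeholders.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 50))

def vector(word: str) -> np.ndarray:
    """Look up the real-valued vector for a word."""
    return embeddings[word_to_id[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Word similarity is measured as an angle in the vector space."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vector("king"), vector("queen")))
```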

Techniques for learning word embeddings include Word2Vec and GloVe, as well as neural network approaches that train the embeddings on an NLP task such as language modeling or document classification.
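As a hedged example of one such technique, the sketch below trains a skip-gram Word2Vec model with gensim, assuming gensim >= 4.0 is installed. The corpus and hyperparameters are illustrative only; real training uses far larger corpora.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["an", "apple", "fell", "from", "the", "tree"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

print(model.wv["king"])               # the learned 50-d vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space
```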

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 761–770 of 4,002 papers

Title | Status | Hype
Inter-Sense: An Investigation of Sensory Blending in Fiction | - | 0
Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings | - | 0
WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models | - | 0
Cooperative Semi-Supervised Transfer Learning of Machine Reading Comprehension | - | 0
Subword-based Cross-lingual Transfer of Embeddings from Hindi to Marathi | - | 0
Tracing Origins: Coreference-aware Machine Reading Comprehension | Code | 1
Large Scale Substitution-based Word Sense Induction | - | 0
Evaluating Off-the-Shelf Machine Listening and Natural Language Models for Automated Audio Captioning | - | 0
BI-RADS BERT & Using Section Segmentation to Understand Radiology Reports | Code | 0
Regionalized models for Spanish language variations based on Twitter | - | 0
Page 77 of 401

Leaderboard

No leaderboard results yet.