
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
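To make the definition concrete, here is a minimal sketch of an embedding as a lookup table that maps each word to a dense real-valued vector. The toy vocabulary, dimensionality, and the embed/cosine helpers are hypothetical illustrations, and the vectors are random placeholders rather than learned values.

```python
# Minimal sketch: an embedding is a lookup table mapping each
# vocabulary word to a dense vector of real numbers.
import numpy as np

vocab = ["king", "queen", "man", "woman"]            # toy vocabulary (hypothetical)
word_to_index = {w: i for i, w in enumerate(vocab)}
embedding_dim = 8                                    # small dimension for illustration

rng = np.random.default_rng(0)
# In practice these vectors are *learned*; random placeholders stand in here.
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via table lookup."""
    return embeddings[word_to_index[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare embedded words."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))
```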

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
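As an illustration of one such technique, below is a sketch of training a skip-gram Word2Vec model with the gensim library, assuming gensim 4.x is installed; the tiny corpus and hyperparameter values are illustrative choices, not recommendations from this page.

```python
# Sketch of learning word embeddings with gensim's Word2Vec
# (assumes gensim >= 4.0; corpus and hyperparameters are toy values).
from gensim.models import Word2Vec

sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "factorizes", "a", "global", "cooccurrence", "matrix"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this tiny toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]                       # learned vector for a word
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors by cosine
```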

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2871-2880 of 4002 (page 288 of 401)

A Comparison of Domain-based Word Polarity Estimation using different Word Embeddings
A comparison of self-supervised speech representations as input features for unsupervised acoustic word embeddings
A Comparison of Word2Vec, HMM2Vec, and PCA2Vec for Malware Classification
A Comparison of Word Embeddings for English and Cross-Lingual Chinese Word Sense Disambiguation
A Comprehensive Survey on Word Representation Models: From Classical to State-Of-The-Art Word Representation Language Models
A Computational Approach to Measuring the Semantic Divergence of Cognates
A Correspondence Variational Autoencoder for Unsupervised Acoustic Word Embeddings
Acoustically Grounded Word Embeddings for Improved Acoustics-to-Word Speech Recognition
Acoustic Word Embeddings for Untranscribed Target Languages with Continued Pretraining and Learned Pooling
Acoustic Word Embedding System for Code-Switching Query-by-example Spoken Term Detection
