
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
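As a minimal sketch of the idea, the snippet below trains a small Word2Vec model with gensim (assuming gensim 4.x is installed; the toy corpus and all token choices here are invented for illustration, not taken from any paper on this page):

# Minimal sketch: learning word embeddings with Word2Vec via gensim 4.x.
# The toy corpus is illustrative only; real pipelines would use a large,
# properly tokenized text collection.
from gensim.models import Word2Vec

# Each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=100,
)

vec = model.wv["embeddings"]            # a 50-dimensional real-valued vector
print(model.wv.most_similar("words"))   # nearest neighbours in embedding space

After training, each vocabulary word is mapped to a real-valued vector, and semantically related words tend to lie close together in the embedding space.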

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1111-1120 of 4002 papers

Title | Status | Hype
Intrinsic Image Captioning Evaluation | - | 0
A comparison of self-supervised speech representations as input features for unsupervised acoustic word embeddings | - | 0
Discriminative Pre-training for Low Resource Title Compression in Conversational Grocery | - | 0
Improving Zero Shot Learning Baselines with Commonsense Knowledge | - | 0
TF-CR: Weighting Embeddings for Text Classification | Code | 0
Cross-lingual Word Sense Disambiguation using mBERT Embeddings with Syntactic Dependencies | - | 0
A Correspondence Variational Autoencoder for Unsupervised Acoustic Word Embeddings | - | 0
SChME at SemEval-2020 Task 1: A Model Ensemble for Detecting Lexical Semantic Change | Code | 0
On Extending NLP Techniques from the Categorical to the Latent Space: KL Divergence, Zipf's Law, and Similarity Search | Code | 0
A Computational Approach to Measuring the Semantic Divergence of Cognates | - | 0
Page 112 of 401

No leaderboard results yet.