
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
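
At its core, an embedding is a lookup table of real-valued vectors indexed by word. The minimal sketch below illustrates that data structure and the usual cosine-similarity comparison between vectors; the toy vocabulary, dimensionality, and random (untrained) vectors are illustrative assumptions, not a real model.

```python
import numpy as np

# A word embedding is a mapping from each vocabulary word to a dense
# vector of real numbers. Vocabulary and dimensionality are toy values.
vocab = ["king", "queen", "man", "woman"]
dim = 8  # real models typically use 100-1000 dimensions

# Randomly initialized vectors stand in for trained embeddings here.
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the standard way to compare embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```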

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches trained on an NLP task such as language modeling or document classification.
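
As a concrete example of one such technique, the sketch below trains a skip-gram Word2Vec model with gensim. It assumes gensim 4.x is installed; the corpus and hyperparameters are illustrative toy values, not recommended settings.

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (illustrative data only).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "factorizes", "a", "cooccurrence", "matrix"],
]

# Skip-gram Word2Vec; parameter values are toy choices for this tiny corpus.
model = Word2Vec(
    sentences,
    vector_size=50,   # embedding dimensionality
    window=3,         # context window size
    min_count=1,      # keep every token, since the corpus is tiny
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]          # dense vector for one word
print(model.wv.most_similar("word"))  # nearest neighbours in vector space
```

After training, words that appear in similar contexts end up with nearby vectors, which is the property downstream NLP tasks exploit.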

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2001-2010 of 4002 (page 201 of 401)

Discretely Coding Semantic Rank Orders for Supervised Image Hashing
Discrete Wavelet Transform for Efficient Word Embeddings and Sentence Encoding
Discriminative Acoustic Word Embeddings: Recurrent Neural Network-Based Approaches
Discriminative Pre-training for Low Resource Title Compression in Conversational Grocery
Disentangling continuous and discrete linguistic signals in transformer-based sentence embeddings
Dissecting Contextual Word Embeddings: Architecture and Representation
Distance Metric Learning for Aspect Phrase Grouping
Distant Supervision and Noisy Label Learning for Low Resource Named Entity Recognition: A Study on Hausa and Yorùbá
Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition
Distilled embedding: non-linear embedding factorization using knowledge distillation
