
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
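To make the mapping concrete, below is a minimal sketch of the skip-gram-with-negative-sampling objective behind Word2Vec, written in plain NumPy on a toy corpus. The corpus, hyperparameters, and helper names (`W_in`, `W_out`, `most_similar`) are illustrative assumptions, not taken from any particular paper; a real system would use an optimized library such as gensim.

```python
import numpy as np

# Toy corpus; a real corpus would have millions of tokens.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "the cat chased the dog".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
word2id = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16              # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(0, 0.1, (V, D))  # "input" embeddings (the vectors we keep)
W_out = rng.normal(0, 0.1, (V, D)) # "output" (context) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3         # learning rate, context window, negatives
for epoch in range(200):
    for sent in corpus:
        ids = [word2id[w] for w in sent]
        for pos, center in enumerate(ids):
            lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
            for ctx in ids[lo:pos] + ids[pos + 1:hi]:
                # One positive pair (center, ctx) plus k random negatives;
                # a sketch-level shortcut: negatives may collide with ctx.
                targets = [ctx] + list(rng.integers(0, V, k))
                labels = [1.0] + [0.0] * k
                for t, y in zip(targets, labels):
                    score = sigmoid(W_in[center] @ W_out[t])
                    grad = score - y           # dLoss/dscore, logistic loss
                    g_in = grad * W_out[t]     # save before updating W_out
                    W_out[t] -= lr * grad * W_in[center]
                    W_in[center] -= lr * g_in

# Nearest neighbours by cosine similarity in the learned space.
def most_similar(word, topn=3):
    v = W_in[word2id[word]]
    sims = (W_in @ v) / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(v) + 1e-9)
    order = np.argsort(-sims)
    return [(vocab[i], float(sims[i])) for i in order if vocab[i] != word][:topn]

print(most_similar("cat"))
```

Even on this toy corpus, words that occur in similar contexts (for example "cat" and "dog") tend to drift toward nearby points in the vector space, which is the property the papers listed below build on.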

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2811–2820 of 4002 papers

Title | Status | Hype
Looking Into the Black Box - How Are Idioms Processed in BERT? | | 0
L'optimisation du plongement de mots pour le français : une application de la classification des phrases (Optimization of Word Embeddings for French: an Application of Sentence Classification) | | 0
Loss Decomposition for Fast Learning in Large Output Spaces | | 0
Lost in Context? On the Sense-wise Variance of Contextualized Word Embeddings | | 0
Low-resource bilingual lexicon extraction using graph based word embeddings | | 0
Low-resource keyword spotting using contrastively trained transformer acoustic word embeddings | | 0
Zero-shot and Few-shot Learning with Knowledge Graphs: A Comprehensive Survey | | 0
Low-Resource Machine Transliteration Using Recurrent Neural Networks of Asian Languages | | 0
LSTM CCG Parsing | | 0
LSTM Easy-first Dependency Parsing with Pre-trained Word Embeddings and Character-level Word Embeddings in Vietnamese | | 0
Page 282 of 401

No leaderboard results yet.