
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
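To make the definition concrete, below is a minimal sketch of the idea: each vocabulary word is assigned a dense real-valued vector, and word similarity reduces to vector arithmetic. The toy vocabulary, dimensionality, and random initialization are illustrative assumptions only; in practice the vectors are learned from data.

import numpy as np

# Toy illustration: map each vocabulary word to a dense real-valued vector.
# The vocabulary, dimension, and random init are placeholders; in practice
# these vectors are learned (e.g. by Word2Vec or GloVe).
rng = np.random.default_rng(0)
vocab = ["king", "queen", "apple", "banana"]
dim = 8
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine(u, v):
    # Cosine similarity: the usual way to compare embedding vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))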

Common techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.
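As a hedged example of one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library; the tiny corpus and hyperparameter values are placeholders, not recommended settings.

from gensim.models import Word2Vec

# Minimal Word2Vec sketch (assumes the gensim library is installed).
# Real models are trained on large corpora with much larger vector_size.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "and", "bananas", "are", "fruit"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned word vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # skip-gram objective (sg=0 would use CBOW)
    epochs=50,
)

vector = model.wv["king"]             # dense vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space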

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 61–70 of 4,002 papers

Title | Status | Hype
Disentangling Visual Embeddings for Attributes and Objects | Code | 1
Recovering Private Text in Federated Learning of Language Models | Code | 1
IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations | Code | 1
Word Tour: One-dimensional Word Embeddings via the Traveling Salesman Problem | Code | 1
Hyperbolic Relevance Matching for Neural Keyphrase Extraction | Code | 1
Learning Bias-reduced Word Embeddings Using Dictionary Definitions | Code | 1
Imputing Out-of-Vocabulary Embeddings with LOVE Makes Language Models Robust with Little Cost | Code | 1
Emotion-Aware Transformer Encoder for Empathetic Dialogue Generation | Code | 1
Is Neural Topic Modelling Better than Clustering? An Empirical Study on Clustering with Contextual Embeddings for Topics | Code | 1
Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages | Code | 1
Page 7 of 401

No leaderboard results yet.