
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
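As a concrete illustration of the description above, the sketch below trains a small skip-gram Word2Vec model and reads back a word's learned vector. The choice of the gensim library, the toy corpus, and all parameter values are assumptions made for illustration; the page itself does not prescribe any particular toolkit.

# Minimal sketch: learning word embeddings with gensim's Word2Vec
# (skip-gram). The corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["skip", "gram", "predicts", "context", "words"],
    ["glove", "uses", "global", "cooccurrence", "statistics"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["words"]                         # a 50-dim real-valued vector
print(vec.shape)                                # (50,)
print(model.wv.most_similar("words", topn=3))   # nearest neighbors by cosine similarity

On a realistic corpus the nearest neighbors of a word become semantically related words, which is the property the papers listed below build on.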

Papers

Showing 181-190 of 4,002 papers

Title | Status | Hype
Imputing Out-of-Vocabulary Embeddings with LOVE Makes Language Models Robust with Little Cost | Code | 1
iNLTK: Natural Language Toolkit for Indic Languages | Code | 1
In Other News: A Bi-style Text-to-speech Model for Synthesizing Newscaster Voice with Limited Data | Code | 1
IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations | Code | 1
Is Neural Topic Modelling Better than Clustering? An Empirical Study on Clustering with Contextual Embeddings for Topics | Code | 1
Keyword-Guided Neural Conversational Model | Code | 1
Corrected CBOW Performs as well as Skip-gram | Code | 1
Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation | Code | 1
Contextualized Embeddings based Transformer Encoder for Sentence Similarity Modeling in Answer Selection Task | Code | 1

Leaderboard

No leaderboard results yet.