
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
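A minimal sketch of the idea, using made-up 3-dimensional vectors (real embeddings are learned from data and are typically 50-300 dimensions): words that are semantically similar get vectors that point in similar directions, which cosine similarity can measure.

```python
import numpy as np

# Toy embedding table: the vectors below are illustrative, not learned.
embeddings = {
    "king":  np.array([0.50, 0.68, 0.12]),
    "queen": np.array([0.52, 0.71, 0.15]),
    "apple": np.array([0.91, 0.04, 0.33]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; near 1.0 means similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```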

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
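As a hedged sketch of the Word2Vec approach, the snippet below trains a skip-gram model with the gensim library (an assumption: gensim is not named on this page; it assumes gensim >= 4.0 and uses a made-up toy corpus for illustration).

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["an", "apple", "falls", "from", "the", "tree"],
]

model = Word2Vec(
    sentences=corpus,   # tokenized training sentences
    vector_size=50,     # dimensionality of the learned vectors
    window=2,           # context window size
    min_count=1,        # keep every word in this tiny corpus
    sg=1,               # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["king"]              # learned 50-d vector for "king"
print(model.wv.most_similar("king"))   # nearest neighbours in embedding space
```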

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 211-220 of 4,002 papers

Title | Status | Hype
Paraphrase Generation with Latent Bag of Words | Code | 1
Two-Level Transformer and Auxiliary Coherence Modeling for Improved Text Segmentation | Code | 1
TU Wien @ TREC Deep Learning '19 -- Simple Contextualization for Re-ranking | Code | 1
DeFINE: DEep Factorized INput Token Embeddings for Neural Sequence Modeling | Code | 1
Improving Document Classification with Multi-Sense Embeddings | Code | 1
Structured Pruning of Large Language Models | Code | 1
FreeLB: Enhanced Adversarial Training for Natural Language Understanding | Code | 1
Topic Modeling in Embedding Spaces | Code | 1
Word Embeddings for the Analysis of Ideological Placement in Parliamentary Corpora | Code | 1
Unsupervised Multilingual Word Embedding with Limited Resources using Neural Language Models | Code | 1

No leaderboard results yet.