
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
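
Concretely, an embedding is a lookup from a token to a dense real-valued vector, and semantic relatedness between words is typically measured by the cosine similarity of their vectors. A minimal sketch with made-up 3-dimensional vectors (trained models use hundreds of dimensions):

```python
import numpy as np

# Toy 3-dimensional embeddings; the values here are hypothetical.
# Trained models learn such vectors from large corpora.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.7]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```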

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.
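
As an illustration, a skip-gram Word2Vec model can be trained in a few lines with the gensim library (a sketch assuming gensim 4.x; the tiny corpus and hyperparameters below are placeholders):

```python
from gensim.models import Word2Vec

# Placeholder corpus: each sentence is a pre-tokenized list of words.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(sentences, vector_size=50, window=5, min_count=1, sg=1, epochs=50)

vector = model.wv["words"]                     # the learned 50-dimensional vector
print(model.wv.most_similar("words", topn=3))  # nearest neighbours by cosine similarity
```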

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2801–2810 of 4,002 (page 281 of 401)

- WordRep: A Benchmark for Research on Learning Word Representations
- Words are Vectors, Dependencies are Matrices: Learning Word Embeddings from Dependency Graphs
- Word Semantic Representations using Bayesian Probabilistic Tensor Factorization
- Word Sense Disambiguation for 158 Languages using Word Embeddings Only
- Word Sense Distance in Human Similarity Judgements and Contextualised Word Embeddings
- Word Sense Filtering Improves Embedding-Based Lexical Substitution
- Word Sense Induction using Knowledge Embeddings
- Word sense induction using word embeddings and community detection in complex networks
- Word Sense Induction with Knowledge Distillation from BERT
- Word-specific tonal realizations in Mandarin

No leaderboard results yet.