
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
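
As a concrete illustration of this mapping, here is a minimal sketch in which a tiny hand-built lookup table assigns each word a dense real-valued vector and cosine similarity compares them. The words, vectors, and dimensionality are invented for the example, not learned from data; real embeddings are trained and typically have 50 to 1000 dimensions.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense
# real-valued vector. These 4-dimensional vectors are made up
# for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; semantically
    related words tend to score higher."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```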

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, typically trained on an NLP task such as language modeling or document classification.
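
For instance, a Word2Vec model can be trained in a few lines. The following is a minimal sketch assuming gensim 4.x is installed; the toy corpus is made up and the hyperparameters are illustrative defaults, not tuned values.

```python
from gensim.models import Word2Vec

# Tiny toy corpus of pre-tokenized sentences; a real model would
# be trained on millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Skip-gram Word2Vec (sg=1). Note: gensim 4.x uses vector_size
# (formerly `size` in 3.x).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["embeddings"]           # the learned 50-dim vector
print(model.wv.most_similar("words"))  # nearest neighbors by cosine similarity
```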

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 691–700 of 4002 papers

Title | Status | Hype
Discrete Wavelet Transform for Efficient Word Embeddings and Sentence Encoding | | 0
Unsupervised Domain Adaptation with Contrastive Learning for Cross-domain Chinese NER | | 0
Cross-lingual Word Embeddings in Hyperbolic Space | | 0
Improving Word Translation via Two-Stage Contrastive Learning | Code | 1
Crossword: Estimating Unknown Embeddings using Cross Attention and Alignment Strategies | | 0
Softmax Bottleneck Makes Language Models Unable to Represent Multi-mode Word Distributions | | 0
Sentence Selection Strategies for Distilling Word Embeddings from BERT | | 0
Non-Linear Relational Information Probing in Word Embeddings | | 0
FeelsGoodMan: Inferring Semantics of Twitch Neologisms | | 0
Looking Into the Black Box - How Are Idioms Processed in BERT? | | 0
Page 70 of 401
