
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
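
As a minimal illustration of the idea (the vectors below are made up for the example, not learned), an embedding is just a lookup table from vocabulary words to real-valued vectors, and word similarity can be measured as cosine similarity between those vectors:

```python
import numpy as np

# A toy embedding table: each vocabulary word maps to a real-valued vector.
# These vectors are invented purely for illustration, not trained.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.6]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```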

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
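
The papers listed below use a variety of toolkits; as one sketch of how such embeddings are learned in practice, the gensim library's Word2Vec implementation can be trained on a tokenized corpus (the toy corpus and hyperparameters here are illustrative assumptions, not taken from any listed paper):

```python
from gensim.models import Word2Vec

# A toy corpus of tokenized sentences; a real corpus would be far larger.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "apples"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Look up the learned vector for a word and query its nearest neighbours.
vector = model.wv["king"]
print(model.wv.most_similar("king", topn=3))
```

With a realistic corpus, most_similar returns words that occur in similar contexts, which is the distributional signal these methods exploit.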

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1271–1280 of 4002 papers (page 128 of 401)

Title | Status | Hype
Differential Privacy and Natural Language Processing to Generate Contextually Similar Decoy Messages in Honey Encryption Scheme | - | 0
Named Entity Recognition for Social Media Texts with Semantic Augmentation | Code | 1
Combining Self-Training and Self-Supervised Learning for Unsupervised Disfluency Detection | Code | 1
A Comprehensive Survey on Word Representation Models: From Classical to State-Of-The-Art Word Representation Language Models | - | 0
Unmasking Contextual Stereotypes: Measuring and Mitigating BERT's Gender Bias | Code | 1
Learning Contextualised Cross-lingual Word Embeddings and Alignments for Extremely Low-Resource Languages Using Parallel Corpora | Code | 1
Learning Contextual Tag Embeddings for Cross-Modal Alignment of Audio and Tags | Code | 0
Discovering and Interpreting Biased Concepts in Online Communities | Code | 0
Robust and Consistent Estimation of Word Embedding for Bangla Language by fine-tuning Word2Vec Model | - | 0
Autoencoding Improves Pre-trained Word Embeddings | - | 0

No leaderboard results yet.