Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
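
As a minimal illustration of this idea, the sketch below maps a tiny vocabulary to hypothetical 3-dimensional vectors and compares words by cosine similarity; the words and vector values are invented for the example, not taken from any trained model.

```python
import numpy as np

# Hypothetical embedding table: each word maps to a vector of real numbers.
# These values are invented for illustration, not taken from a trained model.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words with related meanings should receive nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```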

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train word vectors on an NLP task such as language modeling or document classification.
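
For instance, a Word2Vec model can be trained in a few lines with the gensim library; the sketch below assumes gensim 4.x (where the dimensionality parameter is named vector_size) and uses a toy corpus invented for the example, so the resulting vectors are only illustrative.

```python
# Minimal Word2Vec training sketch, assuming gensim 4.x is installed
# (pip install gensim). The toy corpus below is invented for illustration.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "and", "oranges", "are", "fruit"],
]

# vector_size: embedding dimensionality; window: context size on each side;
# min_count=1 keeps every word, since the toy corpus is so small.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, epochs=50)

vec = model.wv["king"]                # a 50-dimensional real-valued vector
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```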

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1381–1390 of 4002 papers

Title | Code | Hype
Revisiting Representation Degeneration Problem in Language Modeling | No | 0
Evaluating Word Embeddings on Low-Resource Languages | No | 0
Evaluating Bias In Dutch Word Embeddings | Yes | 0
"Thy algorithm shalt not bear false witness": An Evaluation of Multiclass Debiasing Methods on Word Embeddings | Yes | 0
A Cross-lingual Natural Language Processing Framework for Infodemic Management | No | 0
Differential Privacy and Natural Language Processing to Generate Contextually Similar Decoy Messages in Honey Encryption Scheme | No | 0
A Comprehensive Survey on Word Representation Models: From Classical to State-Of-The-Art Word Representation Language Models | No | 0
Learning Contextual Tag Embeddings for Cross-Modal Alignment of Audio and Tags | Yes | 0
Discovering and Interpreting Biased Concepts in Online Communities | Yes | 0
Robust and Consistent Estimation of Word Embedding for Bangla Language by fine-tuning Word2Vec Model | No | 0