
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification. A minimal sketch is shown below.
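As a concrete illustration of the word-to-vector mapping described above, the sketch below trains a small skip-gram Word2Vec model with the gensim library (gensim >= 4.0 assumed; the toy corpus and all parameter values are illustrative, not taken from any paper listed here):

```python
# Minimal sketch: learning word embeddings with Word2Vec (gensim >= 4.0 assumed).
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
]

# Train a skip-gram model (sg=1); vector_size sets the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word now maps to a real-valued vector.
vec = model.wv["embeddings"]  # numpy array of shape (50,)
print(vec.shape)

# Nearest neighbours in the embedding space.
print(model.wv.most_similar("embeddings", topn=3))
```

On a real corpus, a larger vector_size (e.g. 100 to 300) and min_count > 1 are typical; GloVe differs in that it fits embeddings to global co-occurrence statistics rather than local context windows.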

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3431–3440 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Compressing Word Embeddings |  | 0 |
| Compressing Word Embeddings Using Syllables |  | 0 |
| Compression of Generative Pre-trained Language Models via Quantization |  | 0 |
| Computational Detection of Intertextual Parallels in Biblical Hebrew: A Benchmark Study Using Transformer-Based Language Models |  | 0 |
| Computationally Constructed Concepts: A Machine Learning Approach to Metaphor Interpretation Using Usage-Based Construction Grammatical Cues |  | 0 |
| Conceptor Debiasing of Word Representations Evaluated on WEAT |  | 0 |
| Concept Space Alignment in Multilingual LLMs |  | 0 |
| Conceptual Cognitive Maps Formation with Neural Successor Networks and Word Embeddings |  | 0 |
| Conditional Generative Adversarial Networks for Emoji Synthesis with Word Embedding Manipulation |  | 0 |
| Conditional Random Fields for Metaphor Detection |  | 0 |
Page 344 of 401

No leaderboard results yet.