
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
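
In practice, this mapping is a table lookup: each vocabulary word indexes a row in an embedding matrix. The following is a minimal toy sketch of that idea (the names `vocab` and `embedding_matrix` and the random values are illustrative only; real embeddings are learned, not random):

```python
import numpy as np

# Toy vocabulary: each word is assigned a row in an embedding matrix.
# Real systems learn these vectors; here they are random for illustration.
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding_dim = 4  # real embeddings typically use 50-300 dimensions

rng = np.random.default_rng(seed=0)
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))

# "Embedding" a word is just a table lookup: word -> index -> vector.
def embed(word: str) -> np.ndarray:
    return embedding_matrix[vocab[word]]

print(embed("king"))  # a 4-dimensional vector of real numbers
```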

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
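
As a concrete illustration, here is a minimal sketch of training Word2Vec embeddings with the gensim library (the choice of gensim, the toy corpus, and all hyperparameter values are assumptions for illustration, not prescribed by this page):

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# Train a skip-gram Word2Vec model; hyperparameters here are illustrative.
model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=20,
)

vector = model.wv["king"]                        # learned 100-d vector for "king"
similar = model.wv.most_similar("king", topn=2)  # nearest neighbours in vector space
print(vector.shape, similar)
```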

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 151–160 of 4,002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Enhancing Scalability of Metric Differential Privacy via Secret Dataset Partitioning and Benders Decomposition | | 0 |
| DALLMi: Domain Adaption for LLM-based Multi-label Classifier | Code | 0 |
| 1-Diffractor: Efficient and Utility-Preserving Text Obfuscation Leveraging Word-Level Metric Differential Privacy | Code | 0 |
| CharacterFactory: Sampling Consistent Characters with GANs for Diffusion Models | Code | 3 |
| Bridging Vision and Language Spaces with Assignment Prediction | Code | 0 |
| WordDecipher: Enhancing Digital Workspace Communication with Explainable AI for Non-native English Speakers | | 0 |
| Weakly Supervised Deep Hyperspherical Quantization for Image Retrieval | Code | 0 |
| IITK at SemEval-2024 Task 1: Contrastive Learning and Autoencoders for Semantic Textual Relatedness in Multilingual Texts | Code | 0 |
| BanglaAutoKG: Automatic Bangla Knowledge Graph Construction with Semantic Neural Graph Filtering | Code | 0 |
| Robust Concept Erasure Using Task Vectors | | 0 |

No leaderboard results yet.