
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
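To make the mapping concrete, here is a minimal sketch of an embedding lookup table: a hypothetical toy vocabulary and a random matrix whose rows stand in for learned word vectors. The words, dimensions, and values are invented purely for illustration.

```python
import numpy as np

# Hypothetical toy setup: a vocabulary index and an embedding matrix in which
# row i holds the real-valued vector for word i. Real systems learn this
# matrix from data; here it is random just to show the word -> vector mapping.
vocab = {"king": 0, "queen": 1, "apple": 2}
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((len(vocab), 4))  # 4-dimensional for brevity

king_vec = embeddings[vocab["king"]]  # the vector the word "king" maps to
print(king_vec.shape)                 # (4,)
```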

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification; a short sketch of the Word2Vec route follows.
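As one concrete route, a Word2Vec model can be trained with the gensim library. This is a minimal sketch: the toy corpus and the hyperparameter values are illustrative assumptions, not recommended settings.

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# Train a small skip-gram model (sg=1). vector_size, window, min_count, and
# epochs are illustrative; real corpora use far larger values and data.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vec = model.wv["king"]                # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```

Skip-gram trains the vectors to predict surrounding context words from the centre word; on a realistic corpus this places semantically related words near each other in the vector space.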

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1–10 of 4,002 papers (page 1 of 401)

Title | Status | Hype
Fine-mixing: Mitigating Backdoors in Fine-tuned Language Models | Code | 8
CharacterFactory: Sampling Consistent Characters with GANs for Diffusion Models | Code | 3
WSI-VQA: Interpreting Whole Slide Images by Generative Visual Question Answering | Code | 2
FASTopic: Pretrained Transformer is a Fast, Adaptive, Stable, and Transferable Topic Model | Code | 2
VNLP: Turkish NLP Package | Code | 2
Generative Adversarial Training for Text-to-Speech Synthesis Based on Raw Phonetic Input and Explicit Prosody Modelling | Code | 2
RETVec: Resilient and Efficient Text Vectorizer | Code | 2
Contextual Semantic Embeddings for Ontology Subsumption Prediction | Code | 2
Train Short, Test Long: Attention with Linear Biases Enables Input Length Extrapolation | Code | 2
A Pilot Study for Chinese SQL Semantic Parsing | Code | 2

Leaderboards

No leaderboard results yet.