
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
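As a concrete illustration, the sketch below trains a skip-gram Word2Vec model on a toy corpus with the gensim library. It assumes gensim 4.x; the corpus and every parameter choice are illustrative only and are not drawn from any of the papers listed on this page.

```python
# Minimal Word2Vec sketch (assumes gensim >= 4.0; toy data for illustration).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
]

# Train a skip-gram model: 50-dimensional vectors, context window of 2.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding space
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

# Each word is now a dense vector of real numbers ...
vec = model.wv["embeddings"]            # numpy array of shape (50,)
print(vec.shape)

# ... and geometric closeness reflects distributional similarity.
print(model.wv.most_similar("words", topn=3))
```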

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 781–790 of 4,002 papers (page 79 of 401)

Title | Status | Hype
Phonetic Word Embeddings | Code | 1
DICoE@FinSim-3: Financial Hypernym Detection using Augmented Terms and Distance-based Features | - | 0
Multi-granular Legal Topic Classification on Greek Legislation | Code | 0
JOINTLY LEARNING TOPIC SPECIFIC WORD AND DOCUMENT EMBEDDING | - | 0
Debiasing Pretrained Text Encoders by Paying Attention to Paying Attention | - | 0
Reconstructing Word Embeddings via Scattered k-Sub-Embedding | - | 0
Graph-based Nearest Neighbor Search in Hyperbolic Spaces | - | 0
Antonymy-Synonymy Discrimination through the Repelling Parasiamese Neural Network | - | 0
EDGAR-CORPUS: Billions of Tokens Make The World Go Round | - | 0
Marked Attribute Bias in Natural Language Inference | Code | 0

Leaderboards

No leaderboard results yet.