Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
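For illustration, here is a minimal sketch of what such a mapping looks like in practice: a lookup table from words to dense real-valued vectors, compared with cosine similarity. The three-dimensional toy vectors below are invented for the example; trained embeddings typically have 50 to 300 dimensions.

import numpy as np

# Toy embedding table: each word maps to a dense real-valued vector.
# These 3-d vectors are made up for illustration only.
embeddings = {
    "king":  np.array([0.80, 0.30, 0.10]),
    "queen": np.array([0.75, 0.35, 0.15]),
    "apple": np.array([0.05, 0.90, 0.40]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should have more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower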

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification; a sketch of the first of these follows.
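As a concrete example, the sketch below trains a small Word2Vec model with the gensim library (assumed installed, version 4.0 or later); the toy corpus and hyperparameters are placeholders for illustration, not a recommended setup.

from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (placeholder data).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

# Train a skip-gram Word2Vec model; hyperparameters are illustrative.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the embedding vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
    seed=42,
)

# Look up a learned vector and find the nearest neighbors of a word.
vector = model.wv["king"]                      # a 50-d numpy array
print(model.wv.most_similar("king", topn=3))   # [(word, cosine), ...]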

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 791–800 of 4002 papers

Title | Status | Hype
Identifying and Mitigating Gender Bias in Hyperbolic Word Embeddings | — | 0
An Analysis of Euclidean vs. Graph-Based Framing for Bilingual Lexicon Induction from Word Embedding Spaces | Code | 0
Lacking the embedding of a word? Look it up into a traditional dictionary | — | 0
How Familiar Does That Sound? Cross-Lingual Representational Similarity Analysis of Acoustic Word Embeddings | Code | 0
InvBERT: Reconstructing Text from Contextualized Word Embeddings by inverting the BERT pipeline | — | 0
DisCoDisCo at the DISRPT2021 Shared Task: A System for Discourse Segmentation, Classification, and Connective Detection | Code | 0
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models | Code | 1
Conditional probing: measuring usable information beyond a baseline | Code | 1
Fast query-by-example speech search using separable model | — | 0
Augmenting semantic lexicons using word embeddings and transfer learning | Code | 0
Page 80 of 401

No leaderboard results yet.