
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
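
Concretely, an embedding is a lookup from each vocabulary item to a dense real-valued vector, and relatedness between words is commonly measured with cosine similarity. A minimal sketch in Python; the three-dimensional toy vectors below are invented for illustration, not learned values:

```python
import numpy as np

# Toy embedding table: each word maps to a dense vector of real numbers.
# Real embeddings use ~50-1000 dimensions learned from data; these are made up.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```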

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
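
As a rough illustration of the training workflow, here is a sketch using the gensim library's Word2Vec implementation (gensim is one implementation choice, not named on this page; the tiny corpus and hyperparameters are placeholders, not recommendations):

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus purely for illustration; real training needs far more text.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Skip-gram Word2Vec (sg=1); vector_size/window/min_count are illustrative values.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["embeddings"]               # the learned 50-dimensional vector
print(model.wv.most_similar("embeddings", topn=3))
```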

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1471–1480 of 4002 papers

Title | Status | Hype
----- | ------ | ----
Acoustic Word Embedding System for Code-Switching Query-by-example Spoken Term Detection | – | 0
Living Machines: A study of atypical animacy | Code | 0
The Frankfurt Latin Lexicon: From Morphological Expansion and Word Embeddings to SemioGraphs | Code | 0
GM-CTSC at SemEval-2020 Task 1: Gaussian Mixtures Cross Temporal Similarity Clustering | – | 0
Enhancing Word Embeddings with Knowledge Extracted from Lexical Resources | Code | 0
Embeddings as representation for symbolic music | – | 0
Contextual Embeddings: When Are They Worth It? | – | 0
Text Classification with Few Examples using Controlled Generalization | – | 0
Grammatical gender associations outweigh topical gender bias in crosslinguistic word embeddings | Code | 0
Adversarial Training for Commonsense Inference | Code | 1
Page 148 of 401

Leaderboards

No leaderboard results yet.