
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
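
To make the mapping concrete, here is a minimal sketch that trains Word2Vec vectors with the gensim library. The toy corpus and all hyperparameter values are illustrative assumptions, not something specified on this page; gensim is just one common implementation of the Word2Vec technique named above.

```python
# Minimal Word2Vec sketch using gensim (assumed installed: pip install gensim).
# The corpus and hyperparameters below are toy values for illustration only.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "vectors", "from", "context", "windows"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the real-valued vectors
    window=2,        # context window size
    min_count=1,     # keep every word; the corpus is tiny
    epochs=50,       # extra passes to compensate for the tiny corpus
    seed=0,
)

vec = model.wv["vectors"]  # a 50-dimensional numpy array
print(vec.shape)           # (50,)
print(model.wv.most_similar("words", topn=3))  # nearest words by cosine similarity
```

GloVe, by contrast, fits word vectors to global co-occurrence statistics rather than local context windows, and in practice pretrained GloVe vectors are usually downloaded and loaded rather than retrained.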

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3541–3550 of 4002 papers

Title | Status | Hype
DCC-Uchile at SemEval-2020 Task 1: Temporal Referencing Word Embeddings | | 0
DCU: Using Distributional Semantics and Domain Adaptation for the Semantic Textual Similarity SemEval-2015 Task 2 | | 0
Debiasing Embeddings for Reduced Gender Bias in Text Classification | | 0
Debiasing Pretrained Text Encoders by Paying Attention to Paying Attention | | 0
Debiasing Word Embeddings Improves Multimodal Machine Translation | | 0
Deceptive Opinion Spam Detection Using Neural Network | | 0
Decoding Brain Activity Associated with Literal and Metaphoric Sentence Comprehension Using Distributional Semantic Models | | 0
Decoding Word Embeddings with Brain-Based Semantic Features | | 0
Decomposing Generalization: Models of Generic, Habitual, and Episodic Statements | | 0
Decomposing Word Embedding with the Capsule Network | | 0
Page 355 of 401

No leaderboard results yet.