
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, typically neural networks trained on an NLP task such as language modeling or document classification.
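As a minimal sketch of the idea, the snippet below trains skip-gram Word2Vec embeddings with the third-party gensim library and looks up the resulting vectors. The toy corpus and hyperparameters (vector_size, window, epochs) are illustrative assumptions, not settings taken from any paper listed on this page.

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a small skip-gram model (sg=1); these hyperparameters are
# illustrative choices for a toy corpus, not recommendations.
model = Word2Vec(
    sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50
)

# Each vocabulary word is now mapped to a real-valued vector.
vec = model.wv["embeddings"]          # numpy array of shape (50,)
print(vec.shape)

# Nearest neighbors by cosine similarity in the embedding space.
print(model.wv.most_similar("words", topn=3))
```

On a realistic corpus, semantically related words end up with nearby vectors, which is what the most_similar query exposes.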

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 811–820 of 4,002 papers

Title | Status | Hype
Deep Pivot-Based Modeling for Cross-language Cross-domain Transfer with Minimal Guidance | Code | 0
A General Framework for Implicit and Explicit Debiasing of Distributional Word Vector Spaces | Code | 0
Low-Dimensional Structure in the Space of Language Representations is Reflected in Brain Responses | Code | 0
Deep Unordered Composition Rivals Syntactic Methods for Text Classification | Code | 0
Beyond Word2Vec: Embedding Words and Phrases in Same Vector Space | Code | 0
Assessing the Reliability of Word Embedding Gender Bias Measures | Code | 0
A Co-Attentive Cross-Lingual Neural Model for Dialogue Breakdown Detection | Code | 0
Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings | Code | 0
Assessing Wordnets with WordNet Embeddings | Code | 0
Beyond Weight Tying: Learning Joint Input-Output Embeddings for Neural Machine Translation | Code | 0

No leaderboard results yet.