Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
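To make the mapping concrete, the sketch below builds a toy lookup table from words to real-valued vectors and compares them with cosine similarity. The words, the 3-dimensional vectors, and the `cosine_similarity` helper are all invented for illustration; real embeddings are learned from data and typically have hundreds of dimensions.

```python
import numpy as np

# Toy embedding table: each word maps to a 3-dimensional real-valued vector.
# These vectors are made up for illustration; real embeddings are learned.
embeddings = {
    "king":  np.array([0.80, 0.31, 0.52]),
    "queen": np.array([0.78, 0.34, 0.55]),
    "apple": np.array([0.10, 0.92, 0.05]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```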

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
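As a sketch of how such an embedding can be trained in practice, the example below fits a small skip-gram Word2Vec model with the gensim library. The three-sentence corpus and the hyperparameter values are illustrative assumptions, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

# Train a skip-gram model (sg=1); vector_size, window, min_count, and
# epochs are placeholder hyperparameters, not tuned values.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=100,
    seed=42,
)

# Look up the learned 50-dimensional vector for a word...
vec = model.wv["king"]

# ...and query its nearest neighbors in the embedding space.
print(model.wv.most_similar("king", topn=3))
```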

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3821–3830 of 4002 papers

Title | Status | Hype
Word Embeddings (Also) Encode Human Personality Stereotypes | Code | 0
Generalizing Word Embeddings using Bag of Subwords | Code | 0
Bad Company---Neighborhoods in Neural Embedding Spaces Considered Harmful | Code | 0
Generating Fact Checking Summaries for Web Claims | Code | 0
Generating Sense Embeddings for Syntactic and Semantic Analogy for Portuguese | Code | 0
Generating Text through Adversarial Training using Skip-Thought Vectors | Code | 0
Social Bias in Elicited Natural Language Inferences | Code | 0
Generative Adversarial Nets for Multiple Text Corpora | Code | 0
Generative Adversarial Networks for text using word2vec intermediaries | Code | 0
CEA LIST: Processing Low-Resource Languages for CoNLL 2018 | Code | 0

No leaderboard results yet.