
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
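
As a quick illustration of the idea, the sketch below trains a skip-gram Word2Vec model on a toy corpus with gensim and looks up the resulting real-valued vectors. The corpus, hyperparameters, and choice of gensim (version 4.0 or later is assumed) are illustrative only, not taken from any paper listed here.

    # Minimal Word2Vec sketch (assumes gensim >= 4.0; toy data throughout).
    from gensim.models import Word2Vec

    # Toy corpus: each document is a list of tokens. A real corpus would be
    # far larger; embedding quality depends heavily on the training text.
    sentences = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["similar", "words", "get", "similar", "vectors"],
        ["word2vec", "trains", "a", "shallow", "neural", "network"],
    ]

    model = Word2Vec(
        sentences,
        vector_size=50,  # dimensionality of the embedding vectors
        window=2,        # context window size
        min_count=1,     # keep every token in this toy corpus
        sg=1,            # 1 = skip-gram, 0 = CBOW
        epochs=50,       # tiny corpus, so train for more passes
    )

    vec = model.wv["words"]                # a 50-dimensional real-valued vector
    print(vec.shape)                       # (50,)
    print(model.wv.most_similar("words"))  # nearest neighbours by cosine similarity

Once trained, the vectors can be fed to a downstream model in place of one-hot word indices; GloVe embeddings are typically consumed the same way, but are learned from global co-occurrence statistics rather than a sliding-window prediction task.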

Papers

Showing papers 2461–2470 of 4002 (page 247 of 401).

Syntax Encoding with Application in Authorship Attribution
CARER: Contextualized Affect Representations for Emotion Recognition (code available)
Improved Dependency Parsing using Implicit Word Connections Learned from Unlabeled Data
Word Embeddings for Code-Mixed Language Processing
Learning Unsupervised Word Translations Without Adversaries
How to represent a word and predict it, too: Improving tied architectures for language modelling
Streaming word similarity mining on the cheap
Refining Pretrained Word Embeddings Using Layer-wise Relevance Propagation
A Neural Local Coherence Model for Text Quality Assessment
Spot the Odd Man Out: Exploring the Associative Power of Lexical Resources

No leaderboard results yet.