
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
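As a concrete illustration, the sketch below trains a small skip-gram Word2Vec model with gensim (4.x API assumed); the toy corpus and hyperparameter values are placeholders, not part of this page.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec
# (illustrative only; the tiny corpus and hyperparameters are assumptions).
from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["language", "modeling", "predicts", "the", "next", "word"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now a dense vector of real numbers.
vector = model.wv["word"]          # numpy array of shape (50,)
print(vector.shape)

# Nearest neighbours in the embedding space (meaningful only with a real corpus).
print(model.wv.most_similar("word", topn=3))
```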

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2331-2340 of 4002 papers

Syntactic and Semantic Features For Code-Switching Factored Language Models
Syntactic Dependencies and Distributed Word Representations for Analogy Detection and Mining
Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning
Syntax Encoding with Application in Authorship Attribution
Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations
SyntaxFest 2019 Invited talk - Quantitative Computational Syntax: dependencies, intervention effects and word embeddings
Syntax Helps ELMo Understand Semantics: Is Syntax Still Relevant in a Deep Neural Architecture for SRL?
Syntax Representation in Word Embeddings and Neural Networks -- A Survey
Synthetic Data for English Lexical Normalization: How Close Can We Get to Manually Annotated Data?
Synthetic, yet natural: Properties of WordNet random walk corpora and the impact of rare words on embedding performance
Page 234 of 401

No leaderboard results yet.