
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
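
To make the mapping concrete, the sketch below uses hypothetical toy vectors (real embeddings typically have 50 to 1024 dimensions) and shows how cosine similarity between two embedding vectors serves as a proxy for how related the corresponding words are.

import numpy as np

# Toy 4-dimensional embeddings for three words. The values are
# hypothetical and chosen only to illustrate the idea.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.8]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low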

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
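
As a minimal sketch of one such technique, the snippet below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed; the toy corpus and hyperparameters are illustrative only).

from gensim.models import Word2Vec

# Tiny hypothetical corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]             # the learned 50-dimensional embedding
print(model.wv.most_similar("king"))  # nearest neighbours in vector space

Skip-gram learns each word's vector by predicting the words that appear around it, so words used in similar contexts end up with similar vectors.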

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2631–2640 of 4002 papers

Title | Status | Hype
WordNet Embeddings | Code | 0
A Helping Hand: Transfer Learning for Deep Sentiment Analysis | | 0
Searching for the X-Factor: Exploring Corpus Subjectivity for Word Embeddings | | 0
Orthographic Features for Bilingual Lexicon Induction | | 0
Named Entity Recognition With Parallel Recurrent Neural Networks | Code | 0
Unsupervised Learning of Distributional Relation Vectors | | 0
Leveraging distributed representations and lexico-syntactic fixedness for token-level prediction of the idiomaticity of English verb-noun combinations | | 0
A Multi-task Approach to Learning Multilingual Representations | | 0
SuperNMT: Neural Machine Translation with Semantic Supersenses and Syntactic Supertags | | 0
Illustrative Language Understanding: Large-Scale Visual Grounding with Image Search | | 0

No leaderboard results yet.