Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
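To make this concrete, below is a minimal sketch of learning word vectors with the gensim library's Word2Vec implementation. The toy corpus and all hyperparameter values are illustrative assumptions, not settings taken from this page.

```python
# Minimal sketch: skip-gram Word2Vec with gensim (>= 4.0).
# Corpus and hyperparameters are illustrative assumptions only.
from gensim.models import Word2Vec

# A tiny tokenized corpus; real training needs far more text.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["glove", "and", "word2vec", "learn", "word", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["word"]                # a 50-dimensional real-valued vector
print(vec.shape)                      # (50,)
print(model.wv.most_similar("word"))  # nearest neighbors by cosine similarity
```

GloVe vectors, by contrast, are usually obtained pretrained (e.g., from the Stanford GloVe releases) and loaded for downstream tasks rather than trained locally.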

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2051–2060 of 4002 papers (page 206 of 401)

Title | Hype
Real-Time Keyword Extraction from Conversations | 0
Reasoning about Linguistic Regularities in Word Embeddings using Matrix Manifolds | 0
Recent Developments within BulTreeBank | 0
Recognition of Hyponymy and Meronymy Relations in Word Embeddings for Polish | 0
Recognition of Implicit Geographic Movement in Text | 0
Recognizing Humour using Word Associations and Humour Anchor Extraction | 0
Recognizing Plans by Learning Embeddings from Observed Action Distributions | 0
Recognizing Salient Entities in Shopping Queries | 0
Recognizing Textual Entailment in Twitter Using Word Embeddings | 0
Recognizing UMLS Semantic Types with Deep Learning | 0

No leaderboard results yet.