
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
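
As a minimal illustration of the idea (not any particular method), the Python sketch below maps a tiny vocabulary to real-valued vectors and compares them with cosine similarity. The words and 4-dimensional vectors are made-up illustrative values, not trained embeddings.

    # Minimal sketch of the word -> vector mapping; the 4-d vectors are
    # made-up illustrative values, not trained embeddings.
    import numpy as np

    embeddings = {
        "king":  np.array([0.50, 0.68, 0.12, 0.30]),
        "queen": np.array([0.48, 0.70, 0.10, 0.35]),
        "apple": np.array([0.05, 0.10, 0.90, 0.20]),
    }

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        """Cosine similarity between two embedding vectors."""
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # With trained embeddings, semantically related words end up closer together.
    print(cosine(embeddings["king"], embeddings["queen"]))  # relatively high
    print(cosine(embeddings["king"], embeddings["apple"]))  # relatively low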

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based models such as GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
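
To make the training step concrete, here is a minimal Word2Vec sketch using gensim. It assumes gensim >= 4.0 (where the dimensionality parameter is named vector_size), and the toy corpus is an illustrative assumption; real training needs far more text to produce useful vectors.

    # Minimal Word2Vec training sketch with gensim (assumes gensim >= 4.0).
    from gensim.models import Word2Vec

    # Toy corpus: each sentence is a list of tokens (illustrative only).
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "apple", "fell", "from", "the", "tree"],
    ]

    model = Word2Vec(
        sentences,
        vector_size=100,  # dimensionality of the embedding vectors
        window=5,         # context window size
        min_count=1,      # keep every token in this tiny corpus
        sg=1,             # 1 = skip-gram; 0 = CBOW
    )

    vec = model.wv["king"]                # 100-d numpy vector for "king"
    print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity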

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3351–3360 of 4002 papers

Title | Status | Hype
Efficient, Compositional, Order-sensitive n-gram Embeddings | Code | 0
Applying Multi-Sense Embeddings for German Verbs to Determine Semantic Relatedness and to Detect Non-Literal Language | - | 0
Real-Time Keyword Extraction from Conversations | - | 0
Attention Modeling for Targeted Sentiment | - | 0
Measuring Topic Coherence through Optimal Word Buckets | - | 0
Building Web-Interfaces for Vector Semantic Models with the WebVectors Toolkit | - | 0
Learning User Embeddings from Emails | - | 0
Improving Neural Knowledge Base Completion with Cross-Lingual Projections | - | 0
Nonsymbolic Text Representation | - | 0
Explaining and Generalizing Skip-Gram through Exponential Family Principal Component Analysis | - | 0
Page 336 of 401

Leaderboards

No leaderboard results yet.