
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
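To make the mapping concrete, the sketch below represents a few words as NumPy arrays and compares them by cosine similarity. The words and their 4-dimensional vectors are hypothetical toy values picked for illustration, not output from any trained model; real embeddings are learned from data and typically have 50 to 1000+ dimensions.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings (toy values chosen by hand,
# not learned from a corpus).
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.2, 0.8, 0.2]),
    "apple": np.array([0.1, 0.9, 0.1, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with higher similarity.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.99
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.30
```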

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
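As one concrete example of the Word2Vec approach, the sketch below trains a skip-gram model with the gensim library on a toy corpus. It assumes gensim >= 4.0 (earlier versions used `size` instead of `vector_size`) and hand-written sentences, so it illustrates the API rather than a realistic training setup.

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of pre-tokenized words.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "fits", "embeddings", "to", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimension; min_count=1 keeps every word in this tiny vocabulary.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=3,
    min_count=1,
    sg=1,
    epochs=50,
)

vec = model.wv["embeddings"]                # learned 50-dimensional vector
print(vec.shape)                            # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

GloVe, by contrast, fits embeddings to global word co-occurrence statistics rather than predicting words from local context windows.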

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2681-2690 of 4002 papers

Title | Hype
Visualizing Temporal Topic Embeddings with a Compass | 0
Visually Aligned Word Embeddings for Improving Zero-shot Learning | 0
Visually Grounded Word Embeddings and Richer Visual Features for Improving Multimodal Neural Machine Translation | 0
Visual Question Answering with Prior Class Semantics | 0
Visual Relationship Detection with Language Priors | 0
Learning Predicates as Functions to Enable Few-shot Scene Graph Prediction | 0
Visual-Semantic Embedding Model Informed by Structured Knowledge | 0
Visual Storytelling via Predicting Anchor Word Embeddings in the Stories | 0
Visual Summarization of Scholarly Videos using Word Embeddings and Keyphrase Extraction | 0
Visual-Textual Attentive Semantic Consistency for Medical Report Generation | 0
Page 269 of 401

No leaderboard results yet.