
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
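Concretely, an embedding is a lookup table: each vocabulary word indexes a row of a real-valued matrix, and geometric closeness between rows reflects semantic relatedness. The sketch below illustrates this mapping with a toy vocabulary; the vectors are invented purely for illustration, whereas real embeddings are learned from data.

```python
import numpy as np

# Toy vocabulary mapped to row indices of an embedding matrix.
vocab = {"king": 0, "queen": 1, "apple": 2}

# In practice these vectors are learned; the values here are made up
# purely to illustrate the word -> real-valued vector mapping.
embeddings = np.array([
    [0.8, 0.3, 0.1],   # king
    [0.7, 0.4, 0.1],   # queen
    [0.1, 0.1, 0.9],   # apple
])

def vector(word):
    """Look up the embedding vector for a word."""
    return embeddings[vocab[word]]

def cosine(u, v):
    """Cosine similarity: semantically related words score higher."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(vector("king"), vector("queen")))  # high (related words)
print(cosine(vector("king"), vector("apple")))  # lower (unrelated words)
```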

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
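As a sketch of how such embeddings are trained in practice, the snippet below fits a skip-gram Word2Vec model with the gensim library (version 4 or later is assumed, where the dimensionality parameter is named vector_size); the tiny corpus is invented for illustration.

```python
from gensim.models import Word2Vec

# Invented toy corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
)

print(model.wv["king"])               # the learned 50-dim vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours in vector space
```

Words that occur in similar contexts end up with nearby vectors, which is what the most_similar query surfaces.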

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3231–3240 of 4002 papers

Title | Status | Hype
A Simple Approach to Learn Polysemous Word Embeddings | Code | 0
Visually Grounded Word Embeddings and Richer Visual Features for Improving Multimodal Neural Machine Translation | | 0
DAG-based Long Short-Term Memory for Neural Word Segmentation | | 0
Multi-Attention Network for One Shot Learning | | 0
Discretely Coding Semantic Rank Orders for Supervised Image Hashing | | 0
Zara Returns: Improved Personality Induction and Adaptation by an Empathetic Virtual Agent | | 0
Efficient Extraction of Pseudo-Parallel Sentences from Raw Monolingual Data Using Word Embeddings | | 0
Bilingual Word Embeddings with Bucketed CNN for Parallel Sentence Extraction | | 0
Improving Implicit Discourse Relation Recognition with Discourse-specific Word Embeddings | | 0
Exploring Diachronic Lexical Semantics with JeSemE | Code | 0
Page 324 of 401

No leaderboard results yet.