
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
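At its core, an embedding is a lookup table from a word's vocabulary index into a matrix of real-valued vectors. The sketch below illustrates this with a toy example; the 4-word vocabulary, 3-dimensional vectors, and random initialization are assumptions for illustration, not anything specific to the papers listed on this page (in practice the matrix values are learned).

```python
import numpy as np

# Toy vocabulary and a randomly initialized embedding matrix.
# Real embeddings are trained; these values are illustrative only.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embedding_dim = 3
rng = np.random.default_rng(seed=0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via table lookup."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 3-dimensional real vector
```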

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
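As a minimal, hedged sketch of one such technique, the snippet below trains a skip-gram Word2Vec model with the gensim library on a tiny toy corpus. The corpus, hyperparameters, and choice of gensim are assumptions made here for illustration; they are not drawn from this page.

```python
from gensim.models import Word2Vec

# Toy corpus of tokenized sentences. Real training data would be a
# large text collection; this corpus is illustrative only.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# sg=1 selects the skip-gram objective (predict context words from a
# target word); vector_size is the embedding dimensionality.
model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=100, seed=0)

vec = model.wv["king"]                        # 50-dimensional vector
print(model.wv.most_similar("king", topn=3))  # nearest words by cosine similarity
```

On a corpus this small the neighbors are noisy; the point is only the workflow: tokenized sentences in, one dense vector per vocabulary word out.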

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3651–3660 of 4002 papers

Dirichlet-Smoothed Word Embeddings for Low-Resource Settings
Disambiguated skip-gram model
Discourse Relation Sense Classification Using Cross-argument Semantic Similarity Based on Word Embeddings
Discovering Bilingual Lexicons in Polyglot Word Embeddings
Discovering linguistic (ir)regularities in word embeddings through max-margin separating hyperplanes
Discovering Stylistic Variations in Distributional Vector Space Models via Lexical Paraphrases
Discretely Coding Semantic Rank Orders for Supervised Image Hashing
Discrete Wavelet Transform for Efficient Word Embeddings and Sentence Encoding
Discriminative Acoustic Word Embeddings: Recurrent Neural Network-Based Approaches
Discriminative Pre-training for Low Resource Title Compression in Conversational Grocery
