
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec and GloVe, as well as other neural network-based approaches that train on an NLP task such as language modeling or document classification.
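To make the word-to-vector mapping concrete, here is a minimal sketch that trains a small Word2Vec model with the gensim library (assuming the gensim 4.x API); the toy corpus, vector dimensionality, and other hyperparameters are illustrative placeholders, not taken from any paper listed on this page.

# Minimal, illustrative Word2Vec sketch using gensim (4.x API assumed).
# The toy corpus and hyperparameters below are placeholders.
from gensim.models import Word2Vec

# Tiny tokenized corpus: each document is a list of word tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a skip-gram model: each vocabulary word is mapped to a
# 50-dimensional vector of real numbers.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Look up the learned vector and nearest neighbors for a word.
vec = model.wv["embeddings"]          # numpy array of shape (50,)
print(vec[:5])
print(model.wv.most_similar("embeddings", topn=3))

In practice the corpus would be far larger, and similarity queries like most_similar only become meaningful once the model has seen enough co-occurrence evidence for each word.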

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3121–3130 of 4002 papers

Title | Status | Hype
Simple and Effective Dimensionality Reduction for Word Embeddings | Code | 0
Making Sense of Word Embeddings | Code | 0
Identifying Reference Spans: Topic Modeling and Word Embeddings help IR | - | 0
Shortcut-Stacked Sentence Encoders for Multi-Domain Inference | Code | 0
A Syllable-based Technique for Word Embeddings of Korean Words | - | 0
Risk Bounds for Transferring Representations With and Without Fine-Tuning | - | 0
Zero-Inflated Exponential Family Embeddings | - | 0
Utterance Intent Classification of a Spoken Dialogue System with Efficiently Untied Recursive Autoencoders | - | 0
Detecting Anxiety through Reddit | Code | 0
Adapting Pre-trained Word Embeddings For Use In Medical Coding | - | 0
Page 313 of 401

No leaderboard results yet.