
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
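The core idea above, that words are mapped to real-valued vectors and that semantically related words end up close together, can be sketched with a few toy vectors and cosine similarity. The embedding values below are made up for illustration; a real model such as Word2Vec or GloVe would learn vectors of 100-300 dimensions from a corpus.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings (illustrative values only).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])

# Related words sit closer in the vector space than unrelated ones.
print(sim_royal > sim_fruit)  # True for these toy vectors
```

The same similarity computation underlies most downstream uses of embeddings, from nearest-neighbor word lookup to the analogy and sentence-extraction tasks in the paper list below.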

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3226-3250 of 4002 papers

Title | Status | Hype
Fast Amortized Inference and Learning in Log-linear Models with Randomly Perturbed Nearest Neighbor Search | | 0
Detecting Policy Preferences and Dynamics in the UN General Debate with Neural Word Embeddings | | 0
Efficient Vector Representation for Documents through Corruption | Code | 0
Weakly Supervised Cross-Lingual Named Entity Recognition via Effective Annotation and Representation Projection | | 0
On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis | Code | 0
A Simple Approach to Learn Polysemous Word Embeddings | Code | 0
Visually Grounded Word Embeddings and Richer Visual Features for Improving Multimodal Neural Machine Translation | | 0
DAG-based Long Short-Term Memory for Neural Word Segmentation | | 0
Multi-Attention Network for One Shot Learning | | 0
Discretely Coding Semantic Rank Orders for Supervised Image Hashing | | 0
Zara Returns: Improved Personality Induction and Adaptation by an Empathetic Virtual Agent | | 0
Efficient Extraction of Pseudo-Parallel Sentences from Raw Monolingual Data Using Word Embeddings | | 0
Bilingual Word Embeddings with Bucketed CNN for Parallel Sentence Extraction | | 0
Improving Implicit Discourse Relation Recognition with Discourse-specific Word Embeddings | | 0
Exploring Diachronic Lexical Semantics with JeSemE | Code | 0
Character-Aware Neural Morphological Disambiguation | | 0
ESTEEM: A Novel Framework for Qualitatively Evaluating and Visualizing Spatiotemporal Embeddings in Social Media | Code | 0
Temporal Word Analogies: Identifying Lexical Replacement with Diachronic Word Embeddings | Code | 0
Varying Linguistic Purposes of Emoji in (Twitter) Context | | 0
Methodical Evaluation of Arabic Word Embeddings | | 0
Obtaining referential word meanings from visual and distributional information: Experiments on object naming | | 0
Neural Joint Model for Transition-based Chinese Syntactic Analysis | | 0
Learning bilingual word embeddings with (almost) no bilingual data | | 0
Information-Theory Interpretation of the Skip-Gram Negative-Sampling Objective Function | | 0
Semantic Word Clusters Using Signed Spectral Clustering | | 0
Page 130 of 161

No leaderboard results yet.