
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
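
Concretely, an embedding is a lookup table from vocabulary entries to real-valued vectors. The minimal Python sketch below illustrates the mapping with a toy vocabulary and random, untrained NumPy vectors; the words and dimensions are illustrative placeholders, not a trained model.

```python
import numpy as np

# Toy vocabulary; in practice this comes from a corpus.
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# The embedding table: one real-valued vector per vocabulary word.
# These vectors are random; a trained model would learn them from data.
embedding_dim = 8
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word from the vocabulary to its vector of real numbers."""
    return embeddings[word_to_index[word]]

print(embed("queen"))  # an 8-dimensional real vector
```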

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
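
As an illustration of one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x); the toy corpus and hyperparameters are illustrative choices, not recommended settings.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus; real training uses large text collections.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# Train a skip-gram Word2Vec model (sg=1 selects skip-gram over CBOW).
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,       # extra passes help on such a small corpus
)

vector = model.wv["queen"]            # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```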

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3221-3230 of 4002 papers

Title | Status | Hype
Visually Aligned Word Embeddings for Improving Zero-shot Learning | - | 0
A Simple Language Model based on PMI Matrix Approximations | - | 0
Automated Detection of Non-Relevant Posts on the Russian Imageboard "2ch": Importance of the Choice of Word Representations | Code | 0
Rotations and Interpretability of Word Embeddings: the Case of the Russian Language | - | 0
Negative Sampling Improves Hypernymy Extraction Based on Projection Learning | Code | 0
Fast Amortized Inference and Learning in Log-linear Models with Randomly Perturbed Nearest Neighbor Search | - | 0
Detecting Policy Preferences and Dynamics in the UN General Debate with Neural Word Embeddings | - | 0
Efficient Vector Representation for Documents through Corruption | Code | 0
Weakly Supervised Cross-Lingual Named Entity Recognition via Effective Annotation and Representation Projection | - | 0
On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis | Code | 0
Page 323 of 401

Leaderboards

No leaderboard results yet.